-- DEFINE &TableOwnerPattern = &1
-- DEFINE &TableNamePattern = &2
-- COLUMN owner FORMAT A25 HEADING 'Owner'
-- COLUMN table_name FORMAT A30 HEADING 'Table'
-- COLUMN constraint_name FORMAT A30 HEADING 'FK constraint'
-- COLUMN column_list FORMAT A62 HEADING 'FK column(s)' WORD_WRAPPED
WITH constr AS (
    SELECT c.owner,
           c.table_name,
           c.constraint_name,
           listagg(col.column_name, ', ') WITHIN GROUP (ORDER BY col.column_name) column_list_alphabetic,
           listagg(col.column_name, ', ') WITHIN GROUP (ORDER BY col.position)    column_list
      FROM dba_constraints c
      JOIN dba_cons_columns col
        ON (    col.owner = c.owner
            AND col.constraint_name = c.constraint_name
            AND col.table_name = c.table_name )
     WHERE c.constraint_type = 'R'
       AND c.owner LIKE '&TableOwnerPattern'
       AND c.table_name LIKE '&TableNamePattern'
     GROUP BY c.owner, c.table_name, c.constraint_name
), idx AS (
    SELECT table_owner,
           table_name,
           listagg(column_name, ', ') WITHIN GROUP (ORDER BY column_name)     column_list_alphabetic,
           listagg(column_name, ', ') WITHIN GROUP (ORDER BY column_position) column_list
      FROM dba_ind_columns
     WHERE table_owner LIKE '&TableOwnerPattern'
       AND table_name LIKE '&TableNamePattern'
     GROUP BY table_owner, table_name, index_owner, index_name
)
SELECT constr.owner,
       constr.table_name,
       constr.constraint_name,
       constr.column_list
  FROM constr
 WHERE NOT EXISTS (
        SELECT 1
          FROM idx
         WHERE constr.owner = idx.table_owner
           AND constr.table_name = idx.table_name
           AND (   idx.column_list_alphabetic = constr.column_list_alphabetic
                OR instr(idx.column_list, constr.column_list) = 1 )
       );
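-- For each foreign key the query reports, the usual remediation is to create an
-- index whose leading columns match the constraint's column list (in position
-- order), so that parent-key deletes and updates do not take full child-table
-- locks or scans. A minimal sketch; the schema, table, and column names below
-- are hypothetical, standing in for one row of the query's output:

CREATE INDEX app_owner.orders_customer_id_ix
    ON app_owner.orders (customer_id);

-- A composite index also covers the FK as long as the FK columns form its
-- leading prefix, which is exactly what the instr(...) = 1 check above allows.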
To scrape data using Python we use the BeautifulSoup package:

!pip install beautifulsoup4

As a first step we have to import the packages and the HTML page that we need to scrape. Here I have used some static HTML content, customized for scraping the data:

#imports
import requests
from bs4 import BeautifulSoup

#html
html_string = """
<html>
<head><title>Doing Data Science with Python</title></head>
<body>
<h1>Doing Data Science with Python</h1>
<p>Author: Eranda Kodagoda</p>
<p>This will help to perform various data science activities using Python.</p>
<h2>Modules</h2>
<table>
<tr><th>Title</th><th>Duration in minutes</th></tr>
<tr><td>Getting Started</td><td>20</td></tr>
<tr><td>Setting Up Environment</td><td>40</td></tr>
<tr><td>Extracting Data</td><td>30</td></tr>
<tr><td>Exploring and Processing Data</td><td>45</td></tr>
<tr><td>Building Productive Model</td><td>45</td></tr>
</table>
</body>
</html>
"""

To view the HTML we can use the code lines below and execute them in a Python executor:

from IPython.core.display import display, HTML
display(HTML(html_string))

To print the HTML using BeautifulSoup we can use the code lines below and execute them in a Python executor:

ps = BeautifulSoup(html_string, 'html.parser')
print(ps.prettify())
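Beyond printing the markup, the same parsed object lets us pull the module table into plain Python data. A minimal sketch, assuming beautifulsoup4 is installed; it reuses the ps/html_string names from the snippets above, with html_string shortened here to just the table:

```python
from bs4 import BeautifulSoup

# Shortened sample: only the Modules table from the page above.
html_string = """
<table>
<tr><th>Title</th><th>Duration in minutes</th></tr>
<tr><td>Getting Started</td><td>20</td></tr>
<tr><td>Setting Up Environment</td><td>40</td></tr>
<tr><td>Extracting Data</td><td>30</td></tr>
<tr><td>Exploring and Processing Data</td><td>45</td></tr>
<tr><td>Building Productive Model</td><td>45</td></tr>
</table>
"""

ps = BeautifulSoup(html_string, "html.parser")

rows = []
for tr in ps.find_all("tr")[1:]:  # skip the <th> header row
    title, minutes = [td.get_text(strip=True) for td in tr.find_all("td")]
    rows.append((title, int(minutes)))

print(rows)
```

From here the list of (title, minutes) tuples can be summed, sorted, or loaded into a pandas DataFrame for the data-exploration steps the course outline describes.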