Out of my depth here.
I have an assignment to download data from the web and get it into SQLite.
I have pieced together some code from different sources; the key part is below. Suggestions on fixing how I get the data into the SQL table would be appreciated.
The API code works fine: it downloads a header row, plus rows containing a country name and the number of visitors from that country. So it's the SQL I'm trying to write that's failing. No errors, just no data going in.
return service.data().ga().get(
    ids='ga:' + profile_id,
    start_date='2016-04-01',
    end_date='today',
    metrics='ga:users',
    dimensions='ga:country').execute()

def print_results(results):
    print()
    print('Profile Name: %s' % results.get('profileInfo').get('profileName'))
    print()

    # Print header.
    output = []
    for header in results.get('columnHeaders'):
        output.append('%30s' % header.get('name'))
    print(''.join(output))

    # Print data table.
    # Start databasing results.
    if results.get('rows', []):
        for row in results.get('rows'):
            output = []
            for cell in row:
                output.append('%30s' % cell)
            cur.execute('SELECT max(id) FROM Orign')
            try:
                row = cur.fetchone()
                if row[0] is not None:
                    start = row[0]
            except:
                start = 0
                row = None
            cur.execute('''INSERT OR IGNORE INTO Origin (id, Country, Users)
                VALUES ( ?, ?, )''', (Country, Users))
            conn.commit()
            cur.close()
            print(''.join(output))
            start = 0
    else:
        print('No Rows Found')

if __name__ == '__main__':
    main(sys.argv)
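For comparison, here is a minimal sketch of how the rows could be written to SQLite, assuming results is the dict returned by the API call above and an Origin table with Country and Users columns (the analytics.db filename is just an example):
import sqlite3

conn = sqlite3.connect('analytics.db')  # example database file
cur = conn.cursor()
cur.execute('''CREATE TABLE IF NOT EXISTS Origin
               (id INTEGER PRIMARY KEY AUTOINCREMENT, Country TEXT, Users INTEGER)''')

for row in results.get('rows', []):
    country, users = row[0], row[1]
    # Two placeholders for two values; SQLite assigns the id itself.
    cur.execute('INSERT OR IGNORE INTO Origin (Country, Users) VALUES (?, ?)',
                (country, users))

conn.commit()
conn.close()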
I am retrieving data from an Orchestrator (as JSON) into Power BI, and for authentication I need to supply the variable OrganizationUnitID as follows:
let
Path = "/odata/A",
Auth = Json.Document(Web.Contents(BaseUrl, [Headers=[#"Content-Type"="application/x-www-form-urlencoded"], Content=Text.ToBinary(Uri.BuildQueryString(Credentials))])),
Token = Auth[access_token],
A = Json.Document(Web.Contents(TenantUrl&Path, [Headers=[Accept="application/json", #"Authorization"="Bearer " & Token, #"OrganizationUnitId"="12345"]]))
in A
(PS: I did not post the entire query, to keep the post short.)
However, I would like to retrieve data for all OrganizationUnitIDs. I can also retrieve these as a query (Id1 below); it currently comes back as a list of 15 values:
let
Path = "/odata/Test",
Auth = Json.Document(Web.Contents(BaseUrl, [Headers=[#"Content-Type"="application/x-www-form-urlencoded"], Content=Text.ToBinary(Uri.BuildQueryString(Credentials))])),
Token = Auth[access_token],
Test = Json.Document(Web.Contents(TenantUrl&Path, [Headers=[Accept="application/json", #"Authorization"="Bearer " & Token]])),
value = Test[value],
#"Converted List to Table" = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Records to Table" = Table.ExpandRecordColumn(#"Converted List to Table", "Column1", {"Name","Id"}, {"Name","Id"}),
Id1 = #"Expanded Records to Table"[Id]
in
Id1
My question is: how can I combine this second query with the first one, so that the first query covers all 15 OrganizationUnitIds?
I have tried some solutions posted online but so far none have worked.
Hello, I am trying to write to an existing xlsx file using PhpSpreadsheet, with setActiveSheetIndexByName(sheetname) and setCellValue with a cell reference and a value, but it only updates the last record. I have spent more than 12 hours on this.
I tried foreach instead of while and used a counter to increment, but neither worked.
<?php
include_once('db.php');
$prospect = $_REQUEST['prospect'];
require 'vendor/autoload.php';
use PhpOffice\PhpSpreadsheet\IOFactory;
use PhpOffice\PhpSpreadsheet\Spreadsheet;
use PhpOffice\PhpSpreadsheet\Writer\Xlsx;
$sql1 = mysqli_query($db,"select filename,sheetname, row, responsecol,compliancecol,response, compliance from spreadsheet where `prospect`='$prospect' and response <>'' order by row");
//$row=1;
while($row1 = mysqli_fetch_assoc($sql1))
{
    $filename = $row1['filename'];                         //test.xlsx
    $sheetname = $row1['sheetname'];                       // mysheet
    $responsecol = $row1['responsecol'].$row1['row'];      //D1
    $response = $row1['response'];                         //response
    $compliancecol = $row1['compliancecol'].$row1['row'];  //C1
    $compliance = $row1['compliance'];                     //compliance
    $spreadsheet = \PhpOffice\PhpSpreadsheet\IOFactory::load($filename);
    $spreadsheet->setActiveSheetIndexByName($sheetname)
        ->setCellValue($compliancecol, $compliance)
        ->setCellValue($responsecol, $response);
    //$row++;
}
$writer = IOFactory::createWriter($spreadsheet, 'Xlsx');
$writer->save("newfile.xlsx");
exit;
?>
I want each row from the mysqli result to update its corresponding cell reference with its value.
The easiest way is to set a LIMIT of 1 on your MySQL query. That takes only one row from your data. If you want the last one, you should sort DESC.
$sql1 = mysqli_query($db,"select filename,sheetname, row, responsecol,compliancecol,response, compliance from spreadsheet where `prospect`='$prospect' and response <>'' order by row DESC LIMIT 1");
My table data has 5 columns and 5288 rows. I am trying to read that data into a CSV file, adding column names. The code for that looks like this:
cursor = conn.cursor()
cursor.execute('Select * FROM classic.dbo.sample3')
rows = cursor.fetchall()
print ("The data has been fetched")
dataframe = pd.DataFrame(rows, columns =['s_name', 't_tid','b_id', 'name', 'summary'])
dataframe.to_csv('data.csv', index = None)
The data looks like this
s_sname t_tid b_id name summary
---------------------------------------------------------------------------
db1 001 100 careie hello this is john speaking blah blah blah
It looks like above but has 5288 such rows.
When I try to execute the code above, it throws an error saying:
ValueError: Shape of passed values is (5288, 1), indices imply (5288, 5)
I do not understand what I am doing wrong.
Use this.
dataframe = pd.read_sql('Select * FROM classic.dbo.sample3',con=conn)
dataframe.to_csv('data.csv', index = None)
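If you do want to keep the explicit cursor, a hedged alternative (assuming a pyodbc-style connection whose rows behave like sequences) is to convert each row to a plain tuple before building the DataFrame:
import pandas as pd

cursor = conn.cursor()
cursor.execute('Select * FROM classic.dbo.sample3')
rows = cursor.fetchall()

# Converting each driver-specific Row object to a plain tuple lets pandas
# see 5 columns per row instead of one opaque object per row.
dataframe = pd.DataFrame([tuple(r) for r in rows],
                         columns=['s_name', 't_tid', 'b_id', 'name', 'summary'])
dataframe.to_csv('data.csv', index=False)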
I am trying to select a value from my SQLite table in Python, based on a variable that the user enters through Tkinter. My database has a column named employee_username, and each row holds an employee's username and password. The username is entered by the user in a Tkinter window.
My code looks like this:
import sqlite3
import password_database
import tkinter
conn = sqlite3.connect('passwordDb.db')
c = conn.cursor()
username = entry_user.get()
password = entry_user.get()
database_username = c.execute(SELECT * FROM passwordDb WHERE employee_username=username)
if database_username != ' ':
    print('you entered a username which is not in the database')
else:
    running = True
When I run this code I do not get any results. How do I check whether the value the user enters is in my database, and how do I retrieve the employee's password attached to that username?
Thanks in advance
Your code is not complete, but I think I understand you.
Before you query the database, you need to strip the whitespace from the username input:
username = input("username")
db_username = username.strip()  # this removes the white space
# you can then query with db_username and check whether anything came back
records = cur.fetchall()  # this returns all matching records; loop through them
if records:
    ...
else:
    print("your username is bad")
I see a few mistakes.
First: the query should be a string, it should use a ? placeholder, and username should be passed as a tuple (username,) argument:
c.execute("SELECT * FROM passwordDb WHERE employee_username=?", (username, ))
Second: you have to use fetchall() or fetchone() to get a list of results, or the first result:
rows = cur.fetchall()
row = cur.fetchone()
Third: execute() does not return the rows, so there is no sense in comparing its result with the string ' '. fetchall()/fetchone() returns the row(s) with the result(s), so you can check how many rows were returned:
rows = cur.fetchall()
if len(rows) != 1:
    print("your username is bad or duplicate")
or
rows = cur.fetchall()
if cur.rowcount != 1:
    print("your username is bad or duplicate")
I am writing Python code with the BigQuery Client API, attempting to use the async query code (shown everywhere as a code sample), and it fails at the fetch_data() method call. Python errors out with the error:
ValueError: too many values to unpack
So the 3 values being unpacked (rows, total_count, page_token) seem to be the wrong number of return values. But I cannot find any documentation about what this method is supposed to return, besides the numerous code examples that only show these 3 results.
Here is a snippet of code that shows what I'm doing (not including the initialization of the 'client' variable or the imported libraries, which happen earlier in my code).
#---> Set up and start the async query job
job_id = str(uuid.uuid4())
job = client.run_async_query(job_id, query)
job.destination = temp_tbl
job.write_disposition = 'WRITE_TRUNCATE'
job.begin()
print 'job started...'

#---> Monitor the job for completion
retry_count = 360
while retry_count > 0 and job.state != 'DONE':
    print 'waiting for job to complete...'
    retry_count -= 1
    time.sleep(1)
    job.reload()

if job.state == 'DONE':
    print 'job DONE.'
    page_token = None
    total_count = None
    rownum = 0
    job_results = job.results()
    while True:
        # ---- Next line of code errors out...
        rows, total_count, page_token = job_results.fetch_data(max_results=10, page_token=page_token)
        for row in rows:
            rownum += 1
            print "Row number %d" % rownum
        if page_token is None:
            print 'end of batch.'
            break
What are the specific return results I should expect from the job_results.fetch_data(...) method call on an async query job?
Looks like you are right! The code no longer returns these 3 values.
As you can see in this commit from the public repository, fetch_data now returns an instance of the HTTPIterator class (I guess I didn't realize this before, as I have a Docker image with an older version of the BigQuery client installed, where it does return the 3 values).
The only way that I found to return the results was doing something like this:
iterator = job_results.fetch_data()
data = []
for page in iterator._page_iter(False):
    data.extend([page.next() for i in range(page.num_items)])
Notice that we don't have to manage page tokens anymore; that has been mostly automated.
[EDIT]:
I just realized you can get results by doing:
results = list(job_results.fetch_data())
Got to admit it's way easier now than it was before!
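For completeness, a minimal sketch of consuming the iterator directly, assuming job_results is the query results object from the question and that fetch_data() returns an iterable of row tuples as discussed above:
# Iterate the rows without collecting them all into a list first.
rownum = 0
for row in job_results.fetch_data():
    rownum += 1
    print("Row number %d" % rownum)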