MEAN stack - storing of data - AngularJS

I want to build a website about books using the MEAN stack, so I need to store the contents of the books. I don't know where I should store all that content: in the database or somewhere else?

The easiest way is to import a CSV file. MongoDB can import CSV files directly:
In the following example, mongoimport imports the csv formatted data
in the /opt/backups/contacts.csv file into the collection contacts in
the users database on the MongoDB instance running on the localhost
port numbered 27017.
Specifying --headerline instructs mongoimport to determine the name of
the fields using the first line in the CSV file.
mongoimport --db users --collection contacts --type csv --headerline --file /opt/backups/contacts.csv
Source:
https://docs.mongodb.com/manual/reference/program/mongoimport/
See also:
How to use mongoimport to import csv
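Applied to the books question, a minimal sketch; the library database, books collection, and file path here are hypothetical names, not from the question:
mongoimport --db library --collection books --type csv --headerline --file /path/to/books.csv
Each row of the CSV becomes one document in the books collection, with field names taken from the header line.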

Related

How to import an Oracle DB .dmp file using DBeaver?

I'm currently trying to import an Oracle DB .dmp (dump) file into my Oracle DB using DBeaver but have trouble doing so.
The Oracle DB in question is running in a docker container. I successfully connected to this Oracle database with DBeaver, and can thus browse the database using DBeaver.
Currently however, the DB is empty. That's where the .dmp file comes in.
I want to import this .dmp file into my database, under a certain schema, but I cannot seem to do this. The dump file is named something like 'export.dmp' and is around 16 MB in size.
I'd like to import the data from the .dmp file to be able to browse the data to get familiar with it, as similar data will be stored in our own database.
I looked online but was unable to get an answer that works for me.
I tried using DBeaver but I don't seem to have the option to import or restore a DB via a .dmp file. At best, DBeaver proposes to import data using a .CSV file. I also downloaded the Oracle tool SQLDeveloper, but I can't manage to connect to my database in the docker container.
Online there is also talk of an import / export tool that supposedly can create these .dmp files and import them, but I'm unsure how to get this tool and whether that is the way to do it.
If so, I still don't understand how I can get to browse the data in DBeaver.
How can I import and browse the data from the .dmp file in my Oracle DB using DBeaver?
How to find Oracle datapump dir
presumably set to /u01/app/oracle/admin/<mydatabase>/dpdump on your system
How to copy files from host to docker container
docker cp export.dmp container_id:/u01/app/oracle/admin/<mydatabase>/dpdump/export.dmp
How do I get into a Docker container's shell
docker exec -it <mycontainer> bash
How to import an Oracle database from dmp file
If it was exported using expdp, then start the import with impdp:
impdp <username>/<password> dumpfile=export.dmp full=y
It will output the log file in the same default DATA_PUMP_DIR directory in the container.
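Putting those steps together, a minimal sketch, assuming a container named oracledb (hypothetical) and keeping the placeholders from above:
# copy the dump into the container's Data Pump directory
docker cp export.dmp oracledb:/u01/app/oracle/admin/<mydatabase>/dpdump/export.dmp
# open a shell in the container, then run the import inside it
docker exec -it oracledb bash
impdp <username>/<password> dumpfile=export.dmp full=y
Once the import finishes, the tables show up under the imported schema and can be browsed from DBeaver like any other schema.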
Oracle has two utilities to import dumps, imp (the legacy Import tool) and impdp (Data Pump Import). With imp you can't use database directories and have to specify the file location explicitly; impdp, on the other hand, requires a database directory.
Having said that, you can't import Oracle export dumps using DBeaver; you have to use the imp or impdp utility from the OS.
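For comparison, a minimal sketch of the legacy syntax; imp only works if the dump was created with the matching legacy exp tool, and the credentials and file path here are placeholders:
imp <username>/<password> file=export.dmp full=y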

How to set data types automatically when connecting .csv files in DBeaver?

I tried to import .csv files into my database in DBeaver, and I found a difference between
1. connecting to a folder containing the .csv files and importing all of them like a full database, and
2. importing .csv files as tables into a database I've already made.
When I connect to a folder, DBeaver sets the data types of all columns to 'String'. However, it interprets them correctly if I import the .csv files individually.
So, I'd like to know how to set data types automatically when connecting .csv files in DBeaver.
The String column type is used by default by the CSV driver: https://github.com/simoc/csvjdbc/blob/master/docs/doc.md#columntypes
You can try to change the "columnTypes" setting in DBeaver -> CSV connection -> Connection Settings -> Driver Properties.
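For example, for a hypothetical four-column file you could set the driver property to (type names per the csvjdbc documentation linked above):
columnTypes=Int,String,Float,Date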

How to export all tables within a Sybase 12.5 database to multiple flat files?

My customer runs a very old (it seems to me) Sybase 12.5.2 database. I want/need to export all tables from a database to multiple (one per table) flat (text) files. I have access to the ISQL command-line prompt with the admin user. I haven't ever worked with a Sybase database before.
Sybase Adaptive Server Enterprise (ASE) allows multiple databases to be hosted. You don't specify whether only one of the databases in the database server needs to be exported or if all of them do.
For each database, the following query will list the names of the tables
select name from sysobjects where type = 'U'
Sybase ASE also comes with a tool called "bcp" which stands for "Bulk Copy". It is an easy way of creating a flat file of a table's contents.
bcp database.schema.table out file_name -c -U username -S server_name
It has more options that may be of interest, especially around field and row terminators. Documentation for the most relevant version (12.5.1) can be found here:
http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.dc30191_1251/html/utility/BABGCCIC.htm
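The table list and bcp can be combined into a small shell script that exports every user table in one pass. A minimal sketch, assuming a hypothetical database mydb, server MYSERVER, and the sa login:
# dump the list of user tables to a file
isql -U sa -P password -S MYSERVER -b -o tables.txt <<'EOF'
use mydb
go
set nocount on
go
select name from sysobjects where type = 'U'
go
EOF
# one flat file per table
while read -r table; do
  [ -n "$table" ] && bcp "mydb..${table}" out "${table}.txt" -c -U sa -P password -S MYSERVER
done < tables.txt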
I have been using bcp commands to export data from Sybase environments. bcp is a command-line utility which you can use to export data from multiple types of databases.
Below is a basic example you can try:
bcp TABLE_NAME out OUTPUT_FILE_PATH\FILENAME.dat -S SERVER_NAME -U USERNAME -P PASSWORD -f format_file -r row_terminator -e error_file
You can create a batch file with such commands and do multiple exports in one hit.
If you have access to an ETL tool, you can export the data using that as well.

SQL Server - reading Excel file content and transferring it to a SQL database using xp_cmdshell

I was shocked to learn that importing Excel data into a SQL Server database using OPENROWSET has a downside: it truncates cell values to 255 characters before passing them to the database. I'm now thinking of using xp_cmdshell to read the Excel file's data and transfer it to the database; however, I'm clueless about how to do that. Could somebody help me achieve that?
Yes, BCP can be used to import data from Excel (.xlsx) files into SQL Server tables. The only thing to remember here, from the MS documentation:
Prerequisite - Save Excel data as text To use the rest of the methods
described on this page - the BULK INSERT statement, the BCP tool, or
Azure Data Factory - first you have to export your Excel data to a
text file.
In Excel, select File | Save As and then select Text (Tab delimited)
(.txt) or CSV (Comma delimited) (.csv) as the destination file type.
A sample BCP command to import data from an Excel file (converted to tab-delimited) into a SQL Server table:
bcp.exe "MyDB_Copy.dbo.Product" in "C:\Users\abhishek\Documents\BCPSample.txt" -c -t"\t" -r"\n" -S ABHISHEK-HP -T -h TABLOCK
Read more about BCP and import here.
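To drive the same import from inside SQL Server, as the question asks, bcp can be launched through xp_cmdshell. A hedged sketch reusing the hypothetical names from the command above; note that xp_cmdshell is disabled by default and must first be enabled via sp_configure:
sqlcmd -S ABHISHEK-HP -E -Q "EXEC xp_cmdshell 'bcp MyDB_Copy.dbo.Product in C:\Users\abhishek\Documents\BCPSample.txt -c -t\t -r\n -S ABHISHEK-HP -T'"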

Import dump.sql file into PostgreSQL

I am new to PostgreSQL. I have a file named dump.sql. I want to import this into my PostgreSQL database. I created the database using CREATE DATABASE database_name. Then I used psql database_name < /Downloads/web-task/dump.sql. After running this command it shows no output. I assume it did not import anything from dump.sql. How can I import this file into my PostgreSQL DB?
We determined in chat that you were trying to import your dump.sql file from within the psql prompt, which obviously couldn't work, since shell redirection isn't valid there... The data is now imported.
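For reference, the distinction, using the names from the question: redirection only works from the OS shell, while psql has its own meta-command for the same job.
From the OS shell, not the psql prompt:
psql database_name < /Downloads/web-task/dump.sql
Or the equivalent from inside psql itself:
\i /Downloads/web-task/dump.sql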
