Now that I have acquainted myself a bit with the Database as a Service offering, my next self-assigned goal is to import my JSON Database API dumpfile into the Oracle Cloud.
As far as I can see, this will challenge me with the following task:
- Recreating my JSON user environment from my presentation demo dumpfile, which was created with the Data Pump export utility.
- That said, be aware: JSON native database functionality needs database version 12.1.0.2 or higher!
Moving a dump file into the Cloud
If you search the Cloud documentation, there are several options for moving data into your Cloud environment:
- SQL*Developer
- APEX
- Based on SSH, FTP or REST
Some of the documentation out there might help:
- Data Movement and the Oracle Database Cloud Service
- Setting Up Secure FTP User Accounts
- Using Oracle SQL Developer, the Oracle Application Express SQL Workshop Data Upload Utility or the Oracle Application Express Data Load utility.
But I don’t want to do anything with SQL*Developer “carts” or otherwise. I have a prepped dumpfile.
This is luckily properly described in the “Using Oracle Database Cloud – Database as a Service” manual.
The following options are described in the “Loading Data into the Oracle Database in a Database as a Service Instance” section of the online manual:
- Using SQL*Loader to Load Data into the Database
- Using Oracle Data Pump Import to Load Data into the Database
- Using Transportable Tablespaces to Load Data into the Database
- Using Pluggable Databases (PDBs) to Load Data into the Database
You will have to jump around the manual a bit to find the actually interesting section called “Migration Methods”. There you will find the link to the doc I need: “Data Pump Conventional Import/Export”.
I already have a dumpfile, so I could start by creating a dump directory in my cloud environment.
And while I am at it, create the needed dump directory alias in the database (if you want a specific directory for those files). Remember that we are in a multi-tenant database environment: depending on where you want the dump file content to go, you need the directory alias in that (pluggable) environment, and/or you need to grant the required privileges to the actual user that will be doing the import.
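In SQL, the directory alias and grant could look roughly like this. It is a sketch only: the directory name `dump_dir`, the OS path, and the grantee are assumptions from my setup, not values taken from the manual.

```shell
# Sketch: write the directory/grant script to a file; names and the OS path
# below are assumptions, so substitute your own. Run it later in the PDB as a
# privileged user, e.g.:
#   sqlplus system@pdb1.mgralike.oracle.cloud.internal @create_dump_dir.sql
cat > create_dump_dir.sql <<'EOF'
-- Directory alias pointing at the OS directory that will hold the dump file
CREATE OR REPLACE DIRECTORY dump_dir AS '/u01/app/oracle/admin/dumpdir';
-- Only needed if the importing user is not already privileged like SYSTEM
GRANT READ, WRITE ON DIRECTORY dump_dir TO system;
EOF
echo "Wrote $(wc -l < create_dump_dir.sql) lines to create_dump_dir.sql"
```

The grant is shown for completeness; if you import as SYSTEM (as I do below), it already has the needed directory privileges.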
In all, nothing new so far.
Using a Secure Copy client
In the meantime, on my laptop at home, I had downloaded the dump file into the directory where my “oracleCloud” SSH key file also resides. Be aware that you have to enable port access for the different access methods in the “Compute Cloud Section” (as described in an earlier post).
Following the manual (if you don’t have “scp” on your machine and you are on Windows, download it from the PuTTY site), I then uploaded the dumpfile into the newly created directory in the Cloud via:
The \ sign at the end of the statement in the picture signals to the terminal that, despite the [ENTER] stroke, the next line is still part of the same command.
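For reference, the upload command looked roughly like this. The hostname, file names, and target path below are placeholders for my actual values; only the `oracle` OS user and the `-i` key-file flag follow the manual's pattern.

```shell
# All values below are placeholders; substitute your own.
SSH_KEY=oracleCloud                # private SSH key file next to the dump file
DUMPFILE=json_wiki.dmp             # the Data Pump export file
CLOUD_HOST=your.cloud.ip.address   # IP shown in the Compute Cloud console
TARGET_DIR=/u01/app/oracle/admin/dumpdir   # dump directory created above

# The trailing backslash continues the command on the next line,
# exactly as described above.
CMD="scp -i $SSH_KEY $DUMPFILE \
oracle@${CLOUD_HOST}:${TARGET_DIR}/"
echo "$CMD"   # run this from the directory holding the key and the dump file
```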
Now, while still on the laptop at home, we can reconnect to the Cloud environment, set the Oracle environment parameters using “oraenv”, and then import the dump file.
Importing with a powerful user like the SYSTEM account will create the JSON_WIKI user defined in the dumpfile. Also be aware that I am creating this user in the pluggable database called “pdb1.mgralike.oracle.cloud.internal” (using the Easy Connect method to connect to it). You can find the connection details of the different pluggable database environments via the Oracle Database Cloud console.
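Put together, the import step could look like this sketch. The service name and SYSTEM user are from the post; the directory alias, dumpfile and logfile names are my assumptions.

```shell
# Sketch of the import step, to be run on the Cloud VM as the oracle OS user.
# SERVICE is the pluggable database service name from my Cloud console; the
# directory alias and file names are assumptions from my setup.
SERVICE=pdb1.mgralike.oracle.cloud.internal
IMP_CMD="impdp system@//localhost:1521/${SERVICE} \
directory=dump_dir dumpfile=json_wiki.dmp logfile=imp_json_wiki.log"

# First set the environment, then run the import:
#   . oraenv        (answer with the SID of the Cloud database)
# impdp will prompt for the SYSTEM password when you run the command.
echo "$IMP_CMD"
```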
If you are not sure, and/or if you are still at the command prompt of the Cloud environment, you can also try to find the details via the “lsnrctl status” command (while using a Linux account that has privileges like the oracle Linux user). All registered services will be shown by this command.
SQL*Developer Access
Via SQL*Developer you can now attempt to connect. Besides a name for the connection, you will need the following information:
- Hostname or IP address (which is given in the Database Cloud Instance overview under “advanced”)
- Pluggable database or database service name (pdb1.mgralike.oracle.cloud.internal)
- User account (JSON_WIKI in the dumpfile)
- User password (case-sensitive value “json” in the dumpfile)
- Port number (the default, that is 1521)
- Connection type (I used the default)
So for this exercise here, my connection details are:
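As a quick sanity check, you can also try the same details from a command line with SQL*Plus and an Easy Connect string. The hostname below is a placeholder; the user, password, port, and service name are the values listed above.

```shell
# Placeholders: substitute the hostname/IP from your Cloud console.
HOST=your.cloud.ip.address
PORT=1521
SERVICE=pdb1.mgralike.oracle.cloud.internal

# Easy Connect format: user/password@//host:port/service_name
CONNECT="json_wiki/json@//${HOST}:${PORT}/${SERVICE}"
echo "sqlplus $CONNECT"
```

If this connects, the same host, port, and service values should work in the SQL*Developer connection dialog.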
After a successful SQL*Developer connection, you can start to use it, alter it, try it (but don’t forget to create a wallet and access via defining ACL first).
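The ACL part could, on a 12c database, look roughly like the sketch below (using the DBMS_NETWORK_ACL_ADMIN package). The wide-open host pattern and the “connect” privilege are my assumptions for a demo setup; JSON_WIKI is the user created by the import.

```shell
# Sketch: write an ACL script for the imported user. The '*' host pattern is
# an assumption suitable only for a demo; tighten it for anything real.
# Run it as SYSTEM in the PDB, e.g.:
#   sqlplus system@pdb1.mgralike.oracle.cloud.internal @json_wiki_acl.sql
cat > json_wiki_acl.sql <<'EOF'
-- 12c-style ACE: allow JSON_WIKI to open outbound network connections
BEGIN
  DBMS_NETWORK_ACL_ADMIN.APPEND_HOST_ACE(
    host => '*',
    ace  => XS$ACE_TYPE(privilege_list => XS$NAME_LIST('connect'),
                        principal_name => 'JSON_WIKI',
                        principal_type => XS_ACL.PTYPE_DB));
END;
/
EOF
echo "Wrote json_wiki_acl.sql"
```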
Finally…
In principle there is not that much new stuff going on. The most “difficult” bits of this exercise were finding all the connection method information and the specific details for your Cloud environment. I am guessing that setting up FTP access in particular (the link is mentioned above) is a bit more work. Putting a dump file on the right spot to be imported is mostly a matter of figuring out “which client / method to use”, “how to create an SSH key”, “where to find the IP details”, etc.
I hope that this post helped a bit in finding the missing pieces (and if something is not described here, have a look at the earlier “Oracle Cloud” posts on this site).
HTH/M.