
Exporting a Single Database

Exporting a Database

You can use gs_dump to export data and all object definitions of a database from MogDB. You can specify the information to export as follows:

  • Export full information of a database, including its data and all object definitions.

    You can use the exported information to create a database containing the same data as the current one.

  • Export all object definitions of a database, including the definitions of the database, functions, schemas, tables, indexes, and stored procedures.

    You can use the exported object definitions to quickly create a database that is the same as the current one, except that the new database does not have data.

  • Export data of a database.
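
These three modes map directly to gs_dump options that appear throughout the examples below: a full export is the default behavior, -s exports object definitions only, and -a exports data only. The following is a minimal sketch, assuming a local server listening on port 8000 and the postgres database (file names are illustrative):

# Full export: data plus all object definitions (default behavior).
gs_dump -f /home/omm/backup/postgres_full.sql -p 8000 postgres -F p
# Object definitions only: add -s.
gs_dump -f /home/omm/backup/postgres_def.sql -p 8000 postgres -s -F p
# Data only: add -a.
gs_dump -f /home/omm/backup/postgres_data.dmp -p 8000 postgres -a -F c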

Procedure

  1. Log in as the OS user omm to the primary node of the database.

  2. Use gs_dump to export the postgres database.

    gs_dump -U jack -f /home/omm/backup/postgres_backup.tar -p 8000 postgres -F t
    Password:

    Table 1 Common parameters

    | Parameter | Description | Example Value |
    | --- | --- | --- |
    | -U | Username for database connection. NOTE: If the username is not specified, the initial system administrator created during installation is used for the connection by default. | -U jack |
    | -W | User password for database connection. This parameter is not required for database administrators if the trust policy is used for authentication. If you connect to the database without specifying this parameter and you are not a database administrator, you will be prompted to enter the password. | -W abcd@123 |
    | -f | Path of the file or folder that stores the exported data. If this parameter is not specified, the export is written to the standard output. | -f /home/omm/backup/postgres_backup.tar |
    | -p | TCP port or local Unix-domain socket file extension on which the server is listening for connections. | -p 8000 |
    | dbname | Name of the database to export. | postgres |
    | -F | Format of the exported file: p (plain text), c (custom), d (directory), or t (tar). | -F t |

    For details about other parameters, see "Tool Reference > Server Tools > gs_dump" in the Reference Guide.
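
    Because the command above uses the tar format (-F t), you can optionally confirm that the dump was written by listing the archive with the standard tar utility (an illustrative check, assuming the archive follows the usual tar layout; it is not part of gs_dump):

    tar -tf /home/omm/backup/postgres_backup.tar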

Examples

Example 1: Run gs_dump to export full information of the postgres database. The exported files are in .sql format.

gs_dump -f /home/omm/backup/postgres_backup.sql -p 8000 postgres -F p
Password:
gs_dump[port='8000'][postgres][2017-07-21 15:36:13]: dump database postgres successfully
gs_dump[port='8000'][postgres][2017-07-21 15:36:13]: total time: 3793  ms

Example 2: Run gs_dump to export data of the postgres database, excluding object definitions. The exported files are in a custom format.

gs_dump -f /home/omm/backup/postgres_data_backup.dmp -p 8000 postgres -a -F c
Password:
gs_dump[port='8000'][postgres][2017-07-21 15:36:13]: dump database postgres successfully
gs_dump[port='8000'][postgres][2017-07-21 15:36:13]: total time: 3793  ms

Example 3: Run gs_dump to export object definitions of the postgres database. The exported files are in .sql format.

gs_dump -f /home/omm/backup/postgres_def_backup.sql -p 8000 postgres -s -F p
Password:
gs_dump[port='8000'][postgres][2017-07-20 15:04:14]: dump database postgres successfully
gs_dump[port='8000'][postgres][2017-07-20 15:04:14]: total time: 472 ms

Example 4: Run gs_dump to export object definitions of the postgres database. The exported files are in text format and are encrypted.

gs_dump -f /home/omm/backup/postgres_def_backup.sql -p 8000 postgres --with-encryption AES128 --with-key 1234567812345678 -s -F p
Password:
gs_dump[port='8000'][postgres][2018-11-14 11:25:18]: dump database postgres successfully
gs_dump[port='8000'][postgres][2018-11-14 11:25:18]: total time: 1161  ms

Exporting a Schema

You can use gs_dump to export data and all object definitions of a schema from MogDB. You can export one or more specified schemas as needed. You can specify the information to export as follows:

  • Export full information of a schema, including its data and object definitions.
  • Export data of a schema, excluding its object definitions.
  • Export the object definitions of a schema, including the definitions of tables, stored procedures, and indexes.
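
The same switches apply at the schema level: -n selects the schema(s) to export, -a restricts the dump to data, and -s restricts it to object definitions. The following is a minimal sketch, assuming the human_resource database and the hr schema used in the examples below (file names are illustrative):

# Full schema export: data plus object definitions.
gs_dump -f /home/omm/backup/hr_full.sql -p 8000 human_resource -n hr -F p
# Data only: add -a.
gs_dump -f /home/omm/backup/hr_data.tar -p 8000 human_resource -n hr -a -F t
# Object definitions only: add -s.
gs_dump -f /home/omm/backup/hr_def -p 8000 human_resource -n hr -s -F d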

Procedure

  1. Log in as the OS user omm to the primary node of the database.

  2. Run gs_dump to export the hr and public schemas.

    gs_dump -W Bigdata@123 -U jack -f /home/omm/backup/MPPDB_schema_backup -p 8000 human_resource -n hr -n public -F d

    Table 1 Common parameters

    | Parameter | Description | Example Value |
    | --- | --- | --- |
    | -U | Username for database connection. | -U jack |
    | -W | User password for database connection. This parameter is not required for database administrators if the trust policy is used for authentication. If you connect to the database without specifying this parameter and you are not a database administrator, you will be prompted to enter the password. | -W Bigdata@123 |
    | -f | Path of the file or folder that stores the exported data. If this parameter is not specified, the export is written to the standard output. | -f /home/omm/backup/MPPDB_schema_backup |
    | -p | TCP port or local Unix-domain socket file extension on which the server is listening for connections. | -p 8000 |
    | dbname | Name of the database to export. | human_resource |
    | -n | Names of the schemas to export. Data of the specified schemas is also exported. Specify -n schemaname once for a single schema, or repeat it for each additional schema. | Single schema: -n hr; multiple schemas: -n hr -n public |
    | -F | Format of the exported file: p (plain text), c (custom), d (directory), or t (tar). | -F d |

    For details about other parameters, see "Tool Reference > Server Tools > gs_dump" in the Reference Guide.

Examples

Example 1: Run gs_dump to export full information of the hr schema. The exported files are stored in text format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_schema_backup.sql -p 8000 human_resource -n hr -F p
gs_dump[port='8000'][human_resource][2017-07-21 16:05:55]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 16:05:55]: total time: 2425  ms

Example 2: Run gs_dump to export data of the hr schema. The exported files are in .tar format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_schema_data_backup.tar -p 8000 human_resource -n hr -a -F t
gs_dump[port='8000'][human_resource][2018-11-14 15:07:16]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2018-11-14 15:07:16]: total time: 1865  ms

Example 3: Run gs_dump to export the object definitions of the hr schema. The exported files are stored in a directory.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_schema_def_backup -p 8000 human_resource -n hr -s -F d
gs_dump[port='8000'][human_resource][2018-11-14 15:11:34]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2018-11-14 15:11:34]: total time: 1652  ms

Example 4: Run gs_dump to export the human_resource database excluding the hr schema. The exported files are in a custom format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_schema_backup.dmp -p 8000 human_resource -N hr -F c
gs_dump[port='8000'][human_resource][2017-07-21 16:06:31]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 16:06:31]: total time: 2522  ms

Example 5: Run gs_dump to export the object definitions of the hr and public schemas. The exported files are in .tar format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_schema_backup1.tar -p 8000 human_resource -n hr -n public -s -F t
gs_dump[port='8000'][human_resource][2017-07-21 16:07:16]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 16:07:16]: total time: 2132  ms

Example 6: Run gs_dump to export the human_resource database excluding the hr and public schemas. The exported files are in a custom format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_schema_backup2.dmp -p 8000 human_resource -N hr -N public -F c
gs_dump[port='8000'][human_resource][2017-07-21 16:07:55]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 16:07:55]: total time: 2296  ms

Example 7: Run gs_dump to export all tables (views, sequences, and foreign tables are also included) in the public schema and the staffs table in the hr schema, including their data and definitions. The exported files are in a custom format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_backup3.dmp -p 8000 human_resource -t public.* -t hr.staffs -F c
gs_dump[port='8000'][human_resource][2018-12-13 09:40:24]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2018-12-13 09:40:24]: total time: 896  ms

Exporting a Table

You can use gs_dump to export the data and definitions of table-level objects from MogDB. Views, sequences, and foreign tables are treated as special tables. You can export one or more specified tables as needed. You can specify the information to export as follows:

  • Export full information of a table, including its data and definition.
  • Export data of a table.
  • Export the definition of a table.
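
At the table level, -t selects the table(s) to export, and the same -a and -s switches restrict the dump to data or to the definition. The following is a minimal sketch, assuming the hr.staffs table used in the examples below (file names are illustrative):

# Full table export: data plus definition.
gs_dump -f /home/omm/backup/staffs_full.sql -p 8000 human_resource -t hr.staffs -F p
# Data only: add -a.
gs_dump -f /home/omm/backup/staffs_data.tar -p 8000 human_resource -t hr.staffs -a -F t
# Definition only: add -s.
gs_dump -f /home/omm/backup/staffs_def.sql -p 8000 human_resource -t hr.staffs -s -F p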

Procedure

  1. Log in as the OS user omm to the primary node of the database.

  2. Run gs_dump to export the hr.staffs and hr.employments tables.

    gs_dump -W Bigdata@123 -U jack -f /home/omm/backup/MPPDB_table_backup -p 8000 human_resource -t hr.staffs -t hr.employments -F d

    Table 1 Common parameters

    | Parameter | Description | Example Value |
    | --- | --- | --- |
    | -U | Username for database connection. | -U jack |
    | -W | User password for database connection. This parameter is not required for database administrators if the trust policy is used for authentication. If you connect to the database without specifying this parameter and you are not a database administrator, you will be prompted to enter the password. | -W Bigdata@123 |
    | -f | Path of the file or folder that stores the exported data. If this parameter is not specified, the export is written to the standard output. | -f /home/omm/backup/MPPDB_table_backup |
    | -p | TCP port or local Unix-domain socket file extension on which the server is listening for connections. | -p 8000 |
    | dbname | Name of the database to export. | human_resource |
    | -t | Table (or view, sequence, foreign table) to export. Specify -t schema.table once for a single table, or repeat it for each additional table. Wildcard patterns are allowed; enclose them in single quotation marks ('') to prevent the shell from expanding the wildcard characters. | Single table: -t hr.staffs; multiple tables: -t hr.staffs -t hr.employments |
    | -F | Format of the exported file: p (plain text), c (custom), d (directory), or t (tar). | -F d |

    For details about other parameters, see "Tool Reference > Server Tools > gs_dump" in the Reference Guide.

Examples

Example 1: Run gs_dump to export full information of the hr.staffs table. The exported files are in text format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_table_backup.sql -p 8000 human_resource -t hr.staffs -F p
gs_dump[port='8000'][human_resource][2017-07-21 17:05:10]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 17:05:10]: total time: 3116  ms

Example 2: Run gs_dump to export data of the hr.staffs table. The exported files are in .tar format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_table_data_backup.tar -p 8000 human_resource -t hr.staffs -a -F t
gs_dump[port='8000'][human_resource][2017-07-21 17:04:26]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 17:04:26]: total time: 2570  ms

Example 3: Run gs_dump to export the definition of the hr.staffs table. The exported files are stored in a directory.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_table_def_backup -p 8000 human_resource -t hr.staffs -s -F d
gs_dump[port='8000'][human_resource][2017-07-21 17:03:09]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 17:03:09]: total time: 2297  ms

Example 4: Run gs_dump to export the human_resource database excluding the hr.staffs table. The exported files are in a custom format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_table_backup4.dmp -p 8000 human_resource -T hr.staffs -F c
gs_dump[port='8000'][human_resource][2017-07-21 17:14:11]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 17:14:11]: total time: 2450  ms

Example 5: Run gs_dump to export the hr.staffs and hr.employments tables. The exported files are in text format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_table_backup1.sql -p 8000 human_resource -t hr.staffs -t hr.employments -F p
gs_dump[port='8000'][human_resource][2017-07-21 17:19:42]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 17:19:42]: total time: 2414  ms

Example 6: Run gs_dump to export the human_resource database excluding the hr.staffs and hr.employments tables. The exported files are in text format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_table_backup2.sql -p 8000 human_resource -T hr.staffs -T hr.employments -F p
gs_dump[port='8000'][human_resource][2017-07-21 17:21:02]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2017-07-21 17:21:02]: total time: 3165  ms

Example 7: Run gs_dump to export data and definition of the hr.staffs table, and the definition of the hr.employments table. The exported files are in .tar format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_table_backup3.tar -p 8000 human_resource -t hr.staffs -t hr.employments --exclude-table-data hr.employments -F t
gs_dump[port='8000'][human_resource][2018-11-14 11:32:02]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2018-11-14 11:32:02]: total time: 1645  ms

Example 8: Run gs_dump to export data and definition of the hr.staffs table, encrypt the exported files, and store them in text format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_table_backup4.sql -p 8000 human_resource -t hr.staffs --with-encryption AES128 --with-key 1212121212121212 -F p
gs_dump[port='8000'][human_resource][2018-11-14 11:35:30]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2018-11-14 11:35:30]: total time: 6708  ms

Example 9: Run gs_dump to export all tables (views, sequences, and foreign tables are also included) in the public schema and the staffs table in the hr schema, including their data and definitions. The exported files are in a custom format.

gs_dump -W Bigdata@123 -f /home/omm/backup/MPPDB_table_backup5.dmp -p 8000 human_resource -t public.* -t hr.staffs -F c
gs_dump[port='8000'][human_resource][2018-12-13 09:40:24]: dump database human_resource successfully
gs_dump[port='8000'][human_resource][2018-12-13 09:40:24]: total time: 896  ms

Example 10: Run gs_dump to export the definition of the view that references the test1 table in the t1 schema. The exported files are in a custom format.

gs_dump -W Bigdata@123 -U jack -f /home/omm/backup/MPPDB_view_backup6 -p 8000 human_resource -t t1.test1 --include-depend-objs --exclude-self -F d
gs_dump[port='8000'][jack][2018-11-14 17:21:18]: dump database human_resource successfully
gs_dump[port='8000'][jack][2018-11-14 17:21:23]: total time: 4239  ms