#49900 Issue 49858 - Add backup/restore and import/export functionality to CLI
Closed 3 years ago by spichugi. Opened 5 years ago by spichugi.
spichugi/389-ds-base bakdb_add  into  master

@@ -850,7 +850,7 @@ 

    </div>

  

    <div class="modal fade" id="import-ldif-form" data-backdrop="static" tabindex="-1" role="dialog" aria-labelledby="import-label" aria-hidden="true">

-     <div class="modal-dialog ds-modal">

+     <div class="modal-dialog">

        <div class="modal-content">

          <div class="modal-header">

            <button type="button" class="close" data-dismiss="modal" aria-hidden="true" aria-label="Close">
@@ -859,11 +859,32 @@ 

            <h4 class="modal-title" id="import-label">Initialize Database</h4>

          </div>

          <div class="modal-body">

-           <form class="form-horizontal">

-             <label for="ldif-file" class="" title=

-               "Full path to LDIF file">

-               LDIF File</label><input class="ds-form-input" type="text" id="ldif-file" size="40"/>

-           </form>

+           <div class="ds-inline">

+             <div>

+               <label for="root-suffix-import" class="ds-config-label-lrg" title=

+                       "The suffix to import into">

+                 Root Suffix</label><input class="ds-input" type="text" id="root-suffix-import" size="40" disabled/>

+             </div>

+             <div>

+               <label for="ldif-file-import" class="ds-config-label-lrg" title=

+                       "LDIF filename without an extension. LDIF files are read from the server's LDIF directory (nsslapd-ldifdir)">

+                 LDIF Filename</label><input class="ds-input" type="text" id="ldif-file-import" size="40"/>

+             </div>

+             <div>

+               <label for="exclude-suffix-import" class="ds-config-label-lrg" title=

+                       "Exclude Suffix (Optional)">

+                 Exclude Suffix (Optional)</label><input class="ds-input" type="text" id="exclude-suffix-import" size="40"/>

+             </div>

+             <div>

+               <label for="include-suffix-import" class="ds-config-label-lrg" title=

+                       "Include Suffix (Optional)">

+                 Include Suffix (Optional)</label><input class="ds-input" type="text" id="include-suffix-import" size="40"/>

+             </div>

+           </div>

+           <div id="import-ldif-spinner" class="ds-center" hidden>

+             <p></p>

+             <p><span class="spinner spinner-xs spinner-inline"></span> Running LDIF import task...</p>

+           </div>

          </div>

          <div class="modal-footer ds-modal-footer">

            <button type="button" class="btn btn-default" data-dismiss="modal">Cancel</button>
@@ -874,7 +895,7 @@ 

    </div>

  

    <div class="modal fade" id="export-ldif-form" data-backdrop="static" tabindex="-1" role="dialog" aria-labelledby="export-label" aria-hidden="true">

-     <div class="modal-dialog ds-modal">

+     <div class="modal-dialog">

        <div class="modal-content">

          <div class="modal-header">

            <button type="button" class="close" data-dismiss="modal" aria-hidden="true" aria-label="Close">
@@ -883,10 +904,32 @@ 

            <h4 class="modal-title" id="export-label">Export Database</h4>

          </div>

          <div class="modal-body">

-           <form class="form-horizontal">

-             <label for="export-ldif-file" class="" title="Full path to LDIF file">

-             LDIF File</label><input class="ds-form-input" type="text" id="export-ldif-file" size="40"/>

-           </form>

+           <div class="ds-inline">

+             <div>

+               <label for="root-suffix-export" class="ds-config-label-lrg" title=

+                       "The suffix to export">

+                 Root Suffix</label><input class="ds-input" type="text" id="root-suffix-export" size="40" disabled/>

+             </div>

+             <div>

+               <label for="ldif-file-export" class="ds-config-label-lrg" title=

+                       "LDIF filename without an extension. LDIF files are written to the server's LDIF directory (nsslapd-ldifdir)">

+                 LDIF Filename (Optional)</label><input class="ds-input" type="text" id="ldif-file-export" size="40"/>

+             </div>

+             <div>

+               <label for="exclude-suffix-export" class="ds-config-label-lrg" title=

+                       "Exclude Suffix (Optional)">

+                 Exclude Suffix (Optional)</label><input class="ds-input" type="text" id="exclude-suffix-export" size="40"/>

+             </div>

+             <div>

+               <label for="include-suffix-export" class="ds-config-label-lrg" title=

+                       "Include Suffix (Optional)">

+                 Include Suffix (Optional)</label><input class="ds-input" type="text" id="include-suffix-export" size="40"/>

+             </div>

+           </div>

+           <div id="export-ldif-spinner" class="ds-center" hidden>

+             <p></p>

+             <p><span class="spinner spinner-xs spinner-inline"></span> Running LDIF export task...</p>

+           </div>

          </div>

          <div class="modal-footer ds-modal-footer">

          <button type="button" class="btn btn-default" data-dismiss="modal">Cancel</button>

@@ -291,7 +291,7 @@ 

    <!-- Manage Backups/restore -->

    <div class="modal fade" id="restore-form" data-backdrop="static" tabindex="-1" role="dialog" aria-labelledby="restore-label" aria-hidden="true">

      <div class="modal-dialog">

-       <div class="modal-content ds-modal-wide">

+       <div class="modal-content">

          <div class="modal-header">

            <button type="button" class="close" data-dismiss="modal" aria-hidden="true" aria-label="Close">

              <span class="pficon pficon-close"></span>
@@ -467,7 +467,7 @@ 

    -->

    <div class="modal fade" id="backup-form" data-backdrop="static" tabindex="-1" role="dialog" aria-labelledby="backup-label" aria-hidden="true">

      <div class="modal-dialog">

-       <div class="modal-content ds-modal">

+       <div class="modal-content">

          <div class="modal-header">

            <button type="button" class="close" data-dismiss="modal" aria-hidden="true" aria-label="Close">

              <span class="pficon pficon-close"></span>

@@ -36,6 +36,9 @@ 

        "label": "Initialize Suffix",

         "icon": "glyphicon glyphicon-circle-arrow-right",

         "action": function (data) {

+          var suffix_id = $(node).attr('id');

+          var parent_suffix = suffix_id.substring(suffix_id.indexOf('-')+1);

+          $("#root-suffix-import").val(parent_suffix);

           $("#ldif-file-import").val("");

           $("#import-ldif-form").modal('toggle');

         }
@@ -44,6 +47,9 @@ 

         "label": "Export Suffix",

         "icon": "glyphicon glyphicon-circle-arrow-left",

         "action": function (data) {

+          var suffix_id = $(node).attr('id');

+          var parent_suffix = suffix_id.substring(suffix_id.indexOf('-')+1);

+          $("#root-suffix-export").val(parent_suffix);

           $("#ldif-file-export").val("");

           $("#export-ldif-form").modal('toggle');

         }
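Both tree actions derive the suffix from the clicked node's `id` by dropping everything up to the first hyphen. The same parsing as a stand-alone Python sketch (the function name and the `import-dc=example,dc=com` id format are illustrative assumptions):

```python
def parent_suffix_from_node_id(node_id):
    """Drop everything up to and including the first '-', mirroring
    suffix_id.substring(suffix_id.indexOf('-') + 1) in the handlers above."""
    return node_id[node_id.index('-') + 1:]
```

Note that only the first hyphen is consumed, so suffixes containing `-` survive intact.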
@@ -757,9 +763,42 @@ 

  

      // Init Suffix (import)

      $("#import-ldif-save").on("click", function() {

-       $("#import-ldif-form").css('display', 'none');

-       // Do the actual save in DS

-       // Update html

+       var root_suffix_import = $("#root-suffix-import").val();

+       var ldif_file_import = $("#ldif-file-import").val();

+       var exclude_suffix_import = $("#exclude-suffix-import").val();

+       var include_suffix_import = $("#include-suffix-import").val();

+       var cmd = [DSCONF, server_inst, 'backend', 'import', root_suffix_import];

+       // Process and validate parameters

+       if (ldif_file_import == ""){

+         popup_msg("Error", "An LDIF file must be specified");

+         return;

+       } else if (ldif_file_import.indexOf(' ') >= 0) {

+         popup_msg("Error", "LDIF file cannot contain any spaces");

+         return;

+       } else if (ldif_file_import.startsWith('/')) {

+         popup_msg("Error", "LDIF file cannot start with a forward slash. " +

+                            "LDIF files are read from the server's LDIF directory (nsslapd-ldifdir)");

+         return;

+       } else {

+         cmd.push(ldif_file_import);

+       }

+       if (include_suffix_import != "") {

+         cmd.push.apply(cmd, ["-s", include_suffix_import]);

+       }

+       if (exclude_suffix_import != "") {

+         cmd.push.apply(cmd, ["-x", exclude_suffix_import]);

+       }

+       $("#import-ldif-spinner").show();

+       cockpit.spawn(cmd, { superuser: true, "err": "message", "environ": [ENV]}).

+       done(function(data) {

+         $("#import-ldif-spinner").hide();

+         popup_success("LDIF has been imported");

+         $("#import-ldif-form").modal('toggle');

+       }).

+       fail(function(data) {

+         $("#import-ldif-spinner").hide();

+         popup_err("Error", "Failed to import LDIF\n" + data.message);

+       })

      });
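The click handler above validates the form and assembles a `dsconf <instance> backend import` argument vector. A minimal stand-alone Python sketch of the same validation and flag handling (the helper name and defaults are hypothetical):

```python
def build_import_cmd(suffix, ldif_file, include_suffix="", exclude_suffix="",
                     dsconf="dsconf", instance="slapd-localhost"):
    """Validate the LDIF name and build the argv list, as the UI handler does."""
    if ldif_file == "":
        raise ValueError("An LDIF file must be specified")
    if " " in ldif_file:
        raise ValueError("LDIF file cannot contain any spaces")
    if ldif_file.startswith("/"):
        raise ValueError("LDIF file cannot start with a forward slash")
    cmd = [dsconf, instance, "backend", "import", suffix, ldif_file]
    if include_suffix:
        cmd += ["-s", include_suffix]
    if exclude_suffix:
        cmd += ["-x", exclude_suffix]
    return cmd
```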

  

    // Export Suffix (export)
@@ -770,9 +809,39 @@ 

        $("#export-ldif-form").css('display', 'none');

      });

      $("#export-ldif-save").on("click", function() {

-       $("#export-ldif-form").css('display', 'none');

-       // Do the actual save in DS

-       // Update html

+       var root_suffix_export = $("#root-suffix-export").val();

+       var ldif_file_export = $("#ldif-file-export").val();

+       var exclude_suffix_export = $("#exclude-suffix-export").val();

+       var include_suffix_export = $("#include-suffix-export").val();

+       var cmd = [DSCONF, server_inst, 'backend', 'export', root_suffix_export];

+       // Process and validate parameters

+       if (ldif_file_export.indexOf(' ') >= 0) {

+         popup_msg("Error", "LDIF file cannot contain any spaces");

+         return;

+       } else if (ldif_file_export.startsWith('/')) {

+         popup_msg("Error", "LDIF file cannot start with a forward slash. " +

+                            "LDIF files are written to the server's LDIF directory (nsslapd-ldifdir)");

+         return;

+       } else if (ldif_file_export != ""){

+         cmd.push.apply(cmd, ["-l", ldif_file_export]);

+       }

+       if (include_suffix_export != "") {

+         cmd.push.apply(cmd, ["-s", include_suffix_export]);

+       }

+       if (exclude_suffix_export != "") {

+         cmd.push.apply(cmd, ["-x", exclude_suffix_export]);

+       }

+       $("#export-ldif-spinner").show();

+       cockpit.spawn(cmd, { superuser: true, "err": "message", "environ": [ENV]}).

+       done(function(data) {

+         $("#export-ldif-spinner").hide();

+         popup_success("LDIF has been exported");

+         $("#export-ldif-form").modal('toggle');

+       }).

+       fail(function(data) {

+         $("#export-ldif-spinner").hide();

+         popup_err("Error", "Failed to export LDIF\n" + data.message);

+       })

      });

  

      $("#create-ref-save").on("click", function() {

@@ -1166,18 +1166,38 @@ 

          return;

        }

        if (backup_name.startsWith('/')) {

-         popup_msg("Error", "Backup name can not start with a forward slash.  Backups are written to the server's backup directory (nsslapd-bakdir)");

+         popup_msg("Error", "Backup name cannot start with a forward slash. " +

+                            "Backups are written to the server's backup directory (nsslapd-bakdir)");

          return;

        }

-       var cmd = [DSCTL, server_inst, 'db2bak', backup_name];

+       var cmd = ['status-dirsrv', server_inst];

        $("#backup-spinner").show();

-       cockpit.spawn(cmd, { superuser: true, "err": "message", "environ": [ENV]}).done(function(data) {

-         $("#backup-spinner").hide();

-         popup_success("Backup has been created");

-         $("#backup-form").modal('toggle');

-       }).fail(function(data) {

-         $("#backup-spinner").hide();

-         popup_err("Error", "Failed to backup the server\n" + data.message);

+       cockpit.spawn(cmd, { superuser: true}).

+       done(function() {

+         var cmd = [DSCONF, server_inst, 'backup', 'create',  backup_name];

+         cockpit.spawn(cmd, { superuser: true, "err": "message", "environ": [ENV]}).

+         done(function(data) {

+           $("#backup-spinner").hide();

+           popup_success("Backup has been created");

+           $("#backup-form").modal('toggle');

+         }).

+         fail(function(data) {

+           $("#backup-spinner").hide();

+           popup_err("Error", "Failed to backup the server\n" + data.message);

+         })

+       }).

+       fail(function() {

+         var cmd = [DSCTL, server_inst, 'db2bak', backup_name];

+         cockpit.spawn(cmd, { superuser: true, "err": "message", "environ": [ENV]}).

+         done(function(data) {

+           $("#backup-spinner").hide();

+           popup_success("Backup has been created");

+           $("#backup-form").modal('toggle');

+         }).

+         fail(function(data) {

+           $("#backup-spinner").hide();

+           popup_err("Error", "Failed to backup the server\n" + data.message);

+         });

        });

      });
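The rewritten handler first probes the instance with `status-dirsrv` and picks the online, task-based backup when the server is up, falling back to the offline `db2bak` path otherwise. The selection logic factored as a pure sketch (hypothetical helper name):

```python
def pick_backup_cmd(server_inst, backup_name, server_running,
                    dsconf="dsconf", dsctl="dsctl"):
    """Online task-based backup when the instance is running,
    offline db2bak otherwise -- mirrors the status-dirsrv probe above."""
    if server_running:
        return [dsconf, server_inst, "backup", "create", backup_name]
    return [dsctl, server_inst, "db2bak", backup_name]
```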

  
@@ -1211,15 +1231,34 @@ 

        var restore_name = data[0];

      popup_confirm("Are you sure you want to restore this backup: <b>" + restore_name + "</b>", "Confirmation", function (yes) {

          if (yes) {

-           var cmd = [DSCTL, server_inst, 'bak2db', restore_name];

+           var cmd = ['status-dirsrv', server_inst];

            $("#restore-spinner").show();

-           cockpit.spawn(cmd, { superuser: true, "err": "message", "environ": [ENV]}).done(function(data) {

-             popup_success("The backup has been restored");

-             $("#restore-spinner").hide();

-             $("#restore-form").modal('toggle');

-           }).fail(function(data) {

-             $("#restore-spinner").hide();

-             popup_err("Error", "Failed to restore from the backup\n" + data.message);

+           cockpit.spawn(cmd, { superuser: true}).

+           done(function() {

+             var cmd = [DSCONF, server_inst, 'backup', 'restore',  restore_name];

+             cockpit.spawn(cmd, { superuser: true, "err": "message", "environ": [ENV]}).

+             done(function(data) {

+               $("#restore-spinner").hide();

+               popup_success("The backup has been restored");

+               $("#restore-form").modal('toggle');

+             }).

+             fail(function(data) {

+               $("#restore-spinner").hide();

+               popup_err("Error", "Failed to restore from the backup\n" + data.message);

+             });

+           }).

+           fail(function() {

+             var cmd = [DSCTL, server_inst, 'bak2db', restore_name];

+             cockpit.spawn(cmd, { superuser: true, "err": "message", "environ": [ENV]}).

+             done(function(data) {

+               $("#restore-spinner").hide();

+               popup_success("The backup has been restored");

+               $("#restore-form").modal('toggle');

+             }).

+             fail(function(data) {

+               $("#restore-spinner").hide();

+               popup_err("Error", "Failed to restore from the backup\n" + data.message);

+             });

            });

          }

        });

@@ -1005,7 +1005,7 @@ 

  

    <!-- Create SASL Mapping -->

    <div class="modal fade" id="sasl-map-form" data-backdrop="static" tabindex="-1" role="dialog" aria-labelledby="sasl-header" aria-hidden="true">

-     <div class="modal-dialog ds-modal-wide">

+     <div class="modal-dialog">

        <div class="modal-content">

          <div class="modal-header">

            <button type="button" class="close" data-dismiss="modal" aria-hidden="true" aria-label="Close">

file modified
+2
@@ -25,6 +25,7 @@ 

  from lib389.cli_conf import health as cli_health

  from lib389.cli_conf import saslmappings as cli_sasl

  from lib389.cli_conf import pwpolicy as cli_pwpolicy

+ from lib389.cli_conf import backup as cli_backup

  from lib389.cli_conf.plugins import memberof as cli_memberof

  from lib389.cli_conf.plugins import usn as cli_usn

  from lib389.cli_conf.plugins import rootdn_ac as cli_rootdn_ac
@@ -77,6 +78,7 @@ 

  cli_automember.create_parser(subparsers)

  cli_sasl.create_parser(subparsers)

  cli_pwpolicy.create_parser(subparsers)

+ cli_backup.create_parser(subparsers)

  

  argcomplete.autocomplete(parser)

  

file modified
+53 -15
@@ -79,9 +79,11 @@ 

      update_newhost_with_fqdn,

      formatInfData,

      ensure_bytes,

-     ensure_str)

+     ensure_str,

+     format_cmd_list)

  from lib389.paths import Paths

  from lib389.nss_ssl import NssSsl

+ from lib389.tasks import BackupTask, RestoreTask

  

  # mixin

  # from lib389.tools import DirSrvTools
@@ -2879,14 +2881,15 @@ 

          if not archive_dir:

              self.log.error("bak2db: backup directory missing")

              return False

+         elif not archive_dir.startswith("/"):

+             archive_dir = os.path.join(self.ds_paths.backup_dir, archive_dir)

  

          try:

-             result = subprocess.check_output([

-                 prog,

-                 'archive2db',

-                 '-a', archive_dir,

-                 '-D', self.get_config_dir()

-             ], encoding='utf-8')

+             cmd = [prog,

+                    'archive2db',

+                    '-a', archive_dir,

+                    '-D', self.get_config_dir()]

+             result = subprocess.check_output(cmd, encoding='utf-8')

          except subprocess.CalledProcessError as e:

              self.log.debug("Command: %s failed with the return code %s and the error %s",

                             format_cmd_list(cmd), e.returncode, e.output)
@@ -2914,17 +2917,16 @@ 

          if archive_dir is None:

              # Use the instance name and date/time as the default backup name

              archive_dir = self.get_bak_dir() + "/" + self.serverid + "-" + datetime.now().strftime("%Y_%m_%d_%H_%M_%S")

-         elif archive_dir[0] != "/":

+         elif not archive_dir.startswith("/"):

              # Relative path, append it to the bak directory

-             archive_dir = self.get_bak_dir() + "/" + archive_dir

+             archive_dir = os.path.join(self.ds_paths.backup_dir, archive_dir)

  

          try:

-             result = subprocess.check_output([

-                 prog,

-                 'db2archive',

-                 '-a', archive_dir,

-                 '-D', self.get_config_dir()

-             ], encoding='utf-8')

+             cmd = [prog,

+                    'db2archive',

+                    '-a', archive_dir,

+                    '-D', self.get_config_dir()]

+             result = subprocess.check_output(cmd, encoding='utf-8')

          except subprocess.CalledProcessError as e:

              self.log.debug("Command: %s failed with the return code %s and the error %s",

                             format_cmd_list(cmd), e.returncode, e.output)
@@ -3409,3 +3411,39 @@ 

          for ent in sorted(ents, key=lambda e: len(e.dn), reverse=True):

              self.log.debug("Delete entry children %s", ent.dn)

              self.delete_ext_s(ent.dn, serverctrls=serverctrls, clientctrls=clientctrls)

+ 

+     def backup_online(self, archive=None, db_type=None):

+         """Creates a backup of the database"""

+ 

+         if archive is None:

+             # Use the instance name and date/time as the default backup name

+             tnow = datetime.now().strftime("%Y_%m_%d_%H_%M_%S")

+             archive = os.path.join(self.ds_paths.backup_dir,

+                                    "%s_%s" % (self.serverid, tnow))

+         elif archive[0] != "/":

+             # Relative path, append it to the bak directory

+             archive = os.path.join(self.ds_paths.backup_dir, archive)

+ 

+         task = BackupTask(self)

+         task_properties = {'nsArchiveDir': archive}

+         if db_type is not None:

+             task_properties['nsDatabaseType'] = db_type

+         task.create(properties=task_properties)

+ 

+         return task

+ 

+     def restore_online(self, archive, db_type=None):

+         """Restores a database from a backup"""

+ 

+         # Relative path, append it to the bak directory

+         if archive[0] != "/":

+             archive = os.path.join(self.ds_paths.backup_dir, archive)

+ 

+         task = RestoreTask(self)

+         task_properties = {'nsArchiveDir': archive}

+         if db_type is not None:

+             task_properties['nsDatabaseType'] = db_type

+ 

+         task.create(properties=task_properties)

+ 

+         return task
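`backup_online` and `restore_online` share the archive-path convention: relative names are resolved against the server's backup directory, and a missing name defaults to `<serverid>_<timestamp>`. A pure-function sketch of that resolution (the helper name is hypothetical):

```python
import os
from datetime import datetime

def resolve_archive(archive, backup_dir, serverid, now=None):
    """Resolve an archive argument the way backup_online does."""
    if archive is None:
        # Default: instance name plus a date/time stamp under the bak dir
        tnow = (now or datetime.now()).strftime("%Y_%m_%d_%H_%M_%S")
        return os.path.join(backup_dir, "%s_%s" % (serverid, tnow))
    if not archive.startswith("/"):
        # Relative path, append it to the bak directory
        return os.path.join(backup_dir, archive)
    return archive
```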

@@ -6,6 +6,7 @@ 

  # See LICENSE for details.

  # --- END COPYRIGHT BLOCK ---

  

+ from datetime import datetime

  import ldap

  from lib389._constants import *

  from lib389.properties import *
@@ -21,6 +22,7 @@ 

  # We need to be a factor to the backend monitor

  from lib389.monitor import MonitorBackend

  from lib389.index import Indexes

+ from lib389.tasks import ImportTask, ExportTask

  

  # This is for sample entry creation.

  from lib389.configurations import get_sample_entries
@@ -561,6 +563,24 @@ 

          return indexes

      # Future: add reindex task for this be.

  

+     def import_ldif(self, ldifs, chunk_size=None, encrypted=False, gen_uniq_id=None, only_core=False,

+                     include_suffixes=None, exclude_suffixes=None):

+         """Do an import of the suffix"""

+ 

+         bs = Backends(self._instance)

+         task = bs.import_ldif(self.rdn, ldifs, chunk_size, encrypted, gen_uniq_id, only_core,

+                               include_suffixes, exclude_suffixes)

+         return task

+ 

+     def export_ldif(self, ldif=None, use_id2entry=False, encrypted=False, min_base64=False, no_uniq_id=False,

+                     replication=False, not_folded=False, no_seq_num=False, include_suffixes=None, exclude_suffixes=None):

+         """Do an export of the suffix"""

+ 

+         bs = Backends(self._instance)

+         task = bs.export_ldif(self.rdn, ldif, use_id2entry, encrypted, min_base64, no_uniq_id,

+                               replication, not_folded, no_seq_num, include_suffixes, exclude_suffixes)

+         return task

+ 

  

  class Backends(DSLdapObjects):

      """DSLdapObjects that represents DN_LDBM base DN
@@ -579,3 +599,75 @@ 

          self._filterattrs = ['cn', 'nsslapd-suffix', 'nsslapd-directory']

          self._childobject = Backend

          self._basedn = DN_LDBM

+ 

+     def import_ldif(self, be_name, ldifs, chunk_size=None, encrypted=False, gen_uniq_id=None, only_core=False,

+                     include_suffixes=None, exclude_suffixes=None):

+         """Do an import of the suffix"""

+ 

+         if not ldifs:

+             self.log.error("import_ldif: LDIF filename is missing")

+             return False

+         ldif_paths = []

+         for ldif in list(ldifs):

+             if not ldif.startswith("/"):

+                 ldif = os.path.join(self._instance.ds_paths.ldif_dir, "%s.ldif" % ldif)

+             ldif_paths.append(ldif)

+ 

+         task = ImportTask(self._instance)

+         task_properties = {'nsInstance': be_name,

+                            'nsFilename': ldif_paths}

+         if include_suffixes is not None:

+             task_properties['nsIncludeSuffix'] = include_suffixes

+         if exclude_suffixes is not None:

+             task_properties['nsExcludeSuffix'] = exclude_suffixes

+         if encrypted:

+             task_properties['nsExportDecrypt'] = 'true'

+         if only_core:

+             task_properties['nsImportIndexAttrs'] = 'false'

+         if chunk_size is not None:

+             task_properties['nsImportChunkSize'] = chunk_size

+         if gen_uniq_id is not None:

+             if gen_uniq_id not in ("none", "empty") and not gen_uniq_id.startswith("deterministic"):

+                 raise ValueError("gen_uniq_id should be 'none' (no unique ID), "

+                                  "'empty' (time-based ID) or 'deterministic namespace' (name-based ID)")

+             task_properties['nsUniqueIdGenerator'] = gen_uniq_id

+ 

+         task.create(properties=task_properties)

+ 

+         return task

+ 

+     def export_ldif(self, be_names, ldif=None, use_id2entry=False, encrypted=False, min_base64=False, no_dump_uniq_id=False,

+                     replication=False, not_folded=False, no_seq_num=False, include_suffixes=None, exclude_suffixes=None):

+         """Do an export of the suffix"""

+ 

+         task = ExportTask(self._instance)

+         task_properties = {'nsInstance': be_names}

+         if ldif is not None:

+             if not ldif.startswith("/"):

+                 ldif = os.path.join(self._instance.ds_paths.ldif_dir, "%s.ldif" % ldif)

+             task_properties['nsFilename'] = ldif

+         else:

+             tnow = datetime.now().strftime("%Y_%m_%d_%H_%M_%S")

+             task_properties['nsFilename'] = os.path.join(self._instance.ds_paths.ldif_dir,

+                                                          "%s_%s_%s.ldif" % (self._instance.serverid,

+                                                                             "_".join(be_names), tnow))

+         if include_suffixes is not None:

+             task_properties['nsIncludeSuffix'] = include_suffixes

+         if exclude_suffixes is not None:

+             task_properties['nsExcludeSuffix'] = exclude_suffixes

+         if use_id2entry:

+             task_properties['nsUseId2Entry'] = 'true'

+         if encrypted:

+             task_properties['nsExportDecrypt'] = 'true'

+         if replication:

+             task_properties['nsExportReplica'] = 'true'

+         if min_base64:

+             task_properties['nsMinimalEncoding'] = 'true'

+         if not_folded:

+             task_properties['nsNoWrap'] = 'true'

+         if no_dump_uniq_id:

+             task_properties['nsDumpUniqId'] = 'false'

+         if no_seq_num:

+             task_properties['nsPrintKey'] = 'false'

+ 

+         task = task.create(properties=task_properties)

+         return task
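When no LDIF name is given, `export_ldif` builds a default target from the server id, the backend names and a timestamp, placed in the server's LDIF directory. The naming rule as a stand-alone sketch (hypothetical helper name):

```python
import os
from datetime import datetime

def default_export_filename(ldif_dir, serverid, be_names, now=None):
    """Compute the default nsFilename used by the export task above."""
    tnow = (now or datetime.now()).strftime("%Y_%m_%d_%H_%M_%S")
    return os.path.join(ldif_dir,
                        "%s_%s_%s.ldif" % (serverid, "_".join(be_names), tnow))
```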

+ 

@@ -92,6 +92,8 @@ 

  def connect_instance(dsrc_inst, verbose):

      dsargs = dsrc_inst['args']

      if '//' not in dsargs['ldapurl']:

+         # Connecting to the local instance

+         dsargs['server-id'] = dsargs['ldapurl']

          # We have an instance name - generate url from dse.ldif

          ldapurl, certdir = get_ldapurl_from_serverid(dsargs['ldapurl'])

          if ldapurl is not None:

@@ -27,10 +27,25 @@ 

  RDN = 'cn'

  

  

+ def _search_backend_dn(inst, be_name):

+     """Return the DN of the backend matching be_name (cn or suffix), or None."""

+     be_name = be_name.lower()

+     be_insts = MANY(inst).list()

+     for be in be_insts:

+         cn = ensure_str(be.get_attr_val('cn')).lower()

+         suffix = ensure_str(be.get_attr_val('nsslapd-suffix')).lower()

+         if cn == be_name or suffix == be_name:

+             return be.dn

+     return None

+ 

+ 

  def backend_list(inst, basedn, log, args):

      if 'suffix' in args:

          result = {"type": "list", "items": []}

-         be_insts = Backends(inst).list()

+         be_insts = MANY(inst).list()

          for be in be_insts:

              if args.json:

                  result['items'].append(ensure_str(be.get_attr_val_utf8_l('nsslapd-suffix')).lower())
@@ -54,22 +69,13 @@ 

  

  

  def backend_create(inst, basedn, log, args):

-     kwargs = _get_attributes(args, Backend._must_attributes)

+     kwargs = _get_attributes(args, SINGULAR._must_attributes)

      _generic_create(inst, basedn, log.getChild('backend_create'), MANY, kwargs, args)

  

  

  def backend_delete(inst, basedn, log, args, warn=True):

-     found = False

-     be_insts = Backends(inst).list()

-     for be in be_insts:

-         cn = ensure_str(be.get_attr_val('cn')).lower()

-         suffix = ensure_str(be.get_attr_val('nsslapd-suffix')).lower()

-         del_be_name = args.be_name.lower()

-         if cn == del_be_name or suffix == del_be_name:

-             dn = be.dn

-             found = True

-             break

-     if not found:

+     dn = _search_backend_dn(inst, args.be_name)

+     if dn is None:

          raise ValueError("Unable to find a backend with the name: ({})".format(args.be_name))

  

      if warn and args.json is False:
@@ -77,6 +83,53 @@ 

      _generic_delete(inst, basedn, log.getChild('backend_delete'), SINGULAR, dn, args)

  

  

+ def backend_import(inst, basedn, log, args):

+     log = log.getChild('backend_import')

+     dn = _search_backend_dn(inst, args.be_name)

+     if dn is None:

+         raise ValueError("Unable to find a backend with the name: ({})".format(args.be_name))

+ 

+     mc = SINGULAR(inst, dn)

+     task = mc.import_ldif(ldifs=args.ldifs, chunk_size=args.chunks_size, encrypted=args.encrypted,

+                           gen_uniq_id=args.gen_uniq_id, only_core=args.only_core, include_suffixes=args.include_suffixes,

+                           exclude_suffixes=args.exclude_suffixes)

+     task.wait()

+     result = task.get_exit_code()

+ 

+     if task.is_complete() and result == 0:

+         log.info("The import task has finished successfully")

+     else:

+         raise ValueError("The import task has failed with the error code: ({})".format(result))

+ 

+ 

+ def backend_export(inst, basedn, log, args):

+     log = log.getChild('backend_export')

+ 

+     # If the user gave a root suffix we need to get the backend CN

+     be_cn_names = []

+     if not isinstance(args.be_names, str):

+         for be_name in args.be_names:

+             dn = _search_backend_dn(inst, be_name)

+             if dn is not None:

+                 mc = SINGULAR(inst, dn)

+                 be_cn_names.append(mc.rdn)

+             else:

+                 raise ValueError("Unable to find a backend with the name: ({})".format(be_name))

+ 

+     mc = MANY(inst)

+     task = mc.export_ldif(be_names=be_cn_names, ldif=args.ldif, use_id2entry=args.use_id2entry,

+                           encrypted=args.encrypted, min_base64=args.min_base64, no_dump_uniq_id=args.no_dump_uniq_id,

+                           replication=args.replication, not_folded=args.not_folded, no_seq_num=args.no_seq_num,

+                           include_suffixes=args.include_suffixes, exclude_suffixes=args.exclude_suffixes)

+     task.wait()

+     result = task.get_exit_code()

+ 

+     if task.is_complete() and result == 0:

+         log.info("The export task has finished successfully")

+     else:

+         raise ValueError("The export task has failed with the error code: ({})".format(result))
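`backend_import` and `backend_export` share a wait-then-check pattern: block on the task, then treat an incomplete task or a non-zero exit code as failure. The check factored as a sketch (in the handlers above the flag and exit code come from the lib389 task object):

```python
def check_task_result(task_name, is_complete, exit_code):
    """Return the success message or raise, mirroring the CLI handlers above."""
    if is_complete and exit_code == 0:
        return "The %s task has finished successfully" % task_name
    raise ValueError("The %s task has failed with the error code: (%s)"
                     % (task_name, exit_code))
```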

+ 

+ 

  def create_parser(subparsers):

      backend_parser = subparsers.add_parser('backend', help="Manage database suffixes and backends")

  
@@ -96,10 +149,62 @@ 

  

      create_parser = subcommands.add_parser('create', help='create')

      create_parser.set_defaults(func=backend_create)

-     populate_attr_arguments(create_parser, Backend._must_attributes)

+     populate_attr_arguments(create_parser, SINGULAR._must_attributes)

  

      delete_parser = subcommands.add_parser('delete', help='deletes the object')

      delete_parser.set_defaults(func=backend_delete)

      delete_parser.add_argument('be_name', help='The backend name or suffix to delete')

  

- 

+     import_parser = subcommands.add_parser('import', help="do an online import of the suffix")

+     import_parser.set_defaults(func=backend_import)

+     import_parser.add_argument('be_name', nargs='?',

+                                help='The backend name or the root suffix where to import')

+     import_parser.add_argument('ldifs', nargs='*',

+                                help="Specifies the filename of the input LDIF files. "

+                                     "When multiple files are imported, they are imported in the order "

+                                     "they are specified on the command line.")

+     import_parser.add_argument('-c', '--chunks_size', type=int,

+                                help="The number of chunks to have during the import operation.")

+     import_parser.add_argument('-E', '--encrypted', action='store_true',

+                                help="Encrypts data during import. This option is used only "

+                                     "if database encryption is enabled.")

+     import_parser.add_argument('-g', '--gen_uniq_id',

+                                help="Generate a unique id. Type none for no unique ID to be generated "

+                                     "and deterministic for the generated unique ID to be name-based. "

+                                     "By default, a time-based unique ID is generated. "

+                                     "When using the deterministic generation to have a name-based unique ID, "

+                                     "it is also possible to specify the namespace for the server to use. "

+                                     "namespaceId is a string of characters "

+                                     "in the format 00-xxxxxxxx-xxxxxxxx-xxxxxxxx-xxxxxxxx.")

+     import_parser.add_argument('-O', '--only_core', action='store_true',

+                                help="Requests that only the core database is created without attribute indexes.")

+     import_parser.add_argument('-s', '--include_suffixes', nargs='+',

+                                help="Specifies the suffixes or the subtrees to be included.")

+     import_parser.add_argument('-x', '--exclude_suffixes', nargs='+',

+                                help="Specifies the suffixes to be excluded.")

+ 

+     export_parser = subcommands.add_parser('export', help='do an online export of the suffix')

+     export_parser.set_defaults(func=backend_export)

+     export_parser.add_argument('be_names', nargs='+',

+                                help="The backend names or the root suffixes from where to export.")

+     export_parser.add_argument('-l', '--ldif',

+                                help="Gives the filename of the output LDIF file. "

+                                     "If more than one is specified, use a space as a separator.")

+     export_parser.add_argument('-C', '--use_id2entry', action='store_true', help="Uses only the main database file.")

+     export_parser.add_argument('-E', '--encrypted', action='store_true',

+                                help="""Decrypts encrypted data during export. This option is used only

+                                        if database encryption is enabled.""")

+     export_parser.add_argument('-m', '--min_base64', action='store_true',

+                                help="Sets minimal base-64 encoding.")

+     export_parser.add_argument('-N', '--no_seq_num', action='store_true',

+                                help="Enables you to suppress printing the sequence number.")

+     export_parser.add_argument('-r', '--replication', action='store_true',

+                                help="Exports the information required to initialize a replica when the LDIF is imported.")

+     export_parser.add_argument('-u', '--no_dump_uniq_id', action='store_true',

+                                help="Requests that the unique ID is not exported.")

+     export_parser.add_argument('-U', '--not_folded', action='store_true',

+                                help="Requests that the output LDIF is not folded.")

+     export_parser.add_argument('-s', '--include_suffixes', nargs='+',

+                                help="Specifies the suffixes or the subtrees to be included.")

+     export_parser.add_argument('-x', '--exclude_suffixes', nargs='+',

+                                help="Specifies the suffixes to be excluded.")

@@ -0,0 +1,54 @@ 

+ # --- BEGIN COPYRIGHT BLOCK ---

+ # Copyright (C) 2018 Red Hat, Inc.

+ # All rights reserved.

+ #

+ # License: GPL (version 3 or any later version).

+ # See LICENSE for details.

+ # --- END COPYRIGHT BLOCK ---

+ 

+ 

+ def backup_create(inst, basedn, log, args):

+     log = log.getChild('backup_create')

+ 

+     task = inst.backup_online(archive=args.archive, db_type=args.db_type)

+     task.wait()

+     result = task.get_exit_code()

+ 

+     if task.is_complete() and result == 0:

+         log.info("The backup create task has finished successfully")

+     else:

+         raise ValueError("The backup create task has failed with the error code: ({})".format(result))

+ 

+ 

+ def backup_restore(inst, basedn, log, args):

+     log = log.getChild('backup_restore')

+ 

+     task = inst.restore_online(archive=args.archive, db_type=args.db_type)

+     task.wait()

+     result = task.get_exit_code()

+ 

+     if task.is_complete() and result == 0:

+         log.info("The backup restore task has finished successfully")

+     else:

+         raise ValueError("The backup restore task has failed with the error code: ({})".format(result))

+ 

+ 

+ def create_parser(subparsers):

+     backup_parser = subparsers.add_parser('backup', help="Manage online backups")

+ 

+     subcommands = backup_parser.add_subparsers(help="action")

+ 

+     create_parser = subcommands.add_parser('create', help="Creates a backup of the database")

+     create_parser.set_defaults(func=backup_create)

+     create_parser.add_argument('archive', nargs='?', default=None,

+                                help="The directory where the backup files will be stored. "

+                                     "The /var/lib/dirsrv/slapd-instance/bak directory is used by default. "

+                                     "The backup file is named according to the year-month-day-hour format.")

+     create_parser.add_argument('-t', '--db-type', default="ldbm database",

+                                help="Database type (default: ldbm database).")

+ 

+     restore_parser = subcommands.add_parser('restore', help="Restores a database from a backup")

+     restore_parser.set_defaults(func=backup_restore)

+     restore_parser.add_argument('archive', help="The directory of the backup files.")

+     restore_parser.add_argument('-t', '--db-type', default="ldbm database",

+                                 help="Database type (default: ldbm database).")

@@ -235,6 +235,66 @@ 

          return abort_task

  

  

+ class ImportTask(Task):

+     """Create the import ldif task

+ 

+     :param instance: The instance

+     :type instance: lib389.DirSrv

+     """

+ 

+     def __init__(self, instance, dn=None):

+         self.cn = 'import_' + Task._get_task_date()

+         dn = "cn=" + self.cn + ",cn=import," + DN_TASKS

+         self._properties = None

+ 

+         super(ImportTask, self).__init__(instance, dn)

+ 

+ 

+ class ExportTask(Task):

+     """Create the export to ldif task

+ 

+     :param instance: The instance

+     :type instance: lib389.DirSrv

+     """

+ 

+     def __init__(self, instance, dn=None):

+         self.cn = 'export_' + Task._get_task_date()

+         dn = "cn=" + self.cn + ",cn=export," + DN_TASKS

+         self._properties = None

+ 

+         super(ExportTask, self).__init__(instance, dn)

+ 

+ 

+ class BackupTask(Task):

+     """Create the backup DB task

+ 

+     :param instance: The instance

+     :type instance: lib389.DirSrv

+     """

+ 

+     def __init__(self, instance, dn=None):

+         self.cn = 'backup_' + Task._get_task_date()

+         dn = "cn=" + self.cn + ",cn=backup," + DN_TASKS

+         self._properties = None

+ 

+         super(BackupTask, self).__init__(instance, dn)

+ 

+ 

+ class RestoreTask(Task):

+     """Create the restore DB task

+ 

+     :param instance: The instance

+     :type instance: lib389.DirSrv

+     """

+ 

+     def __init__(self, instance, dn=None):

+         self.cn = 'restore_' + Task._get_task_date()

+         dn = "cn=" + self.cn + ",cn=restore," + DN_TASKS

+         self._properties = None

+ 

+         super(RestoreTask, self).__init__(instance, dn)

+ 

+ 

  class Tasks(object):

      proxied_methods = 'search_s getEntry'.split()

  

@@ -6,18 +6,21 @@ 

  # See LICENSE for details.

  # --- END COPYRIGHT BLOCK ---

  

+ import os

  import pytest

  

- from lib389.cli_conf.backend import backend_list, backend_get, backend_get_dn, backend_create, backend_delete

+ from lib389.cli_conf.backend import backend_list, backend_get, backend_get_dn, backend_create, backend_delete, backend_export, backend_import

  

  from lib389.cli_base import LogCapture, FakeArgs

  from lib389.tests.cli import topology

+ from lib389.topologies import topology_st

  

  from lib389.utils import ds_is_older

  pytestmark = pytest.mark.skipif(ds_is_older('1.4.0'), reason="Not implemented")

  

+ 

  # Topology is pulled from __init__.py

- def test_backend_cli(topology):

+ def test_basic(topology):

      # 

      args = FakeArgs()

      backend_list(topology.standalone, None, topology.logcap.log, None)
@@ -54,3 +57,44 @@ 

      topology.logcap.flush()

      # Done!

  

+ 

+ def test_import_export(topology_st):

+     BE_NAME = 'userRoot'

+     EXCLUDE_SUFFIX = "ou=Groups,dc=example,dc=com"

+     LDIF_PATH = os.path.join(topology_st.standalone.ds_paths.ldif_dir, "test_import_export.ldif")

+     topology_st.logcap = LogCapture()

+     args = FakeArgs()

+     # Export the backend

+     args.be_names = [BE_NAME]

+     args.ldif = LDIF_PATH

+     args.use_id2entry = None

+     args.encrypted = None

+     args.min_base64 = None

+     args.no_dump_uniq_id = None

+     args.replication = None

+     args.not_folded = None

+     args.no_seq_num = None

+     args.include_suffixes = None

+     args.exclude_suffixes = [EXCLUDE_SUFFIX]

+     backend_export(topology_st.standalone, None, topology_st.logcap.log, args)

+     # Assert the right ldif was created

+     assert os.path.exists(LDIF_PATH)

+     with open(LDIF_PATH, 'r') as ldif:

+         for line in ldif:

+             assert not line.endswith("%s\n" % EXCLUDE_SUFFIX)

+     # Import the backend

+     args.be_name = BE_NAME

+     args.ldifs = [LDIF_PATH]

+     args.chunks_size = None

+     args.encrypted = None

+     args.gen_uniq_id = None

+     args.only_core = None

+     args.include_suffixes = None

+     args.exclude_suffixes = None

+     backend_import(topology_st.standalone, None, topology_st.logcap.log, args)

+     # No error has happened! Done!

+     # Clean up

+     os.remove(LDIF_PATH)

@@ -0,0 +1,31 @@ 

+ import os

+ import pytest

+ import shutil

+ 

+ from lib389.cli_conf.backup import backup_create, backup_restore

+ from lib389.cli_base import LogCapture, FakeArgs

+ from lib389.topologies import topology_st

+ from lib389.utils import ds_is_older

+ pytestmark = pytest.mark.skipif(ds_is_older('1.4.0'), reason="Not implemented")

+ 

+ 

+ def test_basic(topology_st):

+     BACKUP_DIR = os.path.join(topology_st.standalone.ds_paths.backup_dir, "basic_backup")

+     topology_st.logcap = LogCapture()

+     args = FakeArgs()

+     # Clean the backup dir first

+     if os.path.exists(BACKUP_DIR):

+         shutil.rmtree(BACKUP_DIR)

+     # Create the backup

+     args.archive = BACKUP_DIR

+     args.db_type = None

+     backup_create(topology_st.standalone, None, topology_st.logcap.log, args)

+     assert os.listdir(BACKUP_DIR)

+     # Restore the backup

+     args.archive = topology_st.standalone.ds_paths.backup_dir

+     args.db_type = None

+     backup_restore(topology_st.standalone, None, topology_st.logcap.log, args)

+     # No error has happened! Done!

+     # Clean up

+     if os.path.exists(BACKUP_DIR):

+         shutil.rmtree(BACKUP_DIR)
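All four commands in the patch share the same task-completion idiom: start an online task, `wait()` on it, then check `is_complete()` and the exit code. A minimal self-contained sketch of that idiom (the `StubTask` class below is a hypothetical stand-in; the real lib389 `Task` creates and polls an LDAP entry under cn=tasks):

```python
class StubTask:
    """Hypothetical stand-in for lib389's Task; real tasks poll the server."""

    def __init__(self, exit_code=0):
        self._exit_code = exit_code

    def wait(self):
        # The real implementation polls the task entry until it finishes.
        pass

    def is_complete(self):
        return True

    def get_exit_code(self):
        return self._exit_code


def run_task(task, name):
    """Wait for a task and raise on failure, mirroring backend_export/backup_create."""
    task.wait()
    result = task.get_exit_code()
    if task.is_complete() and result == 0:
        return "The {} task has finished successfully".format(name)
    raise ValueError("The {} task has failed with the error code: ({})".format(name, result))
```

With this pattern, every CLI handler reduces to building the task arguments, calling `run_task`, and letting the raised `ValueError` become the CLI's error output.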

Description: the dsconf tool now has:
'dsconf backup create' and 'dsconf backup restore';
'dsconf localhost backend import' and 'dsconf localhost backend export'.
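The subcommand wiring can be sketched with plain `argparse`; this is a self-contained approximation of the `backup` parser from the patch (dsconf's real parser lives in lib389's CLI framework; the `build_parser` wrapper here is illustrative only):

```python
import argparse


def build_parser():
    # Approximates the 'dsconf ... backup create|restore' wiring from the patch.
    parser = argparse.ArgumentParser(prog='dsconf')
    subparsers = parser.add_subparsers(dest='command')

    backup_parser = subparsers.add_parser('backup', help="Manage online backups")
    subcommands = backup_parser.add_subparsers(dest='action')

    create_parser = subcommands.add_parser('create', help="Creates a backup of the database")
    create_parser.add_argument('archive', nargs='?', default=None,
                               help="The directory where the backup files will be stored.")
    create_parser.add_argument('-t', '--db-type', default="ldbm database",
                               help="Database type (default: ldbm database).")

    restore_parser = subcommands.add_parser('restore', help="Restores a database from a backup")
    restore_parser.add_argument('archive', help="The directory of the backup files.")
    restore_parser.add_argument('-t', '--db-type', default="ldbm database",
                                help="Database type (default: ldbm database).")
    return parser


args = build_parser().parse_args(['backup', 'create', '/tmp/bak'])
```

Note that argparse converts `--db-type` to the attribute `args.db_type`, which is why the handlers in the patch read `args.db_type`.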

https://pagure.io/389-ds-base/issue/49858

Reviewed by: ?

This is the CLI part.
I will add the Web UI part next.

For UI part, the backup and restore should be dynamic under the instance "Actions" button dropdown. If the server is running, do the backup/restore via dsconf, if the server is stopped use dsctl. Fyi :-)

Reviewing patch now...

Hey mate
I think this looks excellent, but you are missing test cases.

It's possible to test the CLI framework, and I think especially for backup/import etc., it's critical that we have tests and assertions of its correctness.

Can you please add tests for this?

Thanks!

Support for multiple arguments was added.
Everything works manually with CLI.

Also I've added some basic tests. Import-export passes but backup-restore currently fails with this:

[14/Aug/2018:08:33:51.958866540 -0400] - ERR - dblayer_backup - Log archive error

here ldap/servers/slapd/back-ldbm/dblayer.c:

5937     /* repeat this until the logfile sets match... */
5938     do {
5939         /* get the list of logfiles currently existing */
5940         if (priv->dblayer_enable_transactions) {
5941             return_value = LOG_ARCHIVE(priv->dblayer_env->dblayer_DB_ENV,
5942                                        &listA, DB_ARCH_LOG, (void *)slapi_ch_malloc);
5943             if (return_value || (listA == NULL)) {
5944                 slapi_log_err(SLAPI_LOG_ERR,
5945                               "dblayer_backup", "Log archive error\n");
5946                 if (task) {
5947                     slapi_task_log_notice(task, "Backup: log archive error\n");
5948                 }
5949                 return_value = -1;
5950                 goto bail;
5951             }
5952         } else {
5953             ok = 1;
5954         }

So it looks like there is something wrong with the log or database filenames...

Actually, the CLI lib389 tests don't work at all in the current state. Nobody was maintaining them while new parts were added to lib389/CLI. The setup fails and the logging doesn't work properly.

So as long as the CLI feature I add works manually, I think it is okay to rework the current test suite later. I need to continue with other CLI/WebUI work first...

rebased onto 68de588177a664860b74da071a70cbbf4142efe7

5 years ago

@spichugi I think we should work out why these tests are failing first, because backup and restore are critical to administrative trust in upgrades and rollbacks. We need to guarantee we never have issues in this area because this is possibly the most critical interface for supportability.

Can we work out why these are failing?

@spichugi I think we should work out why these tests are failing first, because backup and restore are critical to administrative trust in upgrades and rollbacks. We need to guarantee we never have issues in this area because this is possibly the most critical interface for supportability.
Can we work out why these are failing?

The DS tasks themselves work. And the CLI/WebUI parts also work (manually).
But our testing topologies and the CLI tests themselves don't work (one error is posted above; we also have issues with JSON and other optional args in FakeArgs).

So we need to rethink the CLI testing infrastructure and I don't want to do it in a rush.
I will create a separate issue for this. And any contributions are welcome :)

rebased onto 75c199ac2fdead5da628f6bf6fb504044c06e2e3

5 years ago

Added a new commit. Please, review

For "export/import" in the UI, when you right-click on a suffix node, that suffix should be autopopulated in the modal. In fact, you should not be allowed to change it (you cannot specify a subtree). So it should either be in the title, or the root suffix field should be read-only.

Making the modal size dynamic has also broken other modal layouts that were expecting it to be a fixed size: new local password policy, and reload schema files from the actions menu(?)

Not sure if it's related, but the "Create Winsync Agreement", "Create SASL Mapping", "Add CA Cert", "Import Cert", etc. modals are on the far left of the screen.

2 new commits added

  • Issue 49858 - Add WebUI part for online backup/restore and import/export
  • Issue 49858 - Add backup/restore and import/export functionality to CLI
5 years ago

For "export/import" in the UI, when you right-click on a suffix node, that suffix should be autopopulated in the modal. In fact, you should not be allowed to change it (you cannot specify a subtree). So it should either be in the title, or the root suffix field should be read-only.
Yes, I agree.

For this, we should have a functioning db-tree, but it is just an HTML template now (with hard-coded field 'id's).
I think it is better to implement the feature in a separate PR because it will be bigger than a couple of lines and my PR is already big enough. Probably we will use jstree here...
I did implement my current part with this idea in mind. So when we implement our db-tree, we can just replace (remove) the existing 'Root suffix' field with something dynamic. The rest of the import/export logic won't change.
I left a 'TODO' comment in the place where we should work out the issue after implementing the db-tree.

Making the modal size dynamic has also broken other modal layouts that were expecting it to be a fixed size: new local password policy, and reload schema files from the actions menu(?)
Not sure if it's related, but the "Create Winsync Agreement", "Create SASL Mapping", "Add CA Cert", "Import Cert", etc. modals are on the far left of the screen.

I really think we should have dynamically sized fields. I did remove the CSS change for now. We can work it out later. I'll create a bunch of tickets for all of the issues mentioned in this PR.

P.S. Actually, I think it will be enough to remove ds-modal-wide and ds-modal. I did it for Create SASL Mapping and it looks good on my machine. We should check why you were not happy with it...

I did it for Create SASL Mapping and it looks good on my machine. We should check why you were not happy with it...

Not sure if it's related, but the "Create Winsync Agreement", "Create SASL Mapping", "Add CA Cert", "Import Cert", etc. modals are on the far left of the screen.

These are not centered on the screen (they are on the left edge of the screen), are they centered for you?

I did it for Create SASL Mapping and it looks good on my machine. We should check why you were not happy with it...
Not sure if it's related, but the "Create Winsync Agreement", "Create SASL Mapping", "Add CA Cert", "Import Cert", etc. modals are on the far left of the screen.

These are not centered on the screen (they are on the left edge of the screen), are they centered for you?

  1. Before my PR they were centered, but "Manage Backups" was not centered and went off the right side of the screen.
    https://pagure.io/389-ds-base/blob/master/f/src/cockpit/389-console/index.html#_291

  2. I changed ds-modal-wide in ds.css to recenter automatically (instead of using hardcoded pixels) and it started to work well for me

  3. But "Create SASL Mapping" went off the screen because of this as you noticed :)

  4. If I remove ds-modal-wide from here
    https://pagure.io/389-ds-base/blob/master/f/src/cockpit/389-console/servers.html#_1008
    "Create SASL Mapping" works well for me and centered.

So I want to open another issue where we can resolve all of the wrong alignments.
My initial thought was to remove ds-modal and ds-modal-wide because the existing cockpit modals work okay without them. But if that doesn't work for you, we should modify them so they'll be automatic and flexible. :)

So I want to open another issue where we can resolve all of the wrong alignments.
My initial thought was to remove ds-modal and ds-modal-wide because the existing cockpit modals work okay without them. But if that doesn't work for you, we should modify them so they'll be automatic and flexible. :)

Okay, this gets my ack, but if you can get it into a single commit that would be nice.

rebased onto c393394

5 years ago

Pull-Request has been merged by spichugi

5 years ago

@spichugi Hey mate, I know you finished and merged this, but would it be possible to add more assertions after you do the restore? Like search for some items, or check that the configuration is still the same as it was?

@spichugi Hey mate, I know you finished and merged this, but would it be possible to add more assertions after you do the restore? Like search for some items, or check that the configuration is still the same as it was?

Sure, I'll add it :) Thanks!

No, thank you! It is really exciting to see you contributing so much more to the project, and I hope my feedback is helping. I look forward to reviewing the next part.

389-ds-base is moving from Pagure to Github. This means that new issues and pull requests
will be accepted only in 389-ds-base's github repository.

This pull request has been cloned to Github as issue and is available here:
- https://github.com/389ds/389-ds-base/issues/2959

If you want to continue to work on the PR, please navigate to the github issue,
download the patch from the attachments and file a new pull request.

Thank you for understanding. We apologize for all inconvenience.

Pull-Request has been closed by spichugi

3 years ago