Folder Mapping Job
Learn how folder mapping jobs create child jobs to transfer content.
Overview
Folder mapping jobs are ideal for migrations where you want granular control over the transfer without the effort of creating individual jobs. DryvIQ automatically creates a unique job for each folder in your hierarchy, and each of these child jobs inherits its configuration from the parent job. The folder mapping job can be configured and managed like a transfer job, but when executed, it does not transfer data. Instead, each execution creates, modifies, or deletes its child jobs, which are responsible for transferring content. For example, if your source contains three subfolders, the folder mapping job creates a child job for each of those three folders. As new folders are created in the source, additional child jobs are created for them automatically. Data is transferred when the child jobs run.
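To make this behavior concrete, the following is a minimal Python sketch of the reconciliation idea described above. It is illustrative only and does not represent DryvIQ's implementation or API; the function name and folder names are hypothetical.

```python
# Illustrative sketch only -- not DryvIQ's implementation or API.
# It models how each run of a folder mapping (parent) job reconciles
# its child jobs against the folders currently present in the source.

def reconcile_child_jobs(source_folders, existing_child_jobs):
    """Return the child jobs to create and to delete for this run.

    source_folders      -- folder names currently found at the source path
    existing_child_jobs -- folder names that already have a child job
    """
    to_create = [name for name in source_folders if name not in existing_child_jobs]
    to_delete = [name for name in existing_child_jobs if name not in source_folders]
    return to_create, to_delete


# Example: the source has three subfolders; "Legal" was added since the last run.
source_folders = {"Finance", "HR", "Legal"}
existing_child_jobs = {"Finance", "HR"}

create, delete = reconcile_child_jobs(source_folders, existing_child_jobs)
print(create)  # ['Legal']  -> a new child job is created for the new folder
print(delete)  # []         -> no folders were removed, so no child jobs are deleted
```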

Duplicate Folder Names
If there are duplicate folder names, DryvIQ will create a child job only for the first folder it encounters and skip the duplicate. Therefore, verify there are no duplicate folder names before creating your folder mapping job.
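If you want to run this check yourself before creating the job, a short script like the one below can help. It is a minimal sketch that assumes you can export or list the source folder names; the names shown are placeholders.

```python
# Hypothetical pre-check sketch: scan a list of the source folder names
# for duplicates before creating the folder mapping job.
# The folder names below are placeholders, not taken from any real source.
from collections import Counter

folder_names = ["Finance", "HR", "finance", "Legal", "HR"]

# Compare case-insensitively, since many platforms treat names that differ
# only by case as the same folder. (Adjust if your platform is case-sensitive.)
counts = Counter(name.lower() for name in folder_names)
duplicates = [name for name, count in counts.items() if count > 1]

if duplicates:
    print(f"Resolve these duplicate folder names first: {duplicates}")
else:
    print("No duplicate folder names found; safe to create the folder mapping job.")
```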
Performance Considerations
The parent job for a folder mapping job has minimal impact on performance. However, the child jobs it creates are subject to the same performance considerations as any other job type in DryvIQ. If DryvIQ is installed on a single instance, Parallel Writes will be the limiting factor, regardless of how many child jobs are created. In a multi-node scenario, more concurrent jobs can be configured without impacting performance.
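The sketch below illustrates why Parallel Writes remains the limiting factor on a single instance: every child job draws from the same pool of write slots. It is conceptual only; the slot count, job names, and item names are assumptions, not DryvIQ settings.

```python
# Conceptual sketch: on a single node, all child jobs share one pool of
# write slots, so adding more child jobs does not add write throughput.
import threading

PARALLEL_WRITES = 4                       # assumed single-node write limit
write_slots = threading.Semaphore(PARALLEL_WRITES)

def transfer_item(job_name, item):
    with write_slots:                     # every child job competes for the same slots
        print(f"{job_name}: writing {item}")

threads = [
    threading.Thread(target=transfer_item, args=(f"child-job-{j}", f"file-{i}"))
    for j in range(10)                    # ten child jobs...
    for i in range(3)                     # ...each with a few items to write
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```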
Adding Locations
In addition to choosing the source and destination platforms for the Folder Mapping job, you need to specify the impersonation options and paths that should be used to create the child jobs.

Run as user (Impersonation)
The ability to impersonate users is required for folder mapping jobs. Impersonation allows the connection to access all folders on a site. To enable impersonation, the connection must be configured using an administrator account. Not all supported platforms support impersonation. The platforms that do include:
- Box
- Dropbox for Business
- Google Workspace
- Google Team Drives
- Microsoft Office 365
- Microsoft OneDrive for Business
- Microsoft SharePoint
Only connections created for these platforms are listed in the Locations step when creating a Folder Mapping job. Refer to the Platform Comparison within the DryvIQ Platform for the most current support list.
Source and Destination Path
If you wish to transfer all content, leave the source path blank; a child job will be created for every top-level folder. If a folder is selected as the source path, a child job will be created for every subfolder within the selected folder.
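As a rough illustration (not DryvIQ code), the example below shows how the source path selection determines which folders receive child jobs. The folder tree is made up.

```python
# Illustrative sketch of how the source path determines which folders
# get child jobs. The folder tree below is a made-up example.
folder_tree = {
    "/": ["Departments", "Projects"],
    "/Departments": ["Finance", "HR", "Legal"],
    "/Projects": ["Apollo", "Zephyr"],
}

def folders_needing_child_jobs(source_path):
    """Blank source path -> every top-level folder gets a child job.
    A selected folder   -> every subfolder within it gets a child job."""
    return folder_tree["/" if not source_path else source_path]

print(folders_needing_child_jobs(""))              # ['Departments', 'Projects']
print(folders_needing_child_jobs("/Departments"))  # ['Finance', 'HR', 'Legal']
```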
Child Job Source and Destination Path
This setting determines which directory within each folder will be used as the source for the child job (see the sketch after the options below).
- Target the root of each folder: The child job will use the first-level folder relative to the source path as its source.
- Target a specific directory within each folder: If the same subfolder exists within every folder, you can specify it with this option, and the child job will use that subfolder as its source.
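The following minimal sketch contrasts the two options. It is illustrative only, and the "Shared Documents" subfolder name is just an assumed example of a directory that exists within every folder.

```python
# Illustrative sketch of the two targeting options for the child job source.
def child_job_source(folder_path, target_directory=None):
    """Target the root of each folder when target_directory is None;
    otherwise target that specific directory within each folder."""
    if target_directory is None:
        return folder_path
    return f"{folder_path.rstrip('/')}/{target_directory}"

print(child_job_source("/Departments/Finance"))
# /Departments/Finance                    -> root of the folder

print(child_job_source("/Departments/Finance", "Shared Documents"))
# /Departments/Finance/Shared Documents   -> specific directory within the folder
```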
Setting the Job Schedule
You will be prompted to schedule the job after configuring policies, behaviors, and advanced features. This schedule will be applied to the child jobs. The parent job will run immediately to create the child jobs. After the child jobs are created, the default schedule is set to run every six hours to review the source for any new content. You can edit the schedules for the parent and child jobs anytime.

Viewing Child Jobs
When viewing the folder mapping job, go to the Child Jobs page to view all the created child jobs.

Deleting Child Jobs
If you delete a child job from a folder mapping job, it will eventually be recreated when the parent job runs again. However, the child job will not be recreated until the “Delete Pending Jobs” system job has run. You can filter the Jobs list to show only the system jobs and run this job manually. Once it completes, the deleted child job will be recreated the next time the parent job runs.