Transfer Job Configuration Options Using REST API

Learn how to configure the transfer block when creating a migration job using the DryvIQ REST API.

Written by Andrea Harvey

Updated at April 30th, 2025


Table of Contents

  • Usage
  • Options

Usage

To use the options listed below, add the option to the "transfer": { } block when creating or editing a job using the REST API. Each colon (:) in an option key indicates a nested property. For example, to use source:size_estimate:bytes, include it as:

POST v1/jobs
{
    "name": "test copy",
    "kind": "transfer",
    "transfer": {
        "transfer_type": "copy",
        "source": {
            "connection": {
                "id": "5dc531df34554edd96c31272262ad950"
            },
            "target": {
                "path": "/C/data"
            },
            "size_estimate": {
                "bytes": 10240
            }
        },
        "destination": {
            "connection": {
                "id": "bb44a17816004f2c9fa763a347d7ebbc"
            },
            "target": {
                "path": "/Documents/test"
            }
        }
    },
    "schedule": {
        "mode": "manual"
    }
}
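Since each colon maps to one level of nesting, colon-delimited option keys can be expanded into the request body mechanically. A minimal Python sketch of that expansion (expand_option and deep_merge are hypothetical helper names for illustration, not part of any DryvIQ SDK):

```python
import json

def expand_option(key, value):
    """Expand a colon-delimited option key such as
    'source:size_estimate:bytes' into a nested dict."""
    parts = key.split(":")
    result = current = {}
    for part in parts[:-1]:
        current[part] = {}
        current = current[part]
    current[parts[-1]] = value
    return result

def deep_merge(base, extra):
    """Recursively merge 'extra' into 'base' so several
    expanded options can share one transfer block."""
    for k, v in extra.items():
        if isinstance(v, dict) and isinstance(base.get(k), dict):
            deep_merge(base[k], v)
        else:
            base[k] = v
    return base

transfer = {"transfer_type": "copy"}
deep_merge(transfer, expand_option("source:size_estimate:bytes", 10240))
print(json.dumps(transfer))
# → {"transfer_type": "copy", "source": {"size_estimate": {"bytes": 10240}}}
```

Merging expanded keys into an existing dict rather than assigning them directly keeps sibling options (for example, source:connection and source:size_estimate:bytes) from overwriting each other.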

The following example shows how to use empty_containers with a setting of skip, in conjunction with an exclusion filter.

POST v1/jobs
{
    "name": "Copy Job Skip Empty Folders",
    "kind": "transfer",
    "transfer": {
        "audit_level": "trace",
        "transfer_type": "copy",
        "empty_containers": "skip",
        "filter": {
            "source": [{
                "action": "exclude",
                "rules": [{
                    "extensions": [
                        "wav",
                        "jpg"
                    ],
                    "type": "filter_extension"
                }],
                "type": "filter_rule"
            }]
        },
        "source": {
            "connection": { "id": "{{nfs_connection}}" },
            "target": {
                "path": "/EmptyTest_Source"
            }
        },
        "destination": {
            "connection": { "id": "{{cloud_connection}}" },
            "target": {
                "path": "/EmptyTest_Destination"
            }
        }
    },
    "schedule": {
        "mode": "manual"
    }
}
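The performance options described in the table below follow the same structure. The following sketch throttles upload bandwidth using performance:upload:bytes_per_second per the options table (the connection IDs and paths are placeholders, and 1048576 bytes per second is roughly 1 MB/s):

POST v1/jobs
{
    "name": "Throttled Copy",
    "kind": "transfer",
    "transfer": {
        "transfer_type": "copy",
        "performance": {
            "upload": {
                "bytes_per_second": 1048576
            }
        },
        "source": {
            "connection": { "id": "{{source_connection}}" },
            "target": { "path": "/Data" }
        },
        "destination": {
            "connection": { "id": "{{destination_connection}}" },
            "target": { "path": "/Data" }
        }
    },
    "schedule": {
        "mode": "manual"
    }
}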

Options

Each option below is listed with its description and notes, accepted values, and default value (where applicable).

transfer_type
    The default transfer type to use for jobs when not specified.
    Values: default, sync, publish, move, migrate, copy, taxonomy (copy folder structure)
    Default: sync

source:type
    Values: directory, file, control_file, control_endpoint, custom

source:event_position
    Not customer-facing (read-only).
    Values: string

source:size_estimate:count
    Estimated number of files on the source; used to estimate job progress.
    Values: long

source:size_estimate:bytes
    Estimated size of the source in bytes; used to estimate job progress.
    Values: long

source:connection
    Values: connection

source:impersonate_as
    Values: AccountDefinition

source:target:path
    Values: string

source:target:uri
    Values: string

source:target:item
    Values: PlatformItemID

source:authenticate
    Values: true, false

source:options
    Values: custom
destination:event_position, destination:connection, destination:impersonate_as, destination:target, destination:authenticate, destination:options
    Same as the corresponding source options.

performance:retries
    Recovery policy.
    Values: integer
    Default: (null)

performance:parallel_writes:requested
    The default number of parallel writes to use during transfer execution.
    Values: integer
    Default: 2

performance:parallel_writes:max
    Values: integer

performance:upload:bytes_per_second
    Bandwidth throttling.
    Values: long
performance:upload:window
    The window time marks the start of the throttling window; midnight marks the end. Times are relative to when the job is started. Note: the window definition is subject to change.
    Values: Array of {time: {hr: int, min: int, sec: int, ms: int}, bytes_per_second: long}

performance:download:bytes_per_second
    Values: long

performance:download:window
    Values: Array of {time: {hr: int, min: int, sec: int, ms: int}, bytes_per_second: long}
audit_level
    The default audit level to use for transfer jobs.
    Values: none, trace, debug, info, warn, error
    Default: info

failure_policy
    The default failure policy to use for transfer jobs.
    Values: continue, halt
    Default: continue

rendition
    The default rendition selection policy to use for transfer jobs.
    Values: original, rendition
    Default: original

batch_mode
    The default batch mode usage policy to use for transfer jobs.
    Values: none, initial, always
    Default: always

permissions
    The default permission preservation policy to use for transfer jobs.
    none: Permissions are not transferred.
    add: Permissions are added only; existing permissions on the destination are not touched.
    diff: Permissions are reconciled/synced (i.e., if the destination has more permissions than the source, the extra permissions are removed).
    Values: none, add, diff
    Default: none

preserve_owners
    The default audit trail preservation option.
    Values: true, false
    Default: false

restricted_content
    The default restricted content (restricted extensions) handling policy.
    Values: fail, warn, skip, convert
    Default: convert
large_item
    The default large file handling policy.
    fail: Fail and add to the audit log as a failure.
    skip: Skip and add to the audit log; if the audit level is set to low, the skip entry is ignored.
    Values: fail, skip
    Default: fail

item_overwrite
    The default item overwrite policy.
    Values: fail, skip, overwrite
    Default: overwrite

segment_transform
    The default flag indicating if segment transformation is enabled.
    Values: true, false
    Default: true

encode_invalid_characters
    Encodes invalid characters instead of replacing them with an underscore. The UTF-8 bytes for the invalid characters are converted to a hex string. Example: 123白雜.txt would be converted to 123E799BDE99B9CE8.txt.
    Values: false, true
    Default: false
filter:source
    Values: complex, with options for ContentFilter, SharedItemFilter, HiddenItemFilter, DateRangeFilter, PatternContentFilter, SizeRangeFilter, PredefinedContentFilter, MetadataContentFilter

filter:destination
    Values: 1

tracking:detection
    The default change tracking policy.
    Values: none, native, crawl
    Default: native

tracking:reset:on_increment
    The default number of executions before resetting change tracking state.
    Values: long
    Default: (null)

tracking:reset:on_interval
    The default interval before resetting change tracking state.
    Values: {value: double, unit: d|h|m|s|ms|us|ns}
    Default: (null)

conflict_resolution
    The default conflict resolution policy to use for transfer jobs.
    Values: copy, latest, source, destination, failure
    Default: copy

delete_propagation
    The default delete propagation policy to use for transfer jobs.
    Values: mirror, ignore_source, ignore_destination, ignore_both
    Default: ignore_both

duplicate_names
    The default duplicate name resolution policy to use for transfer jobs (i.e., Google native docs come back as duplicate files).
    Values: warn, rename
    Default: rename
empty_containers
    The default empty container policy to use for transfer jobs. Applies to empty folders and to folders whose content has been entirely filtered out.
    Values: create, skip
    Default: create

versioning:preserve
    The default version preservation policy to use for transfer jobs.
    Values: none, native
    Default: native

versioning:select
    The default version selection policy to use for transfer jobs.
    Values: all, published, unpublished, latest (version.preserve=none)
    Default: all

versioning:from_source
    The default number of versions to maintain on the source platform.
    Values: integer
    Default: (null)

versioning:from_destination
    The default number of versions to maintain on the destination platform.
    Not all platforms support version deletes. When a specific transfer value is set and the destination platform doesn't support version deletes, DryvIQ uses the following logic to determine how it handles transferring the versions:
      • If the file doesn't exist on the destination, DryvIQ respects the version limit set and only transfers the set number of versions during the initial copy/migration.
      • If the file exists on the destination, DryvIQ migrates all new versions of the file from the source to the destination, even if this exceeds the file version limit set on the version count. This ensures all new content is transferred. DryvIQ logs a warning to inform the user that the transfer took place and caused the version count to be exceeded.
    Values: integer
    Default: (null)

lock_propagation
    The default lock propagation option.
    Values: ignore, mirror_owner, mirror_lock
    Default: ignore

timestamps
    The default timestamp preservation policy to use for transfer jobs.
    Values: true, false
    Default: true

trust_mode
    When a file already exists on the destination, DryvIQ assumes the source and destination files are the same.
    Values: true, false
metadata_map:schemas
    Values: Array of {
        source: { id: string },
        destination: { id: string },
        default: bool,
        mappings: Array of {
            source: { property: { name: string }},
            destination: { property: { name: string }},
            choices: Array of {
                source: { name: string, value: string },
                destination: { name: string, value: string }
            }
        }
    }

account_map
    Values: AccountMap

group_map
    Values: GroupMap

metadata_import
    Values: PropertyValueImportSpecification

permissions_import
    Values: PermissionsImportSpecification
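Many of the options above accept only a fixed set of values, so a transfer block can be sanity-checked client-side before submitting the job. A minimal Python sketch (the value sets are transcribed from the table above; this helper is illustrative and not part of any DryvIQ tooling):

```python
# Allowed values for some of the enumerated transfer options above.
ENUM_OPTIONS = {
    "transfer_type": {"default", "sync", "publish", "move", "migrate", "copy", "taxonomy"},
    "audit_level": {"none", "trace", "debug", "info", "warn", "error"},
    "failure_policy": {"continue", "halt"},
    "batch_mode": {"none", "initial", "always"},
    "empty_containers": {"create", "skip"},
    "item_overwrite": {"fail", "skip", "overwrite"},
    "conflict_resolution": {"copy", "latest", "source", "destination", "failure"},
    "delete_propagation": {"mirror", "ignore_source", "ignore_destination", "ignore_both"},
}

def invalid_options(transfer_block):
    """Return {key: bad_value} for enumerated options with unrecognized values."""
    return {
        key: value
        for key, value in transfer_block.items()
        if key in ENUM_OPTIONS and value not in ENUM_OPTIONS[key]
    }

bad = invalid_options({"transfer_type": "copy", "empty_containers": "delete"})
print(bad)  # → {'empty_containers': 'delete'}
```

Catching a misspelled value locally avoids a round trip to the API; keys not listed in ENUM_OPTIONS are simply ignored by this check.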

 


 


Related Articles

  • Connection Management Using REST API
  • Connection Pools Using REST API
  • LDAP Account and Group Maps Using REST API

Copyright 2025 – DryvIQ, Inc.
