This deck covers how the SharePoint Online Migration API (in preview at the time of writing) uses Azure to migrate files and metadata from file shares and SharePoint on-premises to SharePoint Online.
5. Speed?
The type of content does impact the rate of ingestion.
Uses back-end resources.
Lots of small, scenario-specific tweaks can help
get the best out of the API.
Preliminary data suggest 5X the speed of CSOM before
throttling.
6. Source → Process Overview → Final Destination (SharePoint Online/ODB)
• Source: a file share, SharePoint on-premises, or
potentially any other data source.
• Create a package in the format the
API is able to accept.
• Use the ingestion power of Azure to
bring the content into the Microsoft
network faster.
• A timer-job-based import, using back-end
resources, brings the content into the final
destination in a scalable way that
will not hurt the service.
7. SPO Management Shell Commands
• New-SPOMigrationPackage (or Export-SPWeb for SharePoint on-premises)
• ConvertTo-SPOMigrationTargetedPackage
• Set-SPOMigrationPackageAzureSource
• Submit-SPOMigrationJob
In short: New → Convert → Set → Submit.
8. What do you need?
• An Azure subscription
• An Azure storage account
• Office 365, with either:
• Existing Active Directory integration between O365 and the on-
premises environment, or
• A normal user added to O365
• The SharePoint Online Management Shell MSI
16. Limitations
• Azure
• Storage account capacity – 500 TB
• Max size of a blob (blob/queue) – 500 TB
• Target throughput for a single blob – up
to 60 MB/s or up to 500 requests per
second
• Max number of blob containers, blobs and
file shares – the only limit is the 500 TB
storage account capacity
• SPO
• Package size – 2 to 4 GB
• File size – 2 GB
• Target site – should remain
inaccessible to end users until the
migration is complete
• SPO limits apply
20. Prerequisites
• An Azure subscription
• An Azure Storage account
• Provision your Office 365 with either:
• Your existing Active Directory Office 365 integration with on-premises
environments, or
• One of the other options for adding accounts to Office 365 ("Add users
to Office 365 for business")
• Download and install the SharePoint Online Management Shell MSI.
Use the Control Panel to uninstall any previous versions.
21. Setup
1. Install the SPO Management Shell.
2. Set up the temp and final folders. Write down the paths.
3. Set up an Azure storage account. Write down the account name and
primary key.
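Steps 2 and 3 above boil down to a couple of lines of PowerShell. The paths below are assumptions chosen to match the script on the next slide; any local folders will do:

```powershell
# Create the temp (raw package) and final (converted package) folders.
# -Force makes this a no-op if the folders already exist.
New-Item -ItemType Directory -Path "C:\FileShareMigTemp" -Force | Out-Null
New-Item -ItemType Directory -Path "C:\FileShareMigFinal" -Force | Out-Null
```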
22. Script
Connect-SPOService -Url "https://<tenant>-admin.sharepoint.com" -Credential "admin@<tenant>.onmicrosoft.com"
$creds = (Get-Credential "admin@<tenant>.onmicrosoft.com")
$sourceFiles = "\\fileserver\rootfolder"
$sourcePackage = "C:\FileShareMigTemp"
$targetPackage = "C:\FileShareMigFinal"
$targetWeb = "https://<destination_web_url>"
$targetDocLib = "<destination_lib_title>"
$azureAccountName = "<azure_storage_account_name>"
$azureAccountKey = "<azure_storage_account_primary_key>"
$azureQueueName = "<any_unique_identifier_for_status>"
Write-Host "Variable setup completed"
# Create a new content package from an on-premises file share
New-SPOMigrationPackage -SourceFilesPath $sourceFiles -OutputPackagePath $sourcePackage -NoADLookup
Write-Host "Successfully created package"
# Convert the content package for your target site
ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath $sourceFiles -SourcePackagePath $sourcePackage -OutputPackagePath $targetPackage -TargetWebUrl $targetWeb -TargetDocumentLibraryPath $targetDocLib -Credentials $creds -NoAzureADLookup
Write-Host "Successfully converted the package for SPO"
# Create Azure containers and upload the package
$al = Set-SPOMigrationPackageAzureSource -SourceFilesPath $sourceFiles -SourcePackagePath $targetPackage -AzureQueueName $azureQueueName -AccountName $azureAccountName -AccountKey $azureAccountKey
$al | fl
Write-Host "Successfully created Azure container"
# Submit the content package data to the site collection
Submit-SPOMigrationJob -TargetWebUrl $targetWeb -MigrationPackageAzureLocations $al -Credentials $creds
Write-Host "Successfully submitted package"
23. Special cases
• New-SPOMigrationPackage: if your local domain is not set up for single
sign-on with Azure, use the parameter -NoADLookup.
• ConvertTo-SPOMigrationTargetedPackage: if the above applies, or you do not
want to map local users to cloud users (because you have different identities),
use the parameter -NoAzureADLookup. If you do want to map, use the
-UserMappingFile parameter. For more details, check the command's help.
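If you go the -UserMappingFile route, the mapping file is a headerless CSV of source identity, target identity and an is-group flag. The identities below are purely illustrative; verify the exact column format with Get-Help ConvertTo-SPOMigrationTargetedPackage:

```csv
contoso\jdoe,jdoe@contoso.onmicrosoft.com,FALSE
contoso\Finance Team,FinanceTeam@contoso.onmicrosoft.com,TRUE
```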
Editor's notes
Microsoft released the new migration API for SharePoint Online in the last week of June. The APIs are in preview and not yet final for production use.
The focus here is taking old file shares and SharePoint on-premises content to the cloud with much more ease.
Intro about myself
In the entire collaboration suite built on SharePoint, there are multiple components involved. Folders/files, and the metadata about those elements, are also important.
I will focus on moving files/folders along with their metadata and security: the author, who modified an item, who it is shared with, and its permissions.
The triangle shows, for each component you are moving, what kind of effort is involved and what the complexity level is.
The general approach recommended by Microsoft is Plan, Prepare, Migrate, Adapt. This is iterative, to get the information correct at the end point.
The migration process can be framed as where we are now versus where we are landing. The plan should focus on what we know and how we can use those skills to migrate the information.
Why these new API tools?
They utilise Azure to move content to the cloud faster, while staying simple enough for any IT pro or dev to migrate information to the cloud.
On investigation, this process is overall 5X faster than CSOM calls.
It is still in preview, though, and otherwise only available through tools like Metalogix, AvePoint and Sharegate.
The impact on the process depends on how many objects are being processed and how large they are. Small objects may be processed faster, while large objects may take more time.
The limitations of the platform are discussed later in this deck.
This is high-level information on how you are going to migrate the files/folders.
Find out whether you are looking to migrate file shares or SharePoint on-premises. This is important to know here, as the temp package depends on which command you execute.
Then you have the temp package, which includes the 8 XML files holding the metadata of the files/folders and their security. You transform this temp package into the final package with the destination location in mind. This again generates 8 XML files, which include the mapping of the files/folders metadata to library-specific information.
Then you submit this final package to Azure Blob storage and the timer job queue. This moves the 8 XML files plus the other content files to Azure Blobs separately and takes a snapshot.
Now you tell SPO to take these files into SharePoint Online or OneDrive for Business. This is done by a timer job, and you can use tools to monitor it.
The commands, and what each of them does.
Prerequisites information.
If you are using a file share, create the temp package with New-SPOMigrationPackage.
If you are looking to move from SharePoint on-premises, create the temp package with Export-SPWeb.
Convert the package for migration; this is the final package.
Submit the files and the other element XML files to Azure. This creates the queue and blob storage automatically and snapshots them before submission.
Submit the package. The timer job automatically picks it up and moves it to the destination, SPO or ODFB.
You can map users if you are synced with on-premises Azure AD, or by defining a user-mapping CSV when converting the final package.
The CSOM APIs are also available for the same job submission to the cloud; you can use PowerShell or CSOM to submit the job.
CSOM is now better on the file size limit, as it supports uploads of files up to 10 GB. The Azure path could be more time-consuming if you are creating many jobs based on the locations to migrate.