Thursday, July 31, 2014

Uploading Large Documents into SharePoint Online with REST, CSOM, and RPC using C#


There are many articles with great examples of how to upload documents to SharePoint Online using jQuery and REST. These are useful for getting around the message size limitation of CSOM/JSOM when uploading documents; this limit is not configurable in SharePoint Online. However, there are few examples of how to upload large documents using C#. In this blog post I will show you how to use C# with the SharePoint REST API, the managed CSOM, and RPC to upload large documents (up to 2 GB) to SharePoint Online. There are a few things you need to take care of to get each of these approaches to work with SharePoint Online.

Credentials and Cookie Containers

In the code examples below, both REST and RPC use the HttpWebRequest class to communicate with SharePoint. When using this class from C# you must set the Credentials and CookieContainer properties of the HttpWebRequest object. The following helper methods create the Microsoft.SharePoint.Client.SharePointOnlineCredentials and get the System.Net.CookieContainer for those credentials.

public static class Utils
{
    public static CookieContainer GetO365CookieContainer(SharePointOnlineCredentials credentials, string targetSiteUrl)
    {
        Uri targetSite = new Uri(targetSiteUrl);
        string cookieString = credentials.GetAuthenticationCookie(targetSite);
        CookieContainer container = new CookieContainer();
        string trimmedCookie = cookieString.TrimStart("SPOIDCRL=".ToCharArray());
        container.Add(new Cookie("FedAuth", trimmedCookie, string.Empty, targetSite.Authority));
        return container;
    }

    public static SharePointOnlineCredentials GetO365Credentials(string userName, string passWord)
    {
        SecureString securePassWord = new SecureString();
        foreach (char c in passWord.ToCharArray()) securePassWord.AppendChar(c);
        SharePointOnlineCredentials credentials = new SharePointOnlineCredentials(userName, securePassWord);
        return credentials;
    }
}
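As a quick illustration, here is how these helpers might be wired into an HttpWebRequest before making any call to SharePoint Online. The site URL and account below are placeholders, not real values:

// Hypothetical usage of the helpers above; URL and login are placeholders.
SharePointOnlineCredentials creds =
    Utils.GetO365Credentials("user@contoso.onmicrosoft.com", "password");
CookieContainer cookies =
    Utils.GetO365CookieContainer(creds, "https://contoso.sharepoint.com/sites/dev");

HttpWebRequest req = HttpWebRequest.Create(
    "https://contoso.sharepoint.com/sites/dev/_api/web") as HttpWebRequest;
req.Credentials = creds;        // authenticates the request
req.CookieContainer = cookies;  // carries the claims cookie SharePoint Online expects

Note that the cookie is tied to the site you passed to GetAuthenticationCookie, so the container should be rebuilt if you target a different site collection.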

Uploading Large Documents With REST


The following code takes the site URL, a document library title, and the path to a local file, and adds the file to the root folder of the library. If you want to use folders you can modify this code to handle them. The REST call requires a form digest value to be set, so I have included the code that makes a REST call to /_api/contextinfo to get it. Make sure to set the timeout on the HttpWebRequest to about 10 minutes, because large files will exceed the default timeout of 100 seconds. Ten minutes should be adequate to cover the unpredictable upload speeds of ISPs and SharePoint Online.

public static void UploadRest(string siteUrl, string libraryName, string filePath)
{
    byte[] binary = IO.File.ReadAllBytes(filePath);
    string fname = IO.Path.GetFileName(filePath);
    string result = string.Empty;
    string resourceUrl = siteUrl + "/_api/web/lists/getbytitle('" + libraryName + "')/rootfolder/files/add(url='" + fname + "',overwrite=true)";

    HttpWebRequest wreq = HttpWebRequest.Create(resourceUrl) as HttpWebRequest;
    wreq.UseDefaultCredentials = false;
    SharePointOnlineCredentials credentials = Utils.GetO365Credentials("your login", "your password");
    wreq.Credentials = credentials;
    wreq.CookieContainer = Utils.GetO365CookieContainer(credentials, siteUrl);

    string formDigest = GetFormDigest(siteUrl, credentials, wreq.CookieContainer);
    wreq.Headers.Add("X-RequestDigest", formDigest);
    wreq.Method = "POST";
    wreq.Timeout = 1000000; // large files exceed the 100-second default
    wreq.Accept = "application/json; odata=verbose";
    wreq.ContentLength = binary.Length;

    using (IO.Stream requestStream = wreq.GetRequestStream())
    {
        requestStream.Write(binary, 0, binary.Length);
    }

    WebResponse wresp = wreq.GetResponse();
    using (IO.StreamReader sr = new IO.StreamReader(wresp.GetResponseStream()))
    {
        result = sr.ReadToEnd();
    }
}
public static string GetFormDigest(string siteUrl, ICredentials credentials, CookieContainer cc)
{
    string formDigest = null;
    string resourceUrl = siteUrl + "/_api/contextinfo";
    HttpWebRequest wreq = HttpWebRequest.Create(resourceUrl) as HttpWebRequest;

    wreq.Credentials = credentials;
    wreq.CookieContainer = cc;
    wreq.Method = "POST";
    wreq.Accept = "application/json;odata=verbose";
    wreq.ContentLength = 0;
    wreq.ContentType = "application/json";
    string result;
    WebResponse wresp = wreq.GetResponse();

    using (IO.StreamReader sr = new IO.StreamReader(wresp.GetResponseStream()))
    {
        result = sr.ReadToEnd();
    }

    var jss = new JavaScriptSerializer();
    var val = jss.Deserialize<Dictionary<string, object>>(result);
    var d = val["d"] as Dictionary<string, object>;
    var wi = d["GetContextWebInformation"] as Dictionary<string, object>;
    formDigest = wi["FormDigestValue"].ToString();

    return formDigest;
}

Uploading Large Documents with CSOM


At one time I thought you could not do this with CSOM; however, fellow MVP Joris Poelmans brought to my attention that the AMS sample Core.LargeFileUpload from the O365 Development Patterns and Practices was able to upload files larger than 3 MB. This only works if you set the FileCreationInformation ContentStream property to an open stream on the file. It gets around the message size limit of CSOM because ContentStream uses MTOM optimizations and sends the raw binary rather than a base64-encoded binary. This is much more efficient and faster than the other methods. It appears to be a later change to CSOM, optimized for SharePoint Online. The CSOM code does not need a cookie container. I also tried the File.SaveBinaryDirect method, but I received a "Cannot Invoke HTTP Dav Request" error since this is not supported in SharePoint Online.

public static void UploadDocumentContentStream(string siteUrl, string libraryName, string filePath)
{
    ClientContext ctx = new ClientContext(siteUrl);
    ctx.RequestTimeout = 1000000;
    ctx.Credentials = Utils.GetO365Credentials("your login", "your password");
    Web web = ctx.Web;

    using (IO.FileStream fs = new IO.FileStream(filePath, IO.FileMode.Open))
    {
        FileCreationInformation flciNewFile = new FileCreationInformation();

        // This is the key difference - using the ContentStream property
        flciNewFile.ContentStream = fs;
        flciNewFile.Url = IO.Path.GetFileName(filePath);
        flciNewFile.Overwrite = true;

        List docs = web.Lists.GetByTitle(libraryName);
        Microsoft.SharePoint.Client.File uploadFile = docs.RootFolder.Files.Add(flciNewFile);

        ctx.Load(uploadFile);
        ctx.ExecuteQuery();
    }
}

Uploading Large Documents with RPC


RPC still lives and is supported in SharePoint Online. The code below is simplified. RPC can be hard to understand because the syntax for the different parameters dates from years ago; RPC is basically an HTTP POST to a C++ DLL (author.dll). It can be fast, but it was not faster than CSOM. The parameters and the binary must be combined into one byte array, separated by a line feed, before posting. The libraryName parameter cannot be the title of the document library; it must be the library's actual URL. For example, instead of Documents you must use Shared Documents. You will notice that many of the parameters are URL-encoded because RPC is very particular about characters in the URL. Finally, note that the code feeds the byte array to the request stream in chunks. This helps prevent triggering SharePoint Online throttling limits.

public static void UploadDocumentRPC(string siteUrl, string libraryName, string filePath)
{
    string method = HttpUtility.UrlEncode("put document:14.0.2.5420");
    string serviceName = HttpUtility.UrlEncode(siteUrl);
    string document = HttpUtility.UrlEncode(libraryName + "/" + IO.Path.GetFileName(filePath));
    string metaInfo = string.Empty;
    string putOption = "overwrite";
    string keepCheckedOutOption = "false";
    string putComment = string.Empty;
    string result = string.Empty;

    string fpRPCCallStr = "method={0}&service_name={1}&document=[document_name={2};meta_info=[{3}]]&put_option={4}&comment={5}&keep_checked_out={6}";
    fpRPCCallStr = String.Format(fpRPCCallStr, method, serviceName, document, metaInfo, putOption, putComment, keepCheckedOutOption);

    // The RPC command and the file binary are combined into one byte array,
    // separated by a line feed.
    byte[] fpRPCCall = System.Text.Encoding.UTF8.GetBytes(fpRPCCallStr + "\n");
    byte[] postData = IO.File.ReadAllBytes(filePath);
    byte[] data;

    if (postData != null && postData.Length > 0)
    {
        data = new byte[fpRPCCall.Length + postData.Length];
        fpRPCCall.CopyTo(data, 0);
        postData.CopyTo(data, fpRPCCall.Length);
    }
    else
    {
        data = new byte[fpRPCCall.Length];
        fpRPCCall.CopyTo(data, 0);
    }

    HttpWebRequest wReq = WebRequest.Create(siteUrl + "/_vti_bin/_vti_aut/author.dll") as HttpWebRequest;
    SharePointOnlineCredentials credentials = Utils.GetO365Credentials("your login", "your password");
    wReq.Credentials = credentials;
    wReq.CookieContainer = Utils.GetO365CookieContainer(credentials, siteUrl);
    wReq.Method = "POST";
    wReq.Timeout = 1000000;
    wReq.ContentType = "application/x-vermeer-urlencoded";
    wReq.Headers.Add("X-Vermeer-Content-Type", "application/x-vermeer-urlencoded");
    wReq.ContentLength = data.Length;

    using (IO.Stream requestStream = wReq.GetRequestStream())
    {
        int chunkSize = 2097152; // 2 MB chunks
        int tailSize;
        int chunkNum = Math.DivRem(data.Length, chunkSize, out tailSize);

        for (int i = 0; i < chunkNum; i++)
        {
            requestStream.Write(data, chunkSize * i, chunkSize);
        }

        if (tailSize > 0)
            requestStream.Write(data, chunkSize * chunkNum, tailSize);
    }

    WebResponse wresp = wReq.GetResponse();
    using (IO.StreamReader sr = new IO.StreamReader(wresp.GetResponseStream()))
    {
        result = sr.ReadToEnd();
    }
}

Three Ways of Uploading Large Documents to SharePoint Online


All of the above code examples are good ways to upload large documents to SharePoint Online. All of them use the Client Object Model to create the credentials and cookie that SharePoint Online requires; getting the cookie is rather complicated without it. All three methods require that you set the request timeout to a large value because uploading to SharePoint Online is much slower than to SharePoint on-premises. Experiment with the code samples. I tested them with 200 MB files and CSOM was the fastest, but your results may vary. I like variety and having multiple ways of accomplishing a task.
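To try the three approaches side by side, a small console driver along these lines would work. The site URL, library names, and file path are placeholders; note that RPC takes the library URL (Shared Documents) while REST and CSOM take the library title (Documents):

// Hypothetical driver; all values below are placeholders.
static void Main(string[] args)
{
    string site = "https://contoso.sharepoint.com/sites/dev";
    string file = @"C:\temp\LargeFile.bin";

    UploadRest(site, "Documents", file);                  // REST: needs a form digest
    UploadDocumentContentStream(site, "Documents", file); // CSOM: streams via ContentStream
    UploadDocumentRPC(site, "Shared Documents", file);    // RPC: library URL, not title
}

Timing each call (for example with System.Diagnostics.Stopwatch) is an easy way to reproduce the comparison described above in your own tenant.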

12 comments:

Anonymous said...

how to do it in sharepoint 2010

Petri Koskela said...

Excellent CSOM code, I have tried two days upload big files by CSOM and C# code without success. With this example code everything works in our environment. Perfect !

ashlu said...

I've been looking at the REST method, I've tried it with 310mb file and the file uploads after 15 minutes but I dont get a response.
WebResponse wresp = wreq.GetResponse();

It just hangs, I've increased the timeout to an hour. If the file takes 15 minutes to upload, it shouldnt take that much longer for the response should it?

Anonymous said...

Hello, is it possible to upload a file and then for example, rename it, all in a batch operation?

Anonymous said...

Hi Steve, Thank you for this article, I have used your CSOM option and test and upload a 12 MB file, it uploads in about 30 seconds.

Jeremy Ostendorf said...

I am trying to use your REST example but am getting a 403 error when trying to POST : "http://sharepointUrl/_api/contextinfo" : to get the context info. Have you seen this?

Steve Curran said...

Jeremy, if anonymous access is turned on then posting will return a 403. It is my understanding that REST does not work with anonymous sites.

Jeremy Ostendorf said...

I figured out the issue. SPO recently changed how you need to authenticate. You will need to change your cookie container code above. The Authentication process has been changed for SharePoint Online and now requires "SPOIDCRL" intead of "FedAuth".

public static CookieContainer GetO365CookieContainer(SharePointOnlineCredentials credentials, string targetSiteUrl)
{

Uri targetSite = new Uri(targetSiteUrl);
string cookieString = credentials.GetAuthenticationCookie(targetSite);
CookieContainer container = new CookieContainer();
string trimmedCookie = cookieString.TrimStart("SPOIDCRL=".ToCharArray());
container.Add(new Cookie("SPOIDCRL", trimmedCookie, string.Empty, targetSite.Authority));
return container;


}

On the same topic, I had been using a similar process to load larger files into SharePointOnline, but it stopped working on 2/6. This is when I stumbled upon your post. I like the idea of using the CSOM functionality, but it doesn't allow you to work with base64.

This also works for large files:

public DocumentResponse AddDocumentByBase64String(string sListName, string sDocName, string sUrl, string InData)
{
Uri u = new Uri(ConfigurationManager.AppSettings["SharepointLoginUri"]);
byte[] data = Convert.FromBase64String(InData);
string sSharePointUrl = sUrl + "/" + sListName + "/" + sDocName;

CookieContainer c = new CookieContainer();
c = bpfHelper.GetAuthenticationCookies(sSharePointUrl, u);

System.Net.ServicePointManager.Expect100Continue = false;
HttpWebRequest request = HttpWebRequest.Create(sSharePointUrl) as HttpWebRequest;
request.Method = "PUT";
request.Accept = "*/*";
request.ContentType = "multipart/form-data; charset=utf-8";
request.CookieContainer = c;
request.UserAgent = "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)";
request.Headers.Add("Accept-Language", "en-us");
request.Headers.Add("Translate", "F"); request.Headers.Add("Cache-Control", "no-cache"); request.ContentLength = data.Length;
request.AllowWriteStreamBuffering = false;


using (Stream req = request.GetRequestStream())
{ req.Write(data, 0, data.Length); }

HttpWebResponse response = (HttpWebResponse)request.GetResponse();
DocumentResponse dr = new DocumentResponse();
dr.AbsolutePath = response.ResponseUri.AbsolutePath.ToString();
dr.AbsoluteUri = response.ResponseUri.AbsoluteUri.ToString();
dr.StatusCode = response.StatusCode.ToString();
return dr;
}

Syed Shoaib Adil said...

hi,

I used same thing "Upload a document through RPC" but i want to enable/disbale event recievers after/before the upload. i tried with following code but its not working.

EventHandlerHelper eventHelper = new EventHandlerHelper();
eventHelper.CustomDisableEventFiring();

can you help me in this regard?

Also i just wanted to check, will it (RPC) call work if we didnt give the correct SERVER EXTENSION/VERSION in
string method = HttpUtility.UrlEncode("put document:14.0.2.5420");

Gil Roitto said...

Thank you Jeremy. I got 403 forbidden using the old GetO365CookieContainer method. Testing with Postman got me further astray since I got 403 forbidden but of different reasons. I only wish SharePoint some day would provide better / more detailed error messages.

Animesh said...

Hi, in my case, I need to update the PublishingPageContent which is larger than 2mb using CSOM.

ListItam page = GetMyPageItem();
page["PublishingPageContent"] =

Any help will be great!

prasuna chowdry said...

I implemented this and I can upload /download files using the authentication given, But is there anything like sharepointonline credintails to built in c++, I want to USE c++ basic intenetopen and request commands in wininet libraries in c++, Would there be any pointers for the same

Thanks for your response
