André Krijnen


SharePoint: JavaScript from asynchronous to synchronous with Deferred and Promise

by André Krijnen on Jun.14, 2016, under blog, javascript, SharePoint

Over the last few years it has become more and more common to use JavaScript in our SharePoint projects. Whether it's Angular, Knockout or another preferred framework, they are all very usable. But sometimes JavaScript can be a pain to use, for example when you need to handle data synchronously in a Custom Action. Recently I had to write code for sending document data to Dynamics AX. Since we use Business Connectivity Services to send the data to Dynamics AX, and we are still using farm solutions, I had to write a Custom Action.
Selecting multiple documents and handling the data in two for loops is painful: class-level variables that are shared between functions get overwritten while your process is still running. Let's give a code example:

var ctx;
var web;
var list;
var listOrg;
var item;
var listItem;
var props;
var xmlDoc;

function startWork() {
    ctx = SP.ClientContext.get_current();
    var items = SP.ListOperation.Selection.getSelectedItems(ctx);
    if (items.length >= 1) {
        for (var idx in items) {
            var listId = SP.ListOperation.Selection.getSelectedList();
            listOrg = ctx.get_web().get_lists().getById(listId);

            web = ctx.get_web();

            list = web.get_lists().getByTitle('AX Documents');

            var camlQuery = new SP.CamlQuery();
            camlQuery.set_viewXml('<View><RowLimit>100</RowLimit></View>');
            listItem = list.getItems(camlQuery);
            item = listOrg.getItemById(items[idx].id);
            props = web.get_allProperties();

            ctx.load(web);
            ctx.load(listOrg);
            ctx.load(props);
            ctx.load(listItem);
            ctx.load(item, 'EncodedAbsUrl', 'AX_Nummer', 'AX_Nummer2', 'AX_Nummer3', 'AX_Nummer4', 'AX_Nummer5', 'LookupSoort', 'LookupSoort2', 'LookupSoort3', 'LookupSoort4', 'LookupSoort5');

            ctx.executeQueryAsync(Function.createDelegate(this, onQuerySucceeded), Function.createDelegate(this, onQueryFailed));
        }
    }
}

function onQuerySucceeded() {

    var myProps = props;
    var myPropValues = myProps.get_fieldValues();
    var myValue = myPropValues['Sil.AA.DMS.Common.Configurations'];
    var xmlDoc = $.parseXML(myValue);

    if (xmlDoc) {
        var areaId = $(xmlDoc).find('Configuration').find('AreaId').text();

        for (var j = 0; j <= 4; j++) {
            if (j === 0) {
                var lookupField = item.get_item('LookupSoort');
                var lookupValue = lookupField.get_lookupValue();
                var updateItem = listItem.itemAt(0);
                updateItem.set_item('DocumentIdentificationType', lookupValue);
                updateItem.set_item('DocumentIdentification1', item.get_item('AX_Nummer'));
                updateItem.set_item('AreaId', areaId);
                updateItem.set_item('DocumentUrl', item.get_item('EncodedAbsUrl'));
                updateItem.update();

                ctx.executeQueryAsync(onUploadSucceeded, onQueryFailed);
            }
            else {
                var lookupFieldName = 'LookupSoort' + (j + 1);
                var lookupField = item.get_item(lookupFieldName);

                if (lookupField !== null) {
                    var updateItem = listItem.itemAt(0);
                    var lookupValue = lookupField.get_lookupValue();
                    updateItem.set_item('DocumentIdentificationType', lookupValue);
                    updateItem.set_item('DocumentIdentification1', item.get_item('AX_Nummer' + (j + 1)));
                    updateItem.set_item('AreaId', areaId);
                    updateItem.set_item('DocumentUrl', item.get_item('EncodedAbsUrl'));
                    updateItem.update();

                    ctx.executeQueryAsync(onUploadSucceeded, onQueryFailed);
                }
            }
        }
    }
}

function onUploadSucceeded() {
    alert('File has been processed');
}

function onQueryFailed(sender, args) {
    alert('File not processed: ' + args.get_message());
}

Let's say I run the above code with two selected items. It sees both items, but the processing is asynchronous: the first executeQueryAsync call is still in flight while the for loop keeps running at the same time. I have these two items:

Field          Picture 1                               Picture 2
FileLeafRef    http://sp2013/Documents/picture1.png    http://sp2013/Documents/picture2.png
AX_Number      VN12345                                 VN67890
LookupSoort    Sales                                   Sales
AX_Number2     VN12346                                 VN67891
LookupSoort2   Sales                                   Sales
AX_Number3     VN12347                                 Null
LookupSoort3   Sales                                   Null
AX_Number4     Null                                    Null
LookupSoort4   Null                                    Null
AX_Number5     Null                                    Null
LookupSoort5   Null                                    Null

The result is that the second item gets processed five times, always with the properties of the second file: the for loop writes the second file's values over the shared variables before the callbacks for the first file have run, because the loop finishes faster than the requests, and onQuerySucceeded only fires once the loop is already done. So it breaks every piece of code in the chain.
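The failure mode is the classic combination of shared state and asynchronous callbacks, and it is easy to reproduce outside SharePoint. A minimal sketch in plain JavaScript, with setTimeout standing in for executeQueryAsync (the names here are illustrative only):

var current; // shared state, like the ctx, item and props variables above

function startWork() {
    var names = ['picture1.png', 'picture2.png'];
    for (var idx in names) {
        current = names[idx]; // overwritten on every pass of the loop
        setTimeout(function onQuerySucceeded() {
            // both callbacks fire after the loop has finished, so both see 'picture2.png'
            alert('processing ' + current);
        }, 0);
    }
}

startWork();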
I talked with several people, experts in JavaScript, and even asked on StackOverflow, but I never got a decent answer on how to fix it properly. Every suggestion still ended with a broken chain. I tried promises and deferreds in different ways, and it was still not as it should be. I was getting headaches over it, since none of the answers sufficed.
Normally I am absolutely not a morning person; my breakthroughs usually come at night when everybody is asleep. But two weeks ago, very early in the morning, I woke up and started to code just before driving to the office, and I found an answer I could live with.
I shuffled and refactored all my Notepad++ tabs into one single tab, and there it was: a solution that was clear to read. Let's go through the promise and deferred part. The steps are marked in the comments from Step 1 through Step 6, so the code is easy to follow.

var ctx;

function startWork() {
    ctx = SP.ClientContext.get_current();
    var items = SP.ListOperation.Selection.getSelectedItems(ctx);
    if (items.length >= 1) {
        for (var idx in items) {
            fixLinkInAxapta(items[idx].id).then( // Step 1: instead of calling executeQueryAsync directly, we go into the function below and prepare the Promise
                function (listItem, item, props) { // Step 4: runs once the data for this item has been loaded, with its own listItem, item and props
                    var myProps = props;
                    var myPropValues = myProps.get_fieldValues();
                    var myValue = myPropValues['Sil.AA.DMS.Common.Configurations'];
                    var xmlDoc = $.parseXML(myValue);
                    if (xmlDoc) {
                        var areaId = $(xmlDoc).find('Configuration').find('AreaId').text();
                        for (var j = 0; j <= 4; j++) {
                            if (j === 0) {
                                var lookupField = item.get_item('LookupSoort');
                                var lookupValue = lookupField.get_lookupValue();
                                var updateItem = listItem.itemAt(0);
                                updateItem.set_item('DocumentIdentificationType', lookupValue);
                                updateItem.set_item('DocumentIdentification1', item.get_item('AX_Nummer'));
                                updateItem.set_item('AreaId', areaId);
                                updateItem.set_item('DocumentUrl', item.get_item('EncodedAbsUrl'));
                                updateItem.update();

                                ctx.executeQueryAsync(onUploadSucceeded, onQueryFailed);
                            }
                            else {
                                var lookupFieldName = 'LookupSoort' + (j + 1);
                                var lookupField = item.get_item(lookupFieldName);
                                if (lookupField !== null) {
                                    var updateItem = listItem.itemAt(0);
                                    var lookupValue = lookupField.get_lookupValue();
                                    updateItem.set_item('DocumentIdentificationType', lookupValue);
                                    updateItem.set_item('DocumentIdentification1', item.get_item('AX_Nummer' + (j + 1)));
                                    updateItem.set_item('AreaId', areaId);
                                    updateItem.set_item('DocumentUrl', item.get_item('EncodedAbsUrl'));
                                    updateItem.update();

                                    ctx.executeQueryAsync(onUploadSucceeded, onQueryFailed);
                                }
                                else {
                                    break;
                                }
                            }
                        }
                        // Step 5: all lookup fields for this item have been handled; the Deferred from Step 3 was already resolved, so the work for this item ends here
                    }
                },
                function (sender, args) {
                    // the load in fixLinkInAxapta failed for this item; the rejection from Step 3 lands here
                });
        }
    }

    alert('All items have been processed to Dynamics AX'); // note: this fires once all requests have been queued, not once they have all completed
}
function fixLinkInAxapta(id) {
    var dfd = $.Deferred(); // Step 2: set up the Deferred; every call gets its own
    var listId = SP.ListOperation.Selection.getSelectedList();
    var listOrg = ctx.get_web().get_lists().getById(listId);
    var item;
    var props;
    var list;
    var listItem;
    var web;

    web = ctx.get_web();

    list = web.get_lists().getByTitle('AX Documents');

    var camlQuery = new SP.CamlQuery();
    camlQuery.set_viewXml('<View><RowLimit>1</RowLimit></View>');
    listItem = list.getItems(camlQuery);
    item = listOrg.getItemById(id);
    props = web.get_allProperties();

    ctx.load(web);
    ctx.load(listOrg);
    ctx.load(props);
    ctx.load(listItem);
    ctx.load(item, 'EncodedAbsUrl', 'AX_Nummer', 'AX_Nummer2', 'AX_Nummer3', 'AX_Nummer4', 'AX_Nummer5', 'LookupSoort', 'LookupSoort2', 'LookupSoort3', 'LookupSoort4', 'LookupSoort5');

    // Step 3: resolve with the listItem, item and properties, which calls function(listItem, item, props) in the previous method. If an error occurs, reject instead.
    ctx.executeQueryAsync(
        Function.createDelegate(this, function () { dfd.resolve(listItem, item, props); }),
        Function.createDelegate(this, function (sender, args) { dfd.reject(sender, args); }));

    return dfd.promise(); // Step 6: return the Promise to the for loop; the then() callbacks run once executeQueryAsync resolves or rejects it
}
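The key to the fix is that every call to fixLinkInAxapta gets its own local listItem, item, props and Deferred, so nothing is overwritten while a request is still in flight. The same wrapper shape can be reused for any executeQueryAsync call. A minimal generic sketch (my own illustration; the helper name loadAsync is not part of the solution above):

// Generic helper: load any number of client objects and hand back a jQuery promise.
function loadAsync(ctx, objectsToLoad) {
    var dfd = $.Deferred(); // one Deferred per call, so parallel calls cannot interfere
    objectsToLoad.forEach(function (obj) { ctx.load(obj); });
    ctx.executeQueryAsync(
        function () { dfd.resolve(objectsToLoad); },
        function (sender, args) { dfd.reject(args); });
    return dfd.promise();
}

// Usage: each call owns its own variables, so loops over many items stay safe.
var ctx = SP.ClientContext.get_current();
var web = ctx.get_web();
loadAsync(ctx, [web]).then(
    function () { alert('Web title: ' + web.get_title()); },
    function (args) { alert('Load failed: ' + args.get_message()); });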

Of course, if you have any questions, don't hesitate to comment on this blog post and I will get back to you as soon as possible. More blog posts will follow soon.


SharePoint 2010: (pilot, PoC) sites, webs, content DBs, and what you should(n't) do

by André Krijnen on Oct.28, 2011, under SharePoint 2010, SharePoint Foundation

Before I begin this post, I want to thank everyone who has already thought this through and shared it at all kinds of conferences worldwide.

First of all, these are my own comments on how to work with sites, webs, content databases, etc. It is my own opinion, based on what I've learned over the last few years and on my own experience combined with that of others.

So, let's start with the example that prompted this blog:

A mid-size company started off with a simple pilot of SharePoint 2010. The pilot was meant to discover how SharePoint works and whether it fits the organization. Before we started, we agreed to start from scratch once the company decided to go further with SharePoint. After half a year the company did decide to go further, but the pilot environment had already been promoted to production before we could start over.

All right, I thought: as long as everything is fine and we have no issues with the environment, we can work with that, because both the physical and the logical architecture were designed to be future proof. Multiple content databases, multiple site collections, etc. The only thing we had to do was give some servers more memory. That's all.

I had installed, configured and designed the infrastructure, so I knew what kind of issues can occur when a pilot environment is promoted to production. I wasn't involved when others designed the intranet with sites, templates, etc. for the pilot. I had made some critical decisions in the infrastructure, but maybe I failed to communicate them clearly. My bad.

In February I was asked to take the environment further, and we attached Reporting Services, data warehousing, etc. No issues there.

But now that we keep growing larger and larger, I checked the databases. Everything looked good, because all databases were in use. Then I saw one big problem: nobody had used the site collections I created at the start; they had simply created sub-SPWebs. Ouch, here we go.

So now we have to migrate 25 GB of subwebs to one of the site collections I created at the start. It takes days to do this well: everything built in those early stages is already critical to the organization, so you can't make mistakes, and everything has to work exactly as before once you have migrated. Converting a web to an SPSite is one hell of a task.

I tested the migration over and over again, used the information provided by Gary Lapointe, and went on from there. I came up with more than 257 errors. Okay, that isn't much by itself, but it tells you nothing about the errors you'll get functionally. Yes, functionally!

Every site, list, workflow and CQWP has to be tested, and I mean really tested, to confirm it works as before. I can tell you: I had to replace every CQWP, some lists got corrupted, content types were missing, list pages were gone, and pages did not behave as they should.

So before you decide to start a pilot, think of it as a production site; it will help you and others a lot. SharePoint 2010 is scalable; make sure your sites are scalable as well. Think scalable, act scalable.

Do's

  • Think as production!
  • Start with multiple content databases
  • Start with multiple site collections (for example: departments, teams, HR, sales, etc.)
  • If you think an SPWeb can grow large (over 50 GB), use a site collection
  • Start with a decent infrastructure (production grade)
  • Ensure that the people working with SharePoint use these site collections instead of creating all kinds of subwebs

Don'ts

  • Treat it as just a pilot or PoC environment
  • Assume you can start all over again when the pilot or PoC is over
  • Assume you can handle a full-blown environment with a single site collection and a single content database

Restore Site SharePoint 2010 issues with site collection owners

by André Krijnen on Sep.27, 2011, under SharePoint 2010, SharePoint Foundation

We had multiple issues with site collection owners, and we had to fix them quickly. After a restore we could not change the primary and secondary site collection administrators, so it was hard to tell where the bug was. But it wasn't a bug.

Even after a fresh install of the SharePoint farm we still had the problem, so we had to dig into it. After a long search it turned out that the UserAccountDirectoryPath was the issue. When the User Account Directory Path is filled in the [dbo].[AllSites] table of your content database, you can do whatever you want, but you can't change the owners.

So we had to remove the UserAccountDirectoryPath: Set-SPSite -Identity "http://site" -UserAccountDirectoryPath ""

And voilà, it was fixed. The two quotes (an empty string) remove the UserAccountDirectoryPath. You always have to do this when you restore a site collection to another farm.
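If you want to verify the value before and after clearing it, something like Get-SPSite "http://site" | Select-Object Url, UserAccountDirectoryPath should show it; I'm assuming here that your SharePoint 2010 build exposes the UserAccountDirectoryPath property on the site object, so check it on your own farm first.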


Reporting Services 2008 R2 Service Pack 1 with SharePoint 2010 Integration

by André Krijnen on Aug.14, 2011, under Reporting Services, SharePoint 2010, SharePoint Foundation

Today was the day I could install Service Pack 1 for Reporting Services 2008 R2 at a customer running SharePoint 2010 with Reporting Services 2008 R2 integrated. Before Service Pack 1 we had a lot of performance issues with Reporting Services: it simply didn't perform, even though the hardware was more than sufficient. Sometimes it took more than 30 seconds to load a simple report.

Today I installed Service Pack 1 for Reporting Services, and it didn't take a lot of time to install, even though the environment is virtualized.

I installed the following update:

• SQL Server 2008 R2 Service Pack 1

I also installed the following features from the SQL Server 2008 R2 Service Pack 1 Feature Pack:

• SQLSERVER2008_ASADOMD10.msi
• SQLSERVER2008_ASAMO10.msi
• SQLSERVER2008_ASOLEDB10.msi
• SqlCmdLnUtils.msi
• SharedManagementObjects.msi
• PowerShellTools.msi

All were installed on the server that runs SharePoint 2010 and Reporting Services 2008 R2.

After the reboot I checked whether everything had installed correctly, without errors. It had: SharePoint Server ran, and I started some reports. Since the reports still took the same time to run as before, I took the time to install some more components from the Feature Pack.

I installed the SQL Server 2008 Native Client on each server running SharePoint; the SharePoint prerequisite installer also installs the SQL Server 2008 Native Client. When that was done, I rebooted the servers that received these updates.

When the SharePoint servers came back up, I ran rsSharePoint.msi from the Feature Pack on every server in the SharePoint farm.

Run the installer like this: msiexec /i rsSharePoint.msi skipca=1 (the skipca=1 switch skips the installer's custom actions)

It checks whether the add-in is already installed and asks if you want to update the package. Of course: yes.

It takes about 30 seconds to install, and it is done.

Then I started SharePoint and checked the reports I had run before, and the performance had increased more than I could have imagined. Even SharePoint itself reacted snappier than before. It looks like Microsoft made some major performance improvements in the Native Client and the Reporting Services Add-in for SharePoint.

My advice: install it!


SQL Server 2008 R2 Service Pack 1 is released (SharePoint info)

by André Krijnen on Jul.30, 2011, under maintenance, Reporting Services, SharePoint 2010, SharePoint Foundation, sql server, Update

So, after some time, Microsoft has released the first Service Pack for SQL Server 2008 R2. The package contains a lot of performance improvements for Analysis Services.

You can download SP1 here: http://www.microsoft.com/download/en/details.aspx?id=20302

If you are running SharePoint Server 2010 with Reporting Services integrated, you should update your Reporting Services to the latest Service Pack as well. An important note: you should also download the SQL Server 2008 R2 SP1 Feature Pack to pick up some further improvements.

You can download the SP1 Feature Pack here: http://www.microsoft.com/download/en/details.aspx?id=26728

Once you have downloaded the Feature Pack, you should also install the following components on all SharePoint servers:

• rsSharePoint.msi
• sqlncli.msi
• SQLSERVER2008_ASADOMD10.msi

The first is the SQL Server 2008 R2 Reporting Services Add-in for SharePoint Server 2010.

The second is the SQL Server 2008 Native Client, which is also installed as part of the SharePoint Server 2010 prerequisites.

The third is also a component used by the SharePoint Server 2010 prerequisite installer.

There was an error in the callback content and structure

by André Krijnen on Jun.16, 2011, under iis, Internet Information Services, SharePoint 2010, SharePoint Foundation

Today I ran into this message in SiteManager.aspx of SharePoint Server 2010: there was an error in the callback content and structure.

So I checked the search engines and found some answers to this problem, but you know what? Restarting IIS on your web front end does not make the message go away. Odd. How could that be? Well, it turned out that the page does a callback to another server, in my case one of the other SharePoint servers in the farm. After resetting IIS on that server, the message disappeared and everything worked fine again.


PowerShell: Update all Document Libraries with MajorVersionLimit and MajorWithMinorVersionsLimit

by André Krijnen on Jun.15, 2011, under blog, Powershell, SharePoint 2010

I had to write a PowerShell script that runs through all webs of a site collection and enables versioning, major as well as minor versions, with of course a limit on both.

The next script should do the trick:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$siteURL = $args[0]
$site = Get-SPSite $siteURL
foreach ($web in $site.AllWebs) {
    Write-Host "Inspecting " $web.Title
    foreach ($list in $web.Lists) {
        if ($list.BaseType -eq "DocumentLibrary") {
            # Show the current settings before changing them
            Write-Host "Versioning enabled: " $list.EnableVersioning
            Write-Host "MinorVersioning enabled: " $list.EnableMinorVersions
            Write-Host "EnableModeration: " $list.EnableModeration
            Write-Host "Major versions: " $list.MajorVersionLimit
            Write-Host "Minor versions: " $list.MajorWithMinorVersionsLimit
            # Enable versioning and apply the limits
            $list.EnableVersioning = $true
            $list.EnableMinorVersions = $true
            $list.MajorVersionLimit = 2
            $list.MajorWithMinorVersionsLimit = 5
            $list.Update()
            Write-Host $list.Title " is updated with MajorVersionLimit 2 and MajorWithMinorVersionsLimit 5"
        }
    }
}
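To run it, save the script (for example as EnableVersioning.ps1; the file name is my own choice) and pass the site collection URL as the first argument: .\EnableVersioning.ps1 http://sp2010. The limits of 2 major and 5 minor versions are hard-coded, so adjust them to your own policy first.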

Kerberos, Reporting Services and SharePoint Integrated HTTP 401: Unauthorized

by André Krijnen on Apr.06, 2011, under Active directory, Kerberos, Reporting Services, SharePoint 2010

When it comes to Reporting Services integrated with SharePoint, it is difficult to solve problems if you don't know where to start. A lot of people run into trouble when troubleshooting, or when configuring Kerberos so that Windows integrated security works properly.

At work I've been at different customers, and I still hit problems when it comes to Kerberos. Why? Because every environment is different, every server is different, and with Reporting Services integrated with SharePoint it is sometimes hell to fix issues.

I've done multiple integrations of Reporting Services and SharePoint, and by now I know a lot about Kerberos: setting up delegation between app pools and SSRS, SSRS and SSAS, SSAS and MSSQL, SSRS and MSSQL.

Last Monday I ran into a problem with SharePoint and Reporting Services. Why did I run into it? Simple: not every environment is configured properly when it comes to DNS, AD, etc.

I used Fiddler, DelegConfig v1, DelegConfig v2 Beta and Process Monitor, but I still couldn't figure it out. Even with HTTP streaming and the like I couldn't see any useful information. The only thing I could see was that every attempt logged on successfully.

Every time, the integration gave the following error: The request failed with HTTP status 401: Unauthorized

Probably everyone who has configured SSRS with SharePoint has seen this error at some point, right? If you hit Google or Bing for it, the results always point at the Reporting Services Add-in for SharePoint. Yes, that is the one everyone talks about.

Well, I used all the tooling a SharePoint guy needs to know. But I never saw a request arrive at the server running SSRS, and you know why? Because somebody forgot to add http:///ReportServer to the Intranet zone. Once I added the server to the Intranet zone, the problem was solved. Do not add it to Trusted Sites, because that won't do anything.


SharePoint Timer Job deploying on Farm with wrong SPServer

by André Krijnen on Mar.24, 2011, under SharePoint 2010

It all started with Reporting Services not running workflows when integrated, while I still had some workflow activities to execute. So I thought of using a simple timer job that runs every hour or so to check whether data has changed. That worked out very well, but when it was deployed on a farm with multiple servers I got a problem. It started to send e-mails, and not just one per hour, but one for every content database it ran through. (I figured this out thanks to a post on the web by Robin of Sevenseas.)

So I thought, all right, I should be using SPJobLockType.Job with a server specified. I did so, but with the wrong server name. It couldn't be worse: the timer job deployed successfully, and I started to debug. I didn't notice the wrong server name at first, but I had put the server name in the title of the timer job to check whether the server was specified. Easy trick, easily done.

What I saw was that it had added a totally wrong server name: a server that wasn't in the farm, that wasn't even in the domain. Still no error messages, and the timer job ran like a charm. Yes, you can fool OWSTimer with a wrong SPServer specified, because it doesn't validate it.


Service principal names, Kerberos, IIS 7.0 and error 401: The requested resource requires user authentication

by André Krijnen on Aug.07, 2010, under Internet Information Services, SharePoint 2010, Software, Windows Server 2008

The last couple of days I was working at a customer where Kerberos was needed for SharePoint 2010. Of course I started by setting the different service principal names for my app pool accounts, farm accounts, machines, etc. Not too hard to do, but every time I ran into a 401 error: The requested resource requires user authentication.
Strange, I thought, but I sent the domain administrator more commands, and it didn't help. So I checked everything, checked for duplicates, etc. Still I ran into these errors.

After some searching I found out that IIS 7.0 has some known problems with Kerberos, and that I needed to edit applicationHost.config to solve them: enabling kernel-mode authentication, and so on. But it didn't make any difference. I rebooted several times, turned kernel-mode authentication off again, and reverted the changes to applicationHost.config.

Then I remembered that we had created CNAME records, and that gave me the wonderful idea to change the CNAME records to A records. The changes were applied and, faster than expected, I could open IE and the different web apps; pages came back from my web app in less than a second. When you run into these problems, change your CNAME record to an A record and it will fix your problems with SPNs, Kerberos and IIS 7.0.

