The Microsoft MVP Award Program Blog

Join us in congratulating MVP awardees!


We’re kicking off 2016 with 1,074 new and renewing Microsoft Most Valuable Professionals–MVPs!

These are exceptional community leaders who have demonstrated their deep commitment to helping others make the most of their technology, voluntarily sharing their passion and real-world knowledge of Microsoft products with the community. If you’re involved with technical communities, the chances are good that one of them has helped you. Please join us in saying—congratulations!

This is the first quarter that incoming MVPs have been awarded in Microsoft’s new categories, designed to recognize the way today’s technologists often design, build and innovate across multiple platforms, devices and technologies. (You can find out more about the new MVP Award categories here.)

This is also the first quarter that MVPs have had the opportunity to be recognized for their contributions across multiple award categories, and 31 MVPs have achieved dual award recognitions! These range from awards in Visual Studio and Development Technologies as well as Windows Development to awards in Enterprise Mobility and Windows and Devices for IT.

Each year, Microsoft honors around 4,000 MVPs. They are nominated by Microsoft, other community individuals, or in some cases themselves. Candidates are rigorously evaluated for their technical expertise, community leadership, and voluntary community contributions for the previous year. They reflect Microsoft’s global community, today coming from more than 90 countries and speaking more than 40 different languages—reaching around a million Microsoft customers each day!

MVPs are recognized each quarter for this annual award, which continues to grow and evolve to reflect the development of Microsoft technologies. Congratulations to the new MVPs, and welcome back to renewed MVPs. We are very excited to recognize your amazing accomplishments!

And if you know (or are!) an awesome community leader, go here to make a nomination.


Freeing your Azure data with F# Type Providers


Editor’s note: The following post was written by Visual Studio and Development Technologies MVP Isaac Abraham as part of our Technical Tuesday series with support from his technical editor, Visual Studio and Development Technologies MVP Steffen Forkmann.

F# is a mature, open source, cross-platform, functional-first programming language. It empowers users and organizations to tackle complex computing problems with simple, maintainable and robust code. In this post, I want to discuss how we can use F# to reduce the friction and barrier to entry when dealing with cloud storage in .NET, compared to the conventional mechanisms that you might be used to.

F#, Type Providers and the Cloud

One of the features that I love showing people new to F# is Type Providers, not only because they are fantastically powerful, but also because they're just plain awesome to demo! An F# type provider is a component that provides types, properties, and methods for use in your program without you needing to manually author and maintain those types. As we start to deal with more and more of these disparate, and distant, data sources, it's crucial that we make accessing such systems as painless as possible. Azure Storage is one such system. It's cheap, readily available and quickly scalable. Blobs and Tables are two elements of Azure Storage that we'll review in this article.

Working with Blobs

When working with blobs in .NET, we normally use the .NET Azure SDK, which allows us to interrogate our Storage assets relatively easily. Here's a C# snippet that shows how we might interrogate a container and a blob that has a well-known path.
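
The following is a representative sketch of that pattern using the classic Microsoft.WindowsAzure.Storage SDK; the connection string, container name and blob path are placeholders rather than anything from a real account:

```csharp
using Microsoft.WindowsAzure.Storage;

var connectionString = "<your storage connection string>";
var account = CloudStorageAccount.Parse(connectionString);
var client = account.CreateCloudBlobClient();

// Both of these names are "magic strings" - nothing checks them until runtime.
var container = client.GetContainerReference("samples");
var blob = container.GetBlockBlobReference("folder/file.txt");
var contents = blob.DownloadText();
```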


Of course, we’re having to use magic strings here. There’s no compile-time safety to ensure that the container or blob actually exists. Indeed, we can’t actually validate this until we run our application and reach this section of code, unless we resort to unit tests or perhaps copy and paste our code into LINQPad or similar.

The F# Azure Storage Type Provider solves all these problems in one go, by generating a strongly-typed object model at edit and compile time that matches the contents of your blob storage. Here's how we would achieve the same thing as above in F#.
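
A minimal sketch, assuming the FSharp.Azure.StorageTypeProvider NuGet package; the container and blob names shown are placeholders, and member names may differ slightly between package versions:

```fsharp
// In a script, reference the provider assembly first (via #r, NuGet or Paket).
open FSharp.Azure.StorageTypeProvider

// Substitute a real connection string - the provider reads the account at edit time.
type Azure = AzureTypeProvider<"DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...">

let contents = Azure.Containers.samples.``file.txt``.Read()
```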


In two lines we can achieve the same thing in a completely strongly-typed manner. You won't need to write a console test runner either – you can simply open an F# script file and start exploring your storage assets. We can't mistype the name of a container because we get full IntelliSense as we "dot into" each level of blobs.


And because this is statically typed, and checked at compile time, if the blob is removed from your container, your code will not even compile. Of course, if you do need to fall back to weak typing, for example for dynamically generated blobs, you can easily drop down to the standard SDK directly from within the Type Provider via the AsCloudBlobContainer() method.

Working with large data sets

In the example above, we're downloading the entire contents of the blob to our application. When working with large files in blob storage, this might be a problem, so the type provider allows us to treat text files as streams of lines.
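
Something along these lines – again only a sketch, where ReadLines() is an assumed member name and the blob name is a placeholder:

```fsharp
// Lazily stream a large text blob and stop once we have the first 10 lines containing "alice".
let aliceLines =
    Azure.Containers.samples.``large-file.txt``.ReadLines()
    |> Seq.filter (fun line -> line.Contains "alice")
    |> Seq.take 10
    |> Seq.toList
```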


Here we’re lazily streaming a potentially large file, and reading just up until we find the first 10 lines that contain the word “alice” – we don’t have to download the entire file, and we are using standard sequence query functionality such as filter and take (you can think of Seq as equivalent to LINQ’s IEnumerable extension methods).

Working with Tables

Tables are another Storage component that is simple and relatively easy to reason about. Cheap and lightweight, Table storage is a good way to start storing and querying tabular data in Azure. The trade-off is that it offers relatively few computational features – for example, Tables do not allow relationships or aggregations. Let's look at how we might query a simple table whose columns include fields such as Team (a string) and Cost (a number).

The need for stronger typing

If we wish to query this using the standard SDK, we’ll need to manually create a POCO that implements ITableEntity, or inherits from TableEntity, and have properties that match the table type (again, we’ll only know if this is correct at runtime). Then, we need to create a query. The Azure SDK is somewhat inconsistent here in that you can create queries in several ways, and none of them are particularly satisfactory.

Firstly, we can use the weakly-typed TableQuery builder class to manually build the Azure Table query string – this offers us little or no compile-time safety whatsoever. Alternatively, we can use the TableQuery<T> query builder. Unfortunately, this API is somewhat awkward to use in that you can create it in two different ways – and depending on how you construct it, certain methods on the class must not be called. Failing to adhere to this will lead to runtime exceptions.
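
As an illustration, here is a sketch of the filter-string flavour; EmployeeEntity is a hypothetical hand-written POCO inheriting from TableEntity, and nothing verifies that "Team" is a real column:

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// "account" is the CloudStorageAccount from the earlier snippet; "employee" is a placeholder table name.
var table = account.CreateCloudTableClient().GetTableReference("employee");

var filter = TableQuery.GenerateFilterCondition("Team", QueryComparisons.Equal, "Quality");
var query = new TableQuery<EmployeeEntity>().Where(filter);
var results = table.ExecuteQuery(query);
```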


There's also an IQueryable implementation for Tables. This suffers from the fact that you can write compile-time safe queries that will still fail at runtime, because Azure Tables offer an extremely limited query feature set; it's extremely easy to write a query that compiles but results in an exception at runtime.
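
For example, something like the following sketch compiles cleanly, yet the Table service cannot translate string functions such as ToLower(), so it throws when executed (Name is again a hypothetical column):

```csharp
// Requires System.Linq; "table" is the same CloudTable as in the previous snippet.
var risky = table.CreateQuery<EmployeeEntity>()
                 .Where(e => e.Name.ToLower() == "isaac")
                 .ToList();
```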


Smarter and quicker Tables with F#

Again, it's the F# Type Provider to the rescue. Firstly, we don't need to worry about the hassle of navigating to a specific table, nor about manually building a POCO to handle the incoming data – the Type Provider will create all this for you based on the schema that is inferred from the EDM metadata on the table, so we can immediately get up and running with any table we already have.
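
A sketch of what that looks like, assuming a table called employee already exists in the account (member names may vary slightly between provider versions):

```fsharp
// The provider infers the row type (PartitionKey, RowKey plus the EDM columns) for us.
let employees = Azure.Tables.employee.Query().Execute()

for employee in employees do
    printfn "%s is in team %s and costs %.2f" employee.RowKey employee.Team employee.Cost
```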


Running this in an F# script echoes the results straight back to the F# REPL (F# Interactive) within Visual Studio.

We also have access to a strongly typed Azure Table query DSL that is statically generated based on the schema of the table. This is guaranteed to only generate queries that are supported by the Azure Table runtime, yet it also gives us IQueryable-like flexibility.
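
A sketch of the generated DSL – the ``Where Cost Is`` and ``Where Team Is`` members exist only because the table has Cost and Team columns, and the exact operator names may differ between provider versions:

```fsharp
let expensiveQualityWork =
    Azure.Tables.employee.Query()
        .``Where Cost Is``.``Greater Than``(500.)
        .``Where Team Is``.``Equal To``("Quality")
        .Execute()
```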


Notice that the query methods for each field are typed for that field – Cost takes floats, Team takes strings and so on – so there's no chance of supplying data of the incorrect type at runtime.

Conclusion

Using the Storage Type Provider allows you to point at any Azure Storage account that you might already have and start working with your data in less than a minute, and it changes the way you interact with your Azure Storage assets.

Download the Azure Storage Type Provider via NuGet, create an F# script file, provide your connection string, and then just use Visual Studio (or Visual Studio Code) to immediately start navigating through your storage assets.

There's no need to leave your IDE for an external tool – you can continue with your standard workflow, using an F# script to explore your data. When you're happy with your code, you can easily move it into a full F# assembly which can be called from C# as part of your existing solution.

More than just using a REPL and a script though, the combination of F# and the Storage Type Provider gives us an unparalleled experience through a stronger type system that lets us be more productive and confident when working with Azure cloud assets.


 

About the author

Isaac is an F# MVP and a .NET developer since .NET 1.0 with an interest in cloud computing and distributed data problems. He nowadays lives in both the UK and Germany, and is the director of Compositional IT. He specializes in consultancy, training and development, helping customers adopt high-quality, functional-first solutions on the .NET platform. You can follow his blog here.

 

Here’s your #FridayFives!

APGC Community Leaders Forge New Bonds in Japan


Last month MVPs (Microsoft Most Valuable Professionals) and RDs (Regional Directors) from Japan, Korea, Taiwan, and Hong Kong came together with Microsoft teams at the Microsoft Japan Tokyo Office for the first “Asia MVP/RD Meetup.” The event attracted 171 MVPs and three RDs, all recognized for their passion and support of Microsoft technologies. It featured a dozen technical sessions on topics including Windows, Microsoft Azure, and Office 365, delivered by 21 speakers—including MVPs and RDs.

 

Throughout the two-day event, participants worked together to solve challenges, share ideas and best practices, and forge important new community relationships. In the end, 92% said they would join in this event next time!

Securing Windows 10 with BitLocker Drive Encryption


Editor’s note: The following post was written by Office Servers and Services MVP Zubair Alexander as part of our Technical Tuesday series.

Windows 10 includes several security features. Perhaps one of the most important is BitLocker Drive Encryption, which provides data protection in case of a lost or stolen device. It also provides security for decommissioned computers. BitLocker has been around for several years and can be used with the Windows Vista, Windows 7 and Windows 8/8.1 operating systems. However, the focus of this article is on securing Windows 10 with BitLocker.

BitLocker Drive Encryption is built into the Windows 10 operating system and uses the Advanced Encryption Standard (AES) with a key length of either 128-bit (the default) or 256-bit (configurable using Group Policy). The idea behind BitLocker Drive Encryption is that once you secure your drive, only you, or someone who has your password and recovery key, will be able to get to your data. Although this kind of protection provides enhanced security on mobile devices, such as laptops, there is no reason why you shouldn’t take advantage of BitLocker encryption on your desktop computers.

NOTE: BitLocker is not available on Windows 10 Home edition. It’s available on Windows 10 Pro, Enterprise and Education editions.

Unlike Encrypted File System (EFS) in previous Windows operating systems, BitLocker Drive Encryption encrypts your entire drive. This is much better than encrypting certain files or folders not only because of its ease but also because it offers a much higher level of security. By encrypting the entire drive, you can rest assured that all the files on your encrypted partition are protected.

In this article, I will share some insights into Windows 10 BitLocker Drive Encryption. I will walk you through step-by-step configuration of BitLocker on Windows 10 and also share some best practices.

BitLocker Requirements

It is important to understand the following BitLocker requirements before you implement BitLocker on your computer. These requirements are fairly minimal and an average user is likely to easily implement them on his/her computer.

Hardware

  • TPM v1.2 Chip – If you have a computer that you purchased in the last few years, chances are that it includes a Trusted Platform Module (TPM) chip. This is common on most laptops these days. To properly secure your Windows computer with BitLocker, Microsoft recommends you use TPM version 1.2 or later. If you are not sure whether your computer has a TPM chip, type tpm.msc in the Windows search box to load the TPM Console. It will show you the TPM if it exists; otherwise you will see the message Compatible TPM cannot be found.
  • Support for USB – Your computer must support booting from a USB flash drive. If you don’t have a TPM chip, you can still use BitLocker Drive Encryption with a USB flash drive. For example, if you have a Windows 10 desktop computer that doesn’t have a TPM chip, you can use the USB flash drive to save the BitLocker startup key. You will insert the flash drive when the computer is started or resumed from hibernation and it will unlock the computer for you.
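
If you prefer the command line, here is a quick sketch of the same checks using the built-in cmdlets (run from an elevated PowerShell prompt):

```powershell
Get-Tpm                              # TpmPresent / TpmReady indicate whether a TPM chip is available
Get-BitLockerVolume -MountPoint "C:" # shows the current BitLocker status of drive C
```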

NOTE: BitLocker doesn’t support Dynamic Disks. Dynamic disks are very rare and if you don’t know what a dynamic disk is, chances are that you have not converted your Basic Disk to a Dynamic Disk.

Software

No Additional Software Required – BitLocker is integrated into the Windows operating system and therefore doesn’t require any additional software.

Partitions

Because pre-startup authentication and system integrity verification must take place on a partition other than the encrypted operating system drive, BitLocker requires that your computer has at least two partitions:

  1. An operating system partition (usually drive C) that is formatted with NTFS.
  2. A system partition that is at least 350MB. This is where Windows stores files needed to load Windows at boot. The system partition must not be encrypted and should also be formatted with NTFS for computers that use BIOS firmware and with FAT32 for computers that use UEFI-based firmware. This drive is often hidden in Windows Explorer. You can configure one or more partitions for your data (drive D, E, etc.) and enable BitLocker on them.

Administrative Access

You must have local Administrative rights to manage BitLocker on the operating system and fixed data drives. Standard users can only manage BitLocker on removable data drives.

Encryption Overhead

Unlike compression, which can cause a lot of disk fragmentation, BitLocker encryption doesn’t have the same impact on your computer. While there is some overhead due to encryption, it’s hardly a show stopper. With the advancement in computer hardware over the years, the central processing unit (CPU), hard drive, memory, and other components work so efficiently that the encryption overhead is minimal (less than 10%) and most people are unlikely to notice it. At my company, we use BitLocker on our desktop workstations and laptops and we haven’t experienced any noticeable performance hit. The way I look at it, even if there is a small price to pay in terms of performance overhead, securing your data with encryption is well worth it.

Step-by-Step Configuration

Here are the step-by-step instructions on how to turn on and configure BitLocker on your Windows 10 computer. If you are running Windows 10 Home edition, you won’t have the option to use BitLocker. (A PowerShell equivalent is sketched after the list.)

  1. Login to your Windows 10 computer.
  2. Right-click the Start button and select File Explorer. This was called Windows Explorer in previous Windows operating systems.
  3. Right-click the hard drive that you want to encrypt, e.g. drive C where the operating system is installed, and select Turn on BitLocker.
  4. If your computer doesn’t have a TPM chip you will see the following message: This device can’t use a Trusted Platform Module. Your administrator must set the “Allow BitLocker without a compatible TPM” option in the “Require additional authentication at startup” policy for OS volumes.
  5. You can modify the local group policy to allow BitLocker to encrypt the operating system drive on this computer even if it doesn’t have a TPM.
  6. In the Windows 10 Search box type gpedit.msc and press Enter to start the Local Group Policy Editor.
  7. Go to Computer Configuration -> Administrative Templates -> Windows Components -> BitLocker Drive Encryption -> Operating System Drives and in the right-hand pane double-click Require additional authentication at startup.
  8. Check the Enabled radio button and make sure that the box Allow BitLocker without a compatible TPM (requires a password or a startup key on a USB flash drive) is checked. Then click OK.
  9. Go back to the File Explorer, right-click drive C and select Turn on BitLocker.
  10. This time it will allow you to turn BitLocker on. You are given a choice to either Insert a USB flash drive or Enter a password. NOTE: If you use a password to unlock your BitLocker-protected operating system drive, you won’t be able to remotely access the computer using Remote Desktop Protocol (RDP) if it is rebooted for some reason, e.g. restarted after a power outage.
  11. If you choose the option to insert a removable USB flash drive, it will save the startup key on the USB flash drive. This will be used to unlock the operating system drive after each reboot. NOTE: Unlike the recovery key, the startup key is not a text file. It has the file extension .BEK.
  12. If you select the option to enter a password, you will enter the password and confirm it. Make sure you use a long, secure pass phrase. This password will be required each time the computer is rebooted.
  13. Next, you will be given several options for storing the recovery key. You can back up the recovery key to one of the following locations. a) Save to your Microsoft Account. b) Save to a USB flash drive. c) Save to a file. d) Print the recovery key.
  14. The easiest thing to do is to use the option Save to a file. The recovery key is a text file that can be opened in Notepad. You can copy the recovery key to a USB flash drive later if there is a need, open the text file and print it, and keep copies in multiple locations for backup purposes if you want. You can even type the key manually into Notepad, save it on a USB flash drive and use it to unlock the computer. There is nothing magical about the recovery key file.
  15. After you have saved the recovery key to a file, click Next.
  16. If you are setting up BitLocker on a new drive, you only need to encrypt the part of the drive that is being used. When you add additional data, BitLocker will automatically encrypt that data. In that case you should select the first radio button Encrypt used disk space only (faster and best for new PCs and drives). However, if you have already been using your computer for a while, select the second option Encrypt entire drive (Slower but best for PCs and drives already in use).
  17. Depending on the size of the hard drive and the amount of data, the encryption process can take a long time so be patient. It’s best to start this process at the end of the day when you are no longer going to use your computer until the next day. Make sure the box Run BitLocker system check is checked and once you are ready, click Continue.
  18. You will be prompted to restart the computer. If you are ready to reboot now, click Restart now or click Restart later if you are not quite ready yet.
  19. After the computer reboots, it will start encrypting the drive. You won’t see any progress bar but if you go to Manage BitLocker in Control Panel (Control Panel\System and Security\BitLocker Drive Encryption) you will see that BitLocker is encrypting the drive.
  20. When the encryption is complete, you can see the status which shows that BitLocker is on for drive C. Some of the management tasks for BitLocker include the ability to suspend protection, back up your recovery key, copy the startup key and turn off BitLocker. You will find more information on suspending protection later in this article.
  21. After enabling BitLocker for your operating system drive, you can turn on BitLocker for other drives, such as drive D. As far as BitLocker is concerned, the order in which you encrypt the drives is not important. You can encrypt the data drive first and then the operating system drive, or vice versa.
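
For reference, here is a rough PowerShell equivalent of the steps above using the built-in BitLocker module (run elevated; the mount point, password and backup path are placeholders you should adjust to your own environment):

```powershell
$pw = Read-Host -AsSecureString -Prompt "BitLocker password"

# Encrypt drive C with a password protector, used space only (the "new PC" option above)
Enable-BitLocker -MountPoint "C:" -PasswordProtector -Password $pw -UsedSpaceOnly

# Add a numerical recovery password and save it somewhere other than the encrypted drive
Add-BitLockerKeyProtector -MountPoint "C:" -RecoveryPasswordProtector
(Get-BitLockerVolume -MountPoint "C:").KeyProtector |
    Where-Object KeyProtectorType -eq "RecoveryPassword" |
    Out-File "D:\C-drive-recovery-key.txt"

# Temporarily pause protection (for example before a BIOS upgrade), then resume it
Suspend-BitLocker -MountPoint "C:" -RebootCount 1
Resume-BitLocker -MountPoint "C:"
```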

When Would I Need the Recovery Key?

There are several reasons why you may need your recovery key. Some, but not all, of the reasons include updating the computer BIOS, losing the flash drive that contains your startup key when you have enabled startup key authentication, changing the original TPM (e.g. installing a new motherboard), moving your BitLocker drive to a new computer, or changing boot configuration settings. If you want to upgrade the computer BIOS, you can temporarily disable (turn off) BitLocker, upgrade the BIOS and then enable (turn on) BitLocker again.

BitLocker Best Practices

When you implement BitLocker, it’s imperative that you follow best practices and take computer security very seriously. For example, using BitLocker to encrypt the drive but a weak password to authenticate to your computer would be a bad idea. Follow these best practices and guidelines to secure your computer that’s configured for BitLocker.

  • If you encrypt your data drive but not the operating system, you need to ensure that your computer is physically secure. In addition, you need to make sure that you are using a strong password and preferably multi-factor authentication.
  • If during BitLocker configuration you save the recovery key on the local computer, make sure you copy the recovery key on a different computer. Otherwise, you may lock yourself out of your computer.
  • Have more than one recovery key for your computer and keep each key in a secure place other than the computer where it was generated.
  • Print out the recovery key on paper and store it in a safe location, like a safe deposit box at your bank. If there is a need, you can simply type the recovery key in Notepad and save it as a text file. That’s your recovery key that can now be used on the computer where it was generated.
  • If you ever regenerate the recovery key, make sure you update all your backups.
  • Name your recovery key file so it is easy to recognize the computer to which it belongs because recovery keys are computer-specific.
  • Do not confuse the startup key with the recovery key. They serve different purposes: the recovery key is a text file, while the startup key is a .BEK file.
  • For computers that are configured to use a startup key on a USB flash drive when they are started, make sure that you have backed up the startup key to a safe location.
  • If you have a laptop with docking station, configure the BIOS to make sure that the hard drive is listed first in the boot order whether the laptop is docked or undocked.
  • Before you upgrade Windows 7/8 to Windows 10, simply suspend BitLocker from the Manage BitLocker page (see steps 19 and 20 above), which won’t decrypt your drive. Upgrade to Windows 10 and then click Resume Protection.
  • Use BitLocker to Go to encrypt removable drives, such as USB flash drives, external hard disks, SD cards, etc. because they are more likely to be lost or stolen than the fixed drives.

BitLocker in Business Environment

In a domain environment, Active Directory Domain Services (AD DS) can be used to centrally manage the BitLocker keys. In addition, you can use Group Policy not only to back up BitLocker and TPM recovery information but also to manage recovery passwords. If you are interested in planning and deploying BitLocker in a business environment, check out the Microsoft TechNet documentation on the topic.

Conclusion

BitLocker Drive Encryption is built into Windows 10 Pro, Enterprise and Education versions. It offers an easy and secure way to protect your confidential data by encrypting your drives. BitLocker has very little performance overhead and you can encrypt not only your fixed drives but also the removable drives, such as USB external hard drives, USB flash drives, SD cards, etc. If you are interested in securing your data on your desktop computer or mobile devices, BitLocker is an excellent option. And best of all, you don’t even need to purchase any extra software or install an add-on.


About the author

Zubair is a Microsoft SharePoint MVP, a Microsoft Certified Trainer, and CEO of SeattlePro Enterprises, an IT training and consulting company in Seattle, WA. He holds more than 25 industry certifications including MCT, MCSE, MCSA, MCDST, MCITP, MCTS, MCP+I, CNA, A+, Network+, Security+, CTT+ and CIW. His experience covers a wide spectrum: trainer, consultant, systems administrator, security architect, network engineer, Web designer, author, technical editor, college instructor and public speaker.

 

Data Loss Prevention (DLP) in SharePoint 2016 and SharePoint Online


Editor’s note: The following post was written by Office Servers and Services MVP Steve Smith as part of our Technical Tuesday series—on bonus Wednesday!

Hi Everyone,

In this article I am going to introduce you to the latest updates in the compliance space around data loss prevention (DLP) in the new SharePoint 2016 public Beta 2 release, http://www.microsoft.com/en-us/download/details.aspx?id=49961 , which is now available to download. The information in this article is also part of my SharePoint 2016 clinic, designed to get people up to speed with SharePoint 2016 Beta 2, and will also form part of my upcoming 5-day SharePoint 2016 Administrator classes in the UK and US. UK classes – http://www.combined-knowledge.com/Courses/Classroom/SharePoint_2016_clinic/index.html , US classes – https://mindsharp.com/course/sharepoint-2016-clinic/

 

I am also going to discuss DLP from a SharePoint Online perspective as well as on-premises; SharePoint Online is part of Office 365, and DLP features for SharePoint Online are rolling out to tenants now. It is important to note, however, that DLP is not new in itself: DLP features have been part of Exchange Server 2013 and Exchange Online for some time, allowing you to build message-driven policies for email. Having DLP policies in SharePoint now allows a business to build a DLP structure across both email and data, which is great news regardless of whether you are on-premises or in Office 365.

 

I have been working with the document and records management features of SharePoint for many years, and the first clarification I want to make is that DLP is not a replacement for those existing processes. In fact DLP is very much a complement to your business’s overall strategy for how to handle compliant and sensitive data within your SharePoint environment. DLP is not replacing your document lifecycle management process, but it is allowing your business to build a policy model to discover and protect data in a way previously not possible out of the box.

 

So what is DLP then, I hear you ask? In a nutshell, it is a method to discover (find) and restrict sensitive data being put into SharePoint that matches specific criteria through defined industry templates, and thus avoid breaches of corporate data leaving the company. Such data could include credit card details or employee national insurance or social security information, and the templates are specific to regional requirements. These 80 templates are the same ones being used by Exchange, and the full list of templates can be found at https://technet.microsoft.com/en-us/library/jj150541(v=exchg.160).aspx . Although the SharePoint Beta 2 bits do not yet include the full set of templates, I am sure that by RTM the full selection will be there.

 

Although the examples I am using in this article are built in SharePoint 2016 Beta 2 on Prem you can follow along in exactly the same way in SharePoint online. The only difference is that in SharePoint online you cannot force content crawls so you may have to wait longer in order for the search results to show up.


Figure 1 – DLP Policy templates

If you expand the policies shown on the website linked above you will notice that each policy has defined criteria that use patterns and confidence levels to match data in the document in order to trigger the DLP policy to take action against it. You will also notice that each template has specific keywords that form part of the detection criteria. The aim here is to flag items that clearly breach the rules of a policy and not flag items that may include certain keywords but have no legal implication. For example, a sales person has a document in SharePoint that outlines to a client that they can pay via credit card. The words “credit card” in this scenario do not warrant the document being locked down by a DLP policy, but someone storing 50 customer credit card details in SharePoint clearly would. As you can see from the credit card template, there is both keyword verification and keyword matching covering card numbers as well as card type, so in order for these templates to be triggered there must be clear matches against the template criteria.

Figure 2 – DLP template criteria for credit cards

Before we start creating some DLP policies I first want to break down the two main options that you have in SharePoint around DLP and that we are going to look at later in this article. The two main elements are:

  • Discovery
  • Policy

An important point to mention here is that both of these options apply to items stored in SharePoint and items stored in OneDrive.

 

Discovery

As a company you may not actually know how many items currently in your organisation’s SharePoint data are in breach of your own compliance regulations. Having the ability to run a DLP query based on specific DLP templates across all your SharePoint data allows you to quickly identify areas that are in need of managed policies and to fix existing breaches. This discovery process relies 100% on search having crawled the items in SharePoint. In SharePoint 2016 Beta 2 there is a new addition to the eDiscovery site template called a DLP Query that allows a user to launch queries based on DLP templates against all your content, or specific content, in your SharePoint environment. One important aspect to this, however, is that the person who is running the query in the eDiscovery Center must have read access to all data in SharePoint. This can be achieved either via a Web Application Policy on-premises, or by adding them as site collection administrators in SharePoint Online or on-premises.

 

Policy

The obvious way to avoid sensitive data being available to others is to put in place a policy that restricts the document itself when it is put into SharePoint. A DLP policy enables the compliancy managers to create these policies and apply them to site collections in your SharePoint environment which can include policy tips, email notifications and blocking of the content once it matches a specific DLP policy template.

 

That’s the terminology dealt with; let’s now get started by testing the Discovery process and using the new DLP Query in the eDiscovery Center. In this example I have built my SharePoint farm with three servers: one server is the domain controller on Windows Server 2012 R2, one is the SQL server using SQL Server 2014, and the third is a SharePoint 2016 Beta 2 server in the Custom role. I have also tested DLP fully in a MinRole farm with nine SharePoint servers, so the same method applies regardless of your SharePoint deployment method. I have also created the User Profile Service Application for creating user My Sites and personal OneDrives, as well as a Search Service Application for crawling data.

 

The first thing you want to do is get a document ready to test that the DLP Query is working correctly. For this example I am going to use a generic credit card list, which you can obtain from http://www.paypalobjects.com/en_US/vhelp/paypalmanager_help/credit_card_numbers.htm – copy the table into your own Word document and save it. You now need to upload the document into a document library; in my test I am going to upload one into a team site and one into my personal OneDrive.


Figure 3 – Document added to team site library

 


Figure 4 – Document added to OneDrive

 

Now that you have added your documents with the credit card data into SharePoint, you need to run an update crawl of the data so that the new documents are in the SharePoint index that a user runs a query against. This is achieved in the Search Service Application for your SharePoint content source. An incremental crawl is fine.


Figure 5 – Running an incremental crawl to update the Index
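
If you prefer PowerShell, here is a sketch of kicking off the same incremental crawl from the SharePoint Management Shell (the content source name below is the default one; yours may differ):

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication
$contentSource = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
$contentSource.StartIncrementalCrawl()
```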

 

Part of the crawl process is to analyse the content through the content processing component, and part of this process includes a new component in SharePoint 2016 called the Classification Operator, which sits alongside other processing components such as the word breakers and document parser. Once processed, the classification results are stored in the index, ready for queries to be run against them.


Once the crawl has finished you can proceed to create a new site collection that uses the eDiscovery site template. This is done via Central Administration or PowerShell; if you are in Office 365 you can create a new site collection via the SharePoint Admin site in your tenant Admin page. When creating your new site collection, ensure that you select eDiscovery Center, which can be found in the Enterprise tab.
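
On-premises, a PowerShell sketch of the same thing (the URL and owner alias are placeholders, and "EDISC#0" is the eDiscovery Center template ID – confirm it on your own farm with Get-SPWebTemplate):

```powershell
New-SPSite -Url "https://intranet.combined.com/sites/ediscovery" `
           -OwnerAlias "COMBINED\administrator" `
           -Template "EDISC#0" `
           -Name "eDiscovery Center"
```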

 

**Note** There is no limit to the number of eDiscovery sites that you can create in your organisation; you simply control access to each one via the site permissions.


Figure 6 – Select the eDiscovery template

 

Ensure that the users who will be managing the eDiscovery sites, and who are going to be running queries against SharePoint data, have read permissions on the site collections whose content they will be searching. Once your site is created, go ahead and launch it, logging in as your chosen user.

On the home page for the site you have the option of creating a new discovery case or just a DLP query. Covering eDiscovery cases is beyond the scope of this article, but a good starting point if you want to know more is https://technet.microsoft.com/en-us/library/fp161516.aspx . For this demo I am going to select ‘Create DLP Query’.


Figure 7 – The eDiscovery home page

 

In the ‘Search and Export’ page you need to create a ‘New Item’. In the new item page you will now see the DLP templates that we mentioned earlier. For this test we are going to select U.K. Financial Data as that is the data we copied earlier for the credit cards.


Figure 8 – New DLP Query

 

You will also notice on this template selection page that you can now choose how many instances of the particular sensitive data type must appear in a document before it is captured by the query. For example, if you want to be shown all documents in SharePoint that have 2 or more credit card numbers, you need to change this box to 2. Obviously the lower the number, the more potential false positives you could capture; it all depends on the level of identification needed for your company. For this example I will leave it at 2.


Figure 9 – Choose the query match trigger amount

 

Clicking Next takes you to the actual DLP Query page. On this page you will need to define a name for the query, which will be saved for later use, and also the source you want to run the query against. The source can be specific SharePoint site collections or simply all of SharePoint. So give your query a title and then, in the Sources section, click on ‘Modify Query Scope’.


Figure 10 – Modify the search scope

 

**Note** It is possible to amend the query at any time. If, for example, you wish to change the query string to require 5 or more hits instead of the 2 we previously defined, you can edit the query on this page, for example: SensitiveType=”Credit Card Number|5..”. Also notice in the query string that we have added standard query language to include other options, in this case EU Debit Card and SWIFT Code.

 

On the modify Query Scope page select ‘Search Everything in SharePoint’ because at this stage we don’t actually know if we have any breaches in the business so we would like to first find data anywhere that matches our query type.


Figure 11 – Choose Everything in SharePoint

 

Click OK, and now you can click the Search button to run the query.


Figure 12 – Searching for content

 

Once the query has finished you should now get a return of the documents that you uploaded earlier into your sites or in my case my project site and my personal OneDrive.


Figure 13 – Seeing the results

 

So now I have been able to discover exactly where data is being stored that is clearly in breach of my corporate policy. The next step is to start applying a DLP policy to the site collections I want to control. In order to create a policy we first need to create another new site collection that allows me to create and manage policies plus assign them to user site collections.

Again we do this in Central Administration or PowerShell on-premises, or via the SharePoint admin page in Office 365. This time, when selecting a template from the Enterprise tab, you need to select ‘Compliance Policy Center’. You will notice there is also another new template here for In-Place Hold, which is a topic for another article.

Just like the eDiscovery site, you can have as many Compliance Policy Centers as you need; just control access via the site permissions.


Figure 14 – Create a Compliancy Policy Center

 

Once the site collection has been created you can browse to it and you will see there are two distinct sections: Delete Policies and Data Loss Prevention Policies. Both play a different role in data management – one deletes content already in SharePoint based on a defined policy, and the other manages content being put into SharePoint via a policy. The one that we care about here is obviously the DLP policy.


Figure 15 – The Compliancy Policy Center Home Page

 

There are two sides to a DLP policy:

 

  • Policy Management
    • Create and define the policy logic
  • Policy Assignments for Site Collections
    • Assign a policy to specific Site Collections to enforce the policy.

 

Let’s first look at Policy management and creating our first policy. Click on DLP policy Management to open the management list. From here you need to click on ‘new item’


Figure 16 – New DLP Policy Management Item

 

Let’s use our UK credit card scenario again, so I will call this policy UKCCFraud for example and select the same U.K. Financial Data template that we used in the previous eDiscovery query.

Change the number of instances that you want to use to trigger the policy. I will use 2 again in this instance as I know that works based on my previous query. You also need to define a user who will receive incident reports; this could be an email address that is seen by several high-ranking legal people in your organisation, for example, or a compliance officer.

There are now two additional options; neither is mandatory, but both are definitely useful. The first option is to add policy tips. This displays additional information when a user goes to edit an item that is not compliant, stating for example that it is in breach of a policy. This policy tip can be shown in Office, in item preview, or through Office Online applications. We will look at this in more detail shortly. The second option requires policy tips to be on, and also blocks normal users from viewing the content in the SharePoint library. Only the content owner and site owners can then see the item that is in breach of the policy, and it will not be viewable by all users until the content in question is edited or changed. For the purpose of this exercise I will be selecting both.


Figure 17 – Defining Policy options

 

Just click Save to now store the policy.

Now that the policy has been created, the second part of the configuration is to align it to any site collections where we need the policy enforced for SharePoint data. In my example that is a site collection called Projects. Obviously choose one of your own to test this against.

As usual I do need to emphasise that you should be testing this first in a test environment, not in production, whilst you get familiar with the technology.

 

In the Data Loss Prevention home you now need to go to DLP Policy Assignments for Site Collections. On this page you now need to create a ‘new item’.


Figure 18 – select Site Collections

 

On the Site Collection Assignment page you first need to choose a site collection that you want to assign a policy to. Click on ‘First choose a site collection’ and then in the site collection field enter the full URL to your chosen site collection. In my example this is https://intranet.combined.com/projects and then click on the search icon to resolve the site collection URL. Just like the DLP Query you must have crawled your site collections in order to find them.


Figure 19 – Enter a site collection URL and search for it

 

Once the search has resolved your site collection, simply tick the box for the locations where you want to apply the policy. If you have selected a root site collection, all site collections below that path will be shown. For example, if you select a root site collection of https://intranet.combined.com then the other site collections from the managed paths will be shown, such as https://intranet.combined.com/sites/alpha and https://intranet.combined.com/sites/beta


Figure 20 – Choosing from multiple site collections if using a root

 

Now that you have selected your site collection you need to assign it a policy. In the same Site Collection Assignment page click on Manage Assigned Policies. This allows you to choose from one of your DLP management policies; in our case we created the credit card fraud one earlier, so I will select that and click Save to apply the policy.


Figure 21 – Assign a managed policy to a site collection

Finally, click Save to apply the site collection policy. At this point you can rinse and repeat for as many site collections as you wish. You will notice below that I also chose my users’ My Site host, which allows me to select any of my users’ personal sites, and therefore also covers OneDrive data on a per-user basis.


Figure 22 – Assigning DLP Policies to users personal sites

 

Now that you have applied your policies to your site collections, we should be able to see the effect by browsing back to our team site and refreshing the data. In my case I went back to my Projects site, and you can clearly see a new icon over the document that failed the policy test and has been marked as blocked due to being in conflict with a policy.


Figure 23 – DLP Policy applying to a document

 

You can also follow the link to see the policy tip to get more information on the policy breach and then open the item to resolve it.

Remember that the only people who can see these options are the site owners and the document owner; other users don’t even see the document in question.


Figure 24 – DLP Policy Tip

If you select ‘Resolve’ then an additional box appears that allows you to either override the policy, which could have legal ramifications for you personally, or report the issue to a higher administrator and continue.

 

So there you have it, a comprehensive look at some of the new DLP features in SharePoint 2016 Beta 2 – and for those of you with the DLP feature available in your Office 365 tenant, you can follow the same steps. Combine DLP in SharePoint with DLP in Exchange and you have a very solid base for managing your corporate DLP strategy. I hope you have enjoyed following this article and I would love to hear from you and how you are getting on.

steve@combined-knowledge.com

 


About the author

Steve (@stevesmithck) is the owner of Combined Knowledge in the UK and Mindsharp in the US, SharePoint education and productivity companies, since 2003. He has been a SharePoint MVP for the last 10 years and an MCT for 18 years, and has recently been made a Microsoft Regional Director. Steve is also the founder of the UK SharePoint User Group, now in its 10th year, which is the largest active in-person SharePoint User Group in the world, with meetings around the UK throughout the year that are free to everyone: http://www.suguk.org/

 

 

 

Check out the new MVP Channel 9 page!


If you’re an MVP fan, we’re happy to announce a new way to discover great technical tips, tricks and ideas from these community leaders. Last week, Microsoft launched the MVP landing page on its highly popular Channel 9 site, which showcases technical videos, interviews and news from across Microsoft technologies.


You can find a rotation of featured high-quality videos across the top of the page or discover featured events and content highlighted by MVP award categories.

 

MVPs began contributing videos about a month ago and already they have drawn thousands of views. So far, MVPs from a dozen countries have videos featured on Channel 9, with contributions from the Czech Republic and Brazil leading the way. Right now the most popular series is Visual Studio and Development Technologies.

 

If you’re an MVP and would like to submit content to be featured on Channel 9, contact your community program manager. If you’re an MVP fan, keep watching—new videos are coming in nearly every day!

 

 

#FridayFive with a Channel 9 Post!


Enterprise Mobility MVP Freek Berson @fberson: HTML5 for Azure RemoteApp available in public preview!

Windows Development MVP Matías Quaranta @ealsur: Despliegue e integración contínua en Azure

Visual Studio and Development Technologies MVP Dirk Strauss @DirkStrauss: C# REPL – Introducing C# Interactive

Visual Studio and Development Technologies MVP Colin Dembovsky @colindembovsky: Config per Environment vs Tokenization in Release Management

Office Servers and Services MVP Sathish Veerapandian: Troubleshooting addressbook issues in Lync 2013/Skype for Business


Unifying Your Web Dev Skills for Office Add-ins


–And how to build a custom application storage provider for Outlook Add-ins

Editor’s note: The following post was written by Outlook MVP Eric Legault as part of our Technical Tuesday series.

 

Abstract: Using RoamingSettings to store application data for your web-based Outlook add-in is all well and good.  However, it may not take much to exceed the 32KB storage limit, plus you are also left to define the data format as you see fit – which of course means extra work.  Learn how you can overcome this by creating a hidden folder in the user’s mailbox to store and retrieve your application’s business objects as serialized settings in an email message, simply passing JSON objects that get serialized into XML for storage and vice versa.

Technologies: Office 365; Outlook; Office Add-ins; Exchange Web Services; JavaScript; jQuery; JSON; Office UI Fabric; XML

Introduction

You’re a modern web developer. You know JavaScript, HTML 5, CSS+, etc. You may know about the new Office Add-ins (formerly Apps for Office) and how they are built on a new web-based architecture. What you may not know yet is how to tie it all together with your current skillset or how to quickly get started learning unfamiliar APIs so you can build innovative solutions which extend Outlook, Word, Excel etc. both in the desktop clients and the browser versions of these applications across all devices.

As is usual when you’re learning any kind of new technology, finding useful starter articles and/or projects can prove difficult. Hopefully when you’re done reading this you’ll come away with some big and bright lightbulbs going off in your head! I’ll be covering the foundations of using application data storage, managing business objects, performing common messaging operations and building a simple but sharp and responsive UI. The sample project accompanying this article should not only prove to be a very useful learning tool, but can serve as the basis for implementing a custom storage provider which you can easily re-use for your own projects.

The Cool Tech We’ll Use

I’ve got quite the mix in this solution – and I guarantee you’ll come away from this article with some useful APIs, techniques and code that you can steal:

 

    • JavaScript, HTML 5 and CSS (of course)
    • Some jQuery (mainly for selectors, a dialog widget and some Deferreds)
    • Exchange Web Services (aka EWS, but using SOAP and not that fancy managed API – so the hard way)
  • JSON to XML and XML to JSON (thanks to x2js)

 

Here’s a peek at what we’re doing:


Figure 1: The activated “Storage Provider Demo” Outlook add-in

Creating a Better Method for Solution Storage

The Mailbox API already provides a native way to read and write custom application data for Outlook add-ins via the RoamingSettings object (see “Get and set add-in metadata for an Outlook add-in” for a good overview). However, there are limits of 32KB for each setting and 2MB maximum that a single add-in can store. That may sound like enough, but not in the real world! For one add-in I was writing, I was storing mailbox folder paths and PR_ENTRY_ID values for each folder in RoamingSettings, and during testing with my own production mailbox (with 400+ folders) I hit the 32KB limit quickly. I immediately realized this was no good at all and that I needed to build something custom. So I built it!

Using the same approach as RoamingSettings, my custom solution storage provider simply uses a hidden folder in the user’s mailbox and a Post message (you can use an e-mail message if you want – just alter the CreateItem request) to store the required content in the message body. Not only that, it stores it as XML so that you can easily use JSON objects to manage your application settings and data and serialize/de-serialize that data via this hidden storage. All you need to do is use EWS to read and write to the message whenever you want. However, keep in mind that EWS calls in Outlook add-ins limit both requests and responses to 1MB. It may be rare to exceed this limit, but please keep it in mind.

What We’ll Be Doing

  • Using various EWS operations:
    • Creating a folder in the root of a mailbox with CreateFolder
    • Hiding a folder by setting extended properties with UpdateFolder
    • Creating a message with CreateItem
    • Retrieving the body content of an existing message with GetItem
    • Updating the body of an existing message with UpdateItem
  • Persisting JSON objects to XML, and creating JSON objects from XML
  • Keeping the code organized using the jQuery Module Pattern
  • Executing multi-step EWS operations using jQuery Deferreds
  • Theming the UI using Office UI Fabric
  • Effectively displaying application notifications and dialogs in the context of an Outlook add-in
  • Dynamically manipulating HTML elements on the page using jQuery

What You’ll Need

How This Will Work

We’ll create a simple but effective UI that’ll showcase how to:

  • Create a named e-mail folder that’ll contain the hidden solution storage message
  • Use input elements to create some sample JSON business objects in memory and output their XML representation to the page
  • Save and retrieve add-in settings and business logic/XML to both RoamingSettings (to store the IDs for the folder and solution storage message) and our solution storage message
  • Reset add-in settings if needed to start from scratch

Playing Along

If you want to step through the code and see how everything works, download the source code (see links above in “What You’ll Need”) and run the project. If you need help with that, see “Create and debug Office Add-ins in Visual Studio”.

PLEASE NOTE: THE CODE IS NOT AVAILABLE IN THIS ARTICLE: YOU CAN GET IT DIRECTLY AT

https://github.com/elegault/OutlookAddinStorageProviderDemo

You can also install the add-in and run it without needing the source code at all. I’ve published the add-in to a web site in Azure – simply download the manifest file and install it from the Manage add-ins page under Mailbox options in Office 365. This is one way of releasing an add-in without having to publish it to the Office store (as long as you have an https cert).

Core Project Components

All Office Add-in projects in Visual Studio are comprised of two separate projects within the solution: one for the add-in itself (MailAddinStorageProvider) and one for the web component (MailAddinStorageProviderWeb). You can take a look at “Create and debug Office Add-ins in Visual Studio” for a quick walkthrough if you’re a first-timer. The add-in’s project is simple and just contains an XML manifest that’s used to basically declare the add-in’s description and intent. For this project, the main requirements are to:

  • Activate for read messages (as opposed to compose messages)
  • Ask for ReadWriteMailbox permissions (necessary for EWS calls)
  • Request a display size of 450 px (the maximum; we need a lot of space)


The web project really only contains five files that have all our custom guts:

  • /Addins/Outlook/StorageProvider/AppRead/Home.css
  • /Addins/Outlook/StorageProvider/AppRead/Home.html
  • /Addins/Outlook/StorageProvider/AppRead/Home.js
  • /Addins/Outlook/StorageProvider/App.css
  • /Addins/Outlook/StorageProvider/App.js

Note: these folder paths differ from the Visual Studio project template defaults

Files for referenced components are stored in other folders

  • Content: Office and Fabric CSS and scripts
  • Scripts: core JavaScript for Office, jQuery and xml2json libraries


Figure 2: Office Add-in project components in Visual Studio Solution Explorer

Architecting the Solution Storage Provider

The heavy lifting for implementing this custom storage provider is performed by EWS operations. Note that we’re limited to a subset of EWS operations, because not every method that is typically available in the EWS Managed API (which can be used for client applications, but NOT web apps) is supported in Outlook add-ins. The first operation that we need to call is CreateFolder, which will create a folder with the name provided in the Folder Name textbox after you click the “Create Folder” button in the NavBar. We then need to store the Id value for that folder in RoamingSettings so that we know where to create our solution storage message. After that, we need to call CreateItem and once again save the Id for that message in RoamingSettings, so that we can get and update the message body whenever we need to persist our application data via GetItem and UpdateItem calls.

Callout: EWS Operations in a Nutshell

To call an EWS operation from an Outlook add-in, you call the Mailbox.makeEwsRequestAsync method. I’ll use the CreateFolder call as an example of how to construct an EWS request. The initial call is made in the createSolutionStorage feature.
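
Here is a condensed sketch of that call; the folder display name is a placeholder, and in the real project the SOAP string is built by a helper function rather than inline:

```javascript
var request =
    '<?xml version="1.0" encoding="utf-8"?>' +
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"' +
    ' xmlns:m="http://schemas.microsoft.com/exchange/services/2006/messages"' +
    ' xmlns:t="http://schemas.microsoft.com/exchange/services/2006/types">' +
    '<soap:Header><t:RequestServerVersion Version="Exchange2013" /></soap:Header>' +
    '<soap:Body>' +
    '  <m:CreateFolder>' +
    '    <m:ParentFolderId><t:DistinguishedFolderId Id="msgfolderroot" /></m:ParentFolderId>' +
    '    <m:Folders><t:Folder><t:DisplayName>StorageProviderDemo</t:DisplayName></t:Folder></m:Folders>' +
    '  </m:CreateFolder>' +
    '</soap:Body>' +
    '</soap:Envelope>';

// Ask Exchange to run the operation; the callback receives the raw SOAP response.
Office.context.mailbox.makeEwsRequestAsync(request, createSolutionStorageFolderCallback);
```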


The first parameter for makeEwsRequestAsync requires a string in the form of an XML SOAP request with the details of the EWS operation we are asking Exchange to perform; I’m calling the ewsRequests.getCreateSolutionStorageFolderRequest function to build and return that string, helped by the folderName and isHidden parameters for that function (see the sample project for the full request builder).


We have to set the value of the DistinguishedFolderId property accordingly based on whether we want to make the folder hidden or not. If it does not need to be hidden, the folder will be created at the root of the Mailbox and visible within the folder hierarchy in Outlook. However, I recommend for purposes of testing that you do NOT create a hidden folder, as there is no code provided that can delete that folder if you want to reset your solution storage and start from scratch. To reset your storage with a visible folder, you can delete the folder in Outlook (and hence delete the hidden message) and then make sure to click the “Clear Settings” button to remove the Ids for the folder and message from RoamingSettings (the named settings themselves are deleted, not just the values). If you do want to delete the hidden folder, you can use OutlookSpy, MFCMAPI or the EWSEditor to find that folder in the MsgFolderRoot and delete it using those utilities.

The second parameter is the callback function that will read the XML response returned from Exchange; in this case, the createSolutionStorageFolderCallback function:

code 3.1

code 3.2
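A simplified version of such a callback might look like the following; it uses jQuery to parse the SOAP response, and the real callback checks more response scenarios than are shown here:

    ewsCallbacks.createSolutionStorageFolderCallback = function (asyncResult) {
        if (asyncResult.status === Office.AsyncResultStatus.Failed) {
            console.log('CreateFolder failed: ' + asyncResult.error.message);
            return;
        }
        // asyncResult.value holds the raw SOAP response XML as a string
        var $response = $($.parseXML(asyncResult.value));
        // The double backslash escapes the namespace prefix for the jQuery selector
        var responseCode = $response.find('m\\:ResponseCode, ResponseCode').first().text();
        if (responseCode !== 'NoError') {
            console.log('CreateFolder returned ' + responseCode);
            return;
        }
        // Grab the Id of the new folder and persist it for the later CreateItem call
        var folderId = $response.find('t\\:FolderId, FolderId').first().attr('Id');
        Office.context.roamingSettings.set('solutionStorageFolderId', folderId);
        Office.context.roamingSettings.saveAsync(solutionStorage.saveMyAddinSettingsCallback);
    };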

Note the checks for the various response scenarios. Ideally the error paths will never be hit, but you'll find these checks invaluable when you're first trying out any kind of EWS operation, as they almost never work on the first run! If everything works as it should, the result of our call will contain the FolderId value for the new folder in the XML response body, which we can then persist in RoamingSettings and use later when we need to create the storage message in that folder. A typical response body looks something like this:

code 4
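For reference, a successful CreateFolder response generally takes this shape (SOAP envelope omitted; Id and ChangeKey values truncated):

    <m:CreateFolderResponse>
      <m:ResponseMessages>
        <m:CreateFolderResponseMessage ResponseClass="Success">
          <m:ResponseCode>NoError</m:ResponseCode>
          <m:Folders>
            <t:Folder>
              <t:FolderId Id="AAMkADhi..." ChangeKey="AQAAABYA..."/>
            </t:Folder>
          </m:Folders>
        </m:CreateFolderResponseMessage>
      </m:ResponseMessages>
    </m:CreateFolderResponse>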

Ready to Store Your Data!

Now that we have our solution storage folder created, go ahead and create some business objects! Enter an Artist Name, select a genre and click the “Add Artist” button:


Figure 3: Using the add-in’s UI to generate business objects and XML

Behind the scenes, the addArtist() function creates a new instance of a Band object, adds it to our solutionStorage.applicationData.FavoriteBands object and then passes solutionStorage.applicationData to the X2JS.json2xml_str function, which returns the data as serialized XML:

code 5
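Here's a minimal sketch of that flow; the Band property names, element IDs and collection shape are assumptions based on the UI, not the sample's exact code:

    // Simple constructor for the Band business object (property names assumed)
    function Band(artistName, genre) {
        this.ArtistName = artistName;
        this.Genre = genre;
    }

    function addArtist() {
        // Read the values entered in the add-in UI (element IDs are illustrative)
        var band = new Band($('#artistName').val(), $('#genreSelect').val());
        // Add the new business object to the in-memory collection
        solutionStorage.applicationData.FavoriteBands.push(band);
        // Serialize the whole object graph to an XML string with the x2js library
        var x2js = new X2JS();
        return x2js.json2xml_str(solutionStorage.applicationData);
    }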

Here’s what our FavoriteBands collection of Band objects looks like in XML:

code 6

Now that we have the XML, we can push it up to storage: go ahead and click the "Update Storage" button. This runs the updateStorage() function, which is set up to detect whether this is the first time we're using the solution storage: if it is, we create the storage item first; otherwise we just update it. Since this is our first time, the function calls solutionStorage.createStorageItem(), which in turn issues an EWS request for CreateItem. When we read the response in solutionStorage.createStorageItemCallback, we can grab the Id of the new message and persist it to RoamingSettings. Later calls to update storage use the EWS UpdateItem operation instead.

The CreateItem and UpdateItem calls are similar in that both take our business objects (stored in solutionStorage.applicationData) and pass them to the X2JS.json2xml_str function, which conveniently returns the XML string we can store in the message body:

code 7
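A condensed sketch of that create-or-update decision (setting names again assumed):

    solutionStorage.updateStorage = function () {
        // Serialize the current business objects to XML for the message body
        var x2js = new X2JS();
        var xmlBody = x2js.json2xml_str(solutionStorage.applicationData);
        var itemId = Office.context.roamingSettings.get('solutionStorageItemId');
        if (!itemId) {
            // First use: create the storage message via EWS CreateItem and save its Id
            solutionStorage.createStorageItem(xmlBody);
        } else {
            // The storage message already exists: update its body via EWS UpdateItem
            solutionStorage.updateStorageItem(itemId, xmlBody);
        }
    };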

As you can see in the folder that we created, the XML markup is happily living in the message body of the Post message that’s being used for solution storage:


Figure 4: Data for the add-in stored as XML in an Outlook Post item

The UI

When I first built this, it looked rather plain with standard HTML controls. I wanted it to look more modern and quickly realized this is exactly what the Office UI Fabric is for. With minimal effort you can not only adopt the Office Design Language so that your add-in blends right in with the Office application you're extending, but also ensure that your UI is responsive and "mobile first". All that's needed is to apply some Fabric CSS classes (and implement some JavaScript where required for certain components).

To get started, simply add some references to the Fabric CSS via CDN (or host them yourself) in the head of the HTML:

code new 1
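At the time this sample was written, the Fabric 1.0 CDN references looked roughly like this; treat the URLs as illustrative and check the current Office UI Fabric documentation for the latest links:

    <link rel="stylesheet" href="https://appsforoffice.microsoft.com/fabric/1.0/fabric.min.css">
    <link rel="stylesheet" href="https://appsforoffice.microsoft.com/fabric/1.0/fabric.components.min.css">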

Then apply a Fabric class to make otherwise bland HTML controls like buttons and dropdowns jump out, such as in the controls we’re using to add and remove business objects:

code 8
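For example, a Fabric-styled button and dropdown are just standard markup with the Fabric 1.0 classes applied (the IDs and options here are illustrative, and the dropdown needs the Fabric dropdown plugin for its full styling):

    <button class="ms-Button" id="addArtistButton">
      <span class="ms-Button-label">Add Artist</span>
    </button>
    <div class="ms-Dropdown">
      <select class="ms-Dropdown-select" id="genreSelect">
        <option>Rock</option>
        <option>Jazz</option>
      </select>
    </div>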

Code Review Time

Those of you who are traditional desktop or Office developers (hello COM Add-ins! You’re not dead yet!) may have a bit of a learning curve when it comes to the bread and butter of modern web application development, especially JavaScript and jQuery. So it may help to go through a brief review of how those APIs were used effectively in the context of this solution.

Adherents of object-oriented programming will notice a familiar structure in how the solutionStorage feature was designed. I followed the jQuery Module Pattern and broke the core methods and properties down into loosely coupled units of functionality as best I could. The key constructs are listed here, and a skeleton of the pattern follows Figure 5:

  • ApplicationData, FavoriteBands and Band object functions
  • Features for EWS operations (ewsCallbacks and ewsRequests) and the core [module] function used by all Office add-ins
  • The solutionStorage feature with core properties and functions for:
    • clearSettings()
    • createSolutionStorageFolder()
    • createStorageItem()
    • getStorageIds()
    • getStorageItem()
    • saveMyAddinSettingsCallback()
    • saveSettings()
    • updateStorageItem()


Figure 5: The Module Pattern with core features
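Here's a stripped-down skeleton of that structure, using the member names from the list above with the bodies omitted (it assumes an ApplicationData constructor defined elsewhere):

    // Revealing module pattern: private state, public API returned at the end
    var solutionStorage = (function () {
        var applicationData = new ApplicationData();  // business object graph
        var storageFolderId = null;                   // cached from RoamingSettings
        var storageItemId = null;

        function createSolutionStorageFolder(folderName, isHidden) { /* ... */ }
        function createStorageItem(xmlBody) { /* ... */ }
        function getStorageItem() { /* ... */ }
        function updateStorageItem(itemId, xmlBody) { /* ... */ }
        function getStorageIds() { /* ... */ }
        function saveSettings() { /* ... */ }
        function clearSettings() { /* ... */ }
        function saveMyAddinSettingsCallback(asyncResult) { /* ... */ }

        // Only these members are exposed to the rest of the add-in
        return {
            applicationData: applicationData,
            createSolutionStorageFolder: createSolutionStorageFolder,
            createStorageItem: createStorageItem,
            getStorageItem: getStorageItem,
            updateStorageItem: updateStorageItem,
            getStorageIds: getStorageIds,
            saveSettings: saveSettings,
            clearSettings: clearSettings,
            saveMyAddinSettingsCallback: saveMyAddinSettingsCallback
        };
    })();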

One thing that’s very handy when dealing with EWS calls is being able to cleanly execute multi-step EWS operations using jQuery Deferreds. I’m not a fan of calling a second EWS call from within an EWS callback function as your code can get very messy very quickly. By using Deferred objects you can have more legible code through chainable constructors to better organize your asynchronous calls. A good example of how this is used in our sample project is in the createFolder function:

code 9
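A trimmed-down sketch of that approach: wrap makeEwsRequestAsync in a jQuery Deferred so the calls can be chained (the hide-folder request builder shown here is hypothetical):

    // Wrap the callback-based EWS API in a promise so calls can be chained
    function makeEwsRequestDeferred(soapRequest) {
        var deferred = $.Deferred();
        Office.context.mailbox.makeEwsRequestAsync(soapRequest, function (asyncResult) {
            if (asyncResult.status === Office.AsyncResultStatus.Succeeded) {
                deferred.resolve(asyncResult.value);
            } else {
                deferred.reject(asyncResult.error);
            }
        });
        return deferred.promise();
    }

    function createFolder(folderName, isHidden) {
        $.when(makeEwsRequestDeferred(
                ewsRequests.getCreateSolutionStorageFolderRequest(folderName, isHidden)))
            .then(function (response) {
                // Decide here, outside the first callback, whether a second call is needed
                if (isHidden) {
                    return makeEwsRequestDeferred(
                        ewsRequests.getMakeFolderHiddenRequest(response)); // hypothetical builder
                }
            });
    }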

After we make the call to create the folder in the "when()" branch, we wait until the "then()" branch executes to decide whether we need a second EWS call to make the folder hidden. This is a better approach than making that evaluation inside the first callback within the "when()" branch. Deferreds quickly become essential if you need to make three or more EWS calls in a row (it happens!), and the compactness of the code is far better than navigating among multiple methods when coding or debugging.

What’s Next?

Feel free to rip out the solutionStorage feature for your own projects and extend or modify it as you see fit. You may require multiple storage items for advanced scenarios, or need to use JSON strings instead of XML for your application data. Take the UI examples even further with other cool Fabric components like the Panel, DatePicker, PersonaCard or ListItem. Or borrow the code organization patterns and EWS XML requests for your own unique solution. You have your wicked web dev skills already, and now you hopefully have a good foundation for taking what you know to the next level with Office add-ins. Remember, you can get the code for this at https://github.com/elegault/OutlookAddinStorageProviderDemo. Cheers!

Eric, MVP plus rocker

About the author

Eric is the owner/principal consultant of Eric Legault Consulting Inc. He has nearly 20 years of professional consulting and development experience, has been a Microsoft MVP for Outlook for 12 years and is the co-author of Microsoft Office Professional 2013 Step By Step from Microsoft Press.  He has developed dozens of Outlook add-ins and other custom Office solution projects over the years, and is known as a “full stack” product specialist with experience developing, managing and marketing products for enterprise and commercial markets.  He has deep expertise with the Outlook Object Model, Office desktop and web Add-ins (including Office 365 REST APIs and Office for JavaScript), VSTO, Add-in Express, Redemption and Exchange Web Services and has spoken on Outlook and Office programming at conferences around the world (Sweden, Las Vegas, Montreal, Winnipeg, Regina).

 
