The Microsoft MVP Award Program Blog

Integrating SemanticZoom with Hub in Windows 8.1


Editor’s note: The following post was written by Client Development MVP Houssem Dellai

 Integrating SemanticZoom with Hub in Windows 8.1

1-    Introduction

Windows 8.1 comes with some cool new controls. One of them is the Hub. The Hub makes it easier to design grouped items than it was in Windows 8.0: each group can now be designed independently of the others, without using a CollectionViewSource or writing additional classes to display a group's items in different sizes.

2-    Problem

One of the common tasks for developers was integrating the SemanticZoom with groups backed by a shared CollectionViewSource. Now they want to use the Hub, without the CollectionViewSource. In that case, they may notice that, in the ZoomedOutView, tapping a section name returns to the ZoomedInView but doesn't scroll to the right HubSection. Instead, it always scrolls to the first HubSection, as shown in Figure 1!

 

Figure 1. Clicking any Hub Section always scrolls to the first Hub Section.

 

3-    Solution

Let's make each item of the ZoomedOutView point to the right HubSection in the ZoomedInView. For that, we'll use the Hub's ScrollToSection(HubSection section) method. All we have to do is pass the HubSection we want to scroll to as the parameter.

Following is the Tapped event handler that recognizes the requested HubSection and scrolls to it.

 

 

private async void GoToSection_Tapped(object sender, TappedRoutedEventArgs e)
{
    await Task.Delay(200); // This delay is relevant for the success of the scrolling.

    // Get the title of the HubSection from the tapped TextBlock
    var textBlock = e.OriginalSource as TextBlock;
    if (textBlock == null) return;
    var sectionName = textBlock.Text;

    // Depending on the title, scroll to the related HubSection.
    switch (sectionName)
    {
        case "Hub Section 1":
            MainHub.ScrollToSection(MainHub.Sections[0]);
            break;
        case "Hub Section 2":
            MainHub.ScrollToSection(MainHub.Sections[1]);
            break;
        case "Hub Section 3":
            MainHub.ScrollToSection(MainHub.Sections[2]);
            break;
        case "Hub Section 4":
            MainHub.ScrollToSection(MainHub.Sections[3]);
            break;
    }
}

 

Figure 2. Clicking on “Hub Section 2” scrolls to the second Hub Section.
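If you prefer not to hard-code the section titles, the switch statement can be replaced by a lookup over MainHub.Sections. The following is a minimal sketch under the assumption that each HubSection's Header is the same string shown in the ZoomedOutView item:

private async void GoToSection_Tapped(object sender, TappedRoutedEventArgs e)
{
    await Task.Delay(200); // Give the ZoomedInView time to activate before scrolling.

    var textBlock = e.OriginalSource as TextBlock;
    if (textBlock == null) return;

    // Scroll to the first HubSection whose Header matches the tapped text.
    foreach (HubSection section in MainHub.Sections)
    {
        if (string.Equals(section.Header as string, textBlock.Text))
        {
            MainHub.ScrollToSection(section);
            break;
        }
    }
}

This variant keeps working when sections are added or renamed, as long as the ZoomedOutView items and the section headers stay in sync.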

 

4-    Conclusion

This article shows how easy it is to take advantage of SemanticZoom in Windows Store apps. Since users really enjoy these touch gestures on their tablets, it's a useful feature to support.

About the author

Houssem is a Software Engineer specialized in client development. He develops Windows 8 and Windows Phone apps that use Windows Azure for backend services. He also gives technical trainings, speaks at international conferences, and writes articles. You can find more on his website or follow him on Twitter!

The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.


MVP Developed Coloris App Shines in Windows Store

After attending a Windows 8 UX Design Camp in Japan, MVPs Yutaka Tsumori, Yu Mitsuba and Akira Hatsune discovered a need for customizable color options when creating Windows Phone apps.  The MVP group wanted to increase the color functionality of backgrounds and buttons for developers in the Windows Store.
"I felt that color was a crucial element for well-balanced store app design," said Visual Basic MVP Akira Hatsune. "And there were few applications that deployed visually nice colors."
 
Enter Coloris, a Windows Phone app that enables users to more easily adjust in-app coloring and design. Hatsune, who is no stranger to the Windows Store, having developed 10 previous apps, collaborated with Client Development MVP Yu Mitsuba and Windows Embedded MVP Yutaka Tsumori to deliver Coloris to app creators.
 

At the November 2013 MVP Summit, Hatsune, Tsumori and Mitsuba took home the top prize during the MVP Showcase, beating 41 other MVP entries.  "MVPs visited us and said, 'Yes! This is exactly what I needed!'" said Hatsune.  To see more from the MVP Showcase, click here.

When asked about the MVP Award community and the advantages of being an MVP, Hatsune said, "For me, the Microsoft MVP community is a place where we can communicate with all the geeks around the world.  Being an MVP means having a strong pipeline with Microsoft developers and other excellent MVPs."

To download or discover more about Coloris, visit the Windows Store.  

Top 5 MVP Monday Posts of 2013


1. Virtual Directories: Exchange 2013

By Exchange Server MVP Manu Philip

A virtual directory is used by Internet Information Services (IIS) to allow access to web applications in Exchange 2013.

Autodiscover, ECP, EWS, ActiveSync, OWA, OAB, and PowerShell are the virtual directories available through the EAC.

You can manage a variety of virtual directory settings in Exchange 2013, including authentication, security, and reporting settings. Here I explain how you can manage the virtual directories through the Exchange Admin Center, and I have also included some example PowerShell cmdlets to show how to manage those resources: Click here to read the full post.

2. Windows Azure Service Bus Connection Quotas

By Microsoft Integration MVP Damir Dobric

In the era of devices and services, Windows Azure Service Bus will definitely play a more important role. However, the huge number of devices which can connect to the bus, in both practical and hypothetical scenarios, might be an issue. Handling a huge number of connections is not an easy task and can bring any technology to its limits.

Windows Azure Service Bus is designed to handle a huge number of connections. We know Service Bus provides the solution, but in a typical scenario you cannot simply connect millions of devices to it and expect that it just works. Fortunately, for some specific scenarios there are solutions which work out of the box. A good example is Service Bus Notification Hubs. Click here to read the full post.

3. Introduction to C++ 11 in Visual Studio 2013

By Visual C++ MVP Alon Fliess

I just had a conversation with one of my colleagues. He told me “I have started looking at C++". "I didn’t realize that it is such a productive language”, he added. You see, my colleague is a gifted C# developer, and he knew that C++ is an "old" language that one uses to program Operating Systems, Drivers, High Performance algorithms, communicate with hardware devices and make your life interesting but also complicated. My friend was born the year that Bjarne Stroustrup invented C with Classes, the first name that he gave to C++, at AT&T Bell laboratories.

For a C# developer, C++, in many cases, is the legacy code that you need to interop with. For me and many other veteran developers, C++ is one of the sharpest tools in our toolbox. As a Windows developer, I tend to choose the right tool for the job, be it C++ native code, C# with .NET or even JavaScript. Click here to read the full post. 

4. Enhanced Presenter View in PowerPoint 2013

By PowerPoint MVP Geetesh Bajaj

Most presenters just cram their slides with text – you may have seen such slides often, characterized by so much text that they look like a Word document repurposed as a slide – or even worse, it may appear as if someone just copied tons of data from an Excel sheet and put it on a single slide! Of course, each of these slides would receive awards for competing in a “Fill-up-your-slide” contest.

OK, there’s no such contest – yet there are entrants for such contests everywhere. So the question that needs to be asked is why do presenters assume that their slides need so much text? There are several answers – and most of these get repeated each time I ask this question in my training sessions: Click here to read the full post. 

5. Using SharePoint PropertyBag in the Context of Search

By SharePoint MVP Nicki Borell

The Property Bag is a “store” within SharePoint which can be used to hold information and metadata. The Property Bag is a hierarchical structure starting at the farm level and going down to the list level. Microsoft itself uses the Property Bag to store configuration settings and information. For details, see the MSDN article: Managing SharePoint Configuration

For general information about the Property Bag, please refer to these MSDN pages:

Click here to read the full post


Congratulations New and Renewed MVPs! - Happy New Year


Today, 1,011 exemplary community leaders around the world were notified that they have received the MVP Award! These individuals were chosen because they have demonstrated their deep commitment to helping others make the most of their technology, voluntarily sharing their passion and real-world knowledge of Microsoft products with the community.

While there are more than 100 million social and technical community members, only a small portion are selected to be recognized as MVPs. Each year, around 4,000 MVPs are honored. They are nominated by Microsoft, other community individuals, or in some cases themselves. Candidates are rigorously evaluated for their technical expertise, community leadership, and voluntary community contributions for the previous year. They come from more than 90 countries, speak over 40 different languages, and are awarded in more than 90 Microsoft technologies. Together, they answer more than 10 million questions a year!

MVPs are recognized each quarter for this annual award, which continues to grow and evolve to reflect the development of Microsoft technologies.

Congratulations to the new MVPs, and welcome back to renewed MVPs. We are very excited to recognize your amazing accomplishments!

Using ObjectDataSource with ASP.NET ListView for Entity Framework 6


Editor’s note: The following post was written by Visual C# MVP Ming Man Chan

Using ObjectDataSource with ASP.NET ListView for Entity Framework 6

This article consists of four subsections:

  • Create an ADO.NET Entity Data Model
  • Add a model class for Object Data Source binding
  • Add a Web Form and ListView
  • Add and configure Object Data Source and ListView

When you try to create an EntityDataSource in Visual Studio 2013 you will get this error:

The provider did not return a ProviderManifest instance in VS 2013 and EntityFramework 6

One way to resolve this is to use an ObjectDataSource instead.

Create an ADO.NET Entity Data Model

 

Follow the steps below to create an ObjectDataSource that utilizes Entity Framework 6.

  1. Click on File menu in Visual Studio 2013.
  2. Point to New.
  3. Click Project... to open the New Project dialog.
  4. Select the Web template.
  5. Select ASP.NET Web Application.
  6. Type the project name for example WebAppNWEF6.
  7. Click on OK in New Project.
 
8.  Choose Web Forms. Click on OK in "New ASP.NET Project - WebAppNWEF6” dialog.

 

The project is now created.

 

9.  Right click on WebAppNWEF6 (tree view item).

10. Select Add -> New Item

 

11. Select Data then ADO.NET Entity Data Model.

12. Type in the Name for example, NWModel.edmx.

13. Click on Add button.

 

14. Click on Next > button.

 

 15. Click on Which data connection should your application use to connect to the database? (combo box) in Entity Data Model Wizard.

 

16. Click on New Connection... button in Entity Data Model Wizard.

 

17. In "Connection Properties", type the Server name, for example .\SQLEXPRESS.

 

18. Click on Open button in "Connection Properties".

 

19. In this sample, select the northwind database in the list.

20. Click on OK button in "Connection Properties".

 

21. Click on Next > button in Entity Data Model Wizard.

 

22. Click on "Tables (tree item)" in Entity Data Model Wizard.

23. Click on dbo (tree item) in Entity Data Model Wizard.

24. Select the Products table.

25. Click on Finish button in Entity Data Model Wizard.

The ADO.NET Entity Model is now created.

26. Click on the Build menu item to build your project.

Add a model class for Object Data Source binding

  1. Right click on the Models folder.
  2. Select the Add -> Class… menu item.
  3. Type in the class name, for example ProductModel.cs. Click Add.

The class is created as follows.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;

namespace WebAppNWEF6.Models
{
  public class ProductModel
  {
  }
}

We will now add the Select, Update, Insert, and Delete methods to the class.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;

namespace WebAppNWEF6.Models
{
  public class ProductModel
  {
    //Declare the context
    static northwindEntities ctx = new northwindEntities();

    //Method to retrieve all the records
    public static IEnumerable<Product> GetAllProducts()
    {
      var result = from r in ctx.Products select r;

      return result;
    }

    //Add record
    public static void AddProduct(Product product)
    {
      ctx.Products.Add(product);
      ctx.SaveChanges();
    }

    //Update record
    public static void UpdateProduct(Product product)
    {
      var result = from r in ctx.Products where r.ProductID == product.ProductID select r;

      result.FirstOrDefault().ProductName = product.ProductName;
      result.FirstOrDefault().Discontinued = product.Discontinued;

      ctx.SaveChanges();
    }

    //Delete record
    public static void DeleteProduct(Product product)
    {
      var result = from r in ctx.Products where r.ProductID == product.ProductID select r;

      ctx.Products.Remove(result.FirstOrDefault());
      ctx.SaveChanges();
    }
  }
}
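Note that the sample keeps a single static northwindEntities context for simplicity. In a real application you would typically create a short-lived context per operation; as a minimal sketch (reusing the generated northwindEntities class and the usings shown above), UpdateProduct could be written this way:

public static void UpdateProduct(Product product)
{
  // Create and dispose a context per operation instead of sharing a static one.
  using (var ctx = new northwindEntities())
  {
    var existing = ctx.Products.FirstOrDefault(p => p.ProductID == product.ProductID);
    if (existing == null) return;

    existing.ProductName = product.ProductName;
    existing.Discontinued = product.Discontinued;
    ctx.SaveChanges();
  }
}

The other methods can be adapted in the same way.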

Add a Web Form and ListView

  1. Right click on WebAppNWEF6 (tree view item).
  2. Select Add -> Web Form.
  3. Leave the name as default Webform1.

 

4. Click OK.

 

5. Click on the Toolbox.

6. Go to the Data section. Select the ListView.

 

7. Drag the ListView in between the <div> tags in the WebForm1 Web Form.

 

Add and configure Object Data Source and ListView

  1. Click on Choose Data Source: (split button) in ListView Tasks in the Design view.
  2. Click on <New data source...> list item.
  3. Click on Object (list item) in Data Source Configuration Wizard.
  4. Leave the name as default ObjectDataSource1. Click on OK (button).

 

 5. Click on Open (button) in Configure Data Source - ObjectDataSource1.

6. Select WebAppNWEF6.Models.ProductModel in list item.

 

7. Click on Next > button in Configure Data Source - ObjectDataSource1.

 

8. Click Choose a method list item in Configure Data Source - ObjectDataSource1.

9. Select GetAllProducts() returns IEnumerable<Product>.

 

10. Click on UPDATE tab item in Configure Data Source - ObjectDataSource1.

11. Select UpdateProduct(Product product) in Choose a method list item.

 

12. Click on INSERT tab item in Configure Data Source - ObjectDataSource1.

13. Select AddProduct(Product product) in Choose a method list item.

 

14. Click on DELETE tab item in Configure Data Source - ObjectDataSource1.

15. Select DeleteProduct(Product product) in Choose a method list item.

16. Click on Finish button in Configure Data Source - ObjectDataSource1.

 

17. Click on Configure ListView... (link) in ListView Tasks.

 

18. Click on Enable Editing, Enable Inserting, and Enable Deleting (check box) in Configure ListView.

19. Click on Colorful (list item) in Configure ListView.

20. Click on OK (button) in Configure ListView.

 

Basically everything is set, but do not forget to add the DataKeyNames to the ListView. The DataKeyNames value that we use for the sample is ProductID, the primary key of the table. The HTML tag for the ListView will look as follows.

<asp:ListView ID="ListView1" runat="server" DataSourceID="ObjectDataSource1" InsertItemPosition="LastItem" DataKeyNames="ProductID">

You can now run the project to try the WebForm1 page that you have just created.

About the author

Ming Man has been a Microsoft MVP since 2006. He is a software development manager for a multinational company. With 25 years of experience in the IT field, he has developed systems using Clipper, COBOL, VB5, VB6, VB.NET, Java and C#. He has been using Visual Studio (.NET) since the Beta back in 2000. He and his team have developed many projects on the .NET platform, such as SCM and HR applications. He is familiar with the N-Tier design of business applications and is also an expert with database experience in MS SQL, Oracle and AS 400. Additionally, you can read Ming's Channingham's blog.


Friday Five - January 17, 2014

Identity in your own apps with Windows Azure Active Directory


Editor’s note: The following post was written by Office 365 MVP Martina Grom and Client Development MVP Toni Pohl. This is the first of a 4 part series

Part 1: Use authentication from the cloud

No user likes it – but everybody needs it: security. Some things never change, not even in IT. When using a Line of Business (LOB) application, you still need to authenticate in the same manner as 20 years ago: you log in to an app with a given username and a password. What has changed are the mechanisms behind the scenes. In former times you could only access your apps within your company network, and later through a Virtual Private Network (VPN) from your location into your firm. Today we're working with identities on a larger level and try to use authentication providers we trust, or our own Active Directory in the cloud, in some cases in a hybrid scenario. The advantage for the user is that he can access apps and services with one single identity – the one he got from the company – and Single Sign-On (SSO) scenarios become possible.

When implementing solutions for our customers, these are mainly LOB apps. In Microsoft Office 365 we benefit from the use of Windows Azure Active Directory (WAAD). In case you didn't know: every Office 365 tenant uses WAAD. So, if you're already working with Office 365, all directory objects are residing in WAAD, which handles authentication in a central object model in the cloud. We can use apps and code to remote-control SaaS. If your company also works with WAAD (or Office 365), you can even use this for the authentication process in your own apps.

 

This is the part where development meets Software as a Service. In our line-up, this is where we both, as MVPs, can combine our knowledge to produce solutions with reliable and secure authentication from the cloud. This is part 1 of a 4-part series where we will show the magic of using Windows Azure Active Directory and Office 365 Software as a Service to automate tasks in a user's enterprise life. In this article we show how to use WAAD as the authorization system in your own website.

Even if you have a hybrid scenario running in your company, it's the same story. By now there are many thousands of WAADs running in Microsoft's datacenters. The benefit is that enterprise IT doesn't have to care about security, maintenance, backup and scalability; it's all done for you by Microsoft – and it's free. Even without an Office 365 subscription you can use WAAD at no cost.

For apps, the authorization flows like this: a user requests access from an authority that handles the authentication. The app must be known to the authority. Upon successful authentication, the browser gets redirected back to the web app along with a security token (T in the graphic below) carrying information about the incoming user.

 

Image source: http://msdn.microsoft.com/en-us/library/windowsazure/dn151790.aspx

So, let's use this service for our authorization. The goal is to create a new ASP.NET website with functionality to automate Office 365 services: authentication, SharePoint Online and Exchange Online. The best part at the beginning: you don't have to write a single line of code to accomplish this task. First, let's start with the prerequisites. You need

  • Visual Studio 2013
  • A Windows Azure Active Directory (WAAD)

The WAAD can be created in the Windows Azure Portal https://manage.windowsazure.com or in the Office 365 portal https://portal.microsoftonline.com. For implementing the later steps we need an Office 365 tenant, so we recommend creating a new free 30-day subscription like here: https://portal.microsoftonline.com/partner/partnersignup.aspx?type=Trial&id=d4424e90-7069-4148-ad5d-4871a577929f&msppid=575861 . Fill out the form and you get your Office 365 tenant instantly.

Once you have your administrator login, start Visual Studio 2013. Click File / New / Project. In the New Project dialog select Visual C# / Web and ASP.NET Web Application, choose your project location and click OK.

Here comes the important part: Choose the MVC template and click Change Authentication on the right side. Now you can use the (new) VS 2013-wizard to connect to a specific domain in WAAD.

 

Change Authentication has options for several authentication scenarios: No Authentication, Individual User Accounts, Organizational Accounts or Windows Authentication. We want to use authentication against an Office 365 domain, so we select Organizational Accounts. On the right side we use Cloud – Single Organization and enter our domain. This can be a custom domain (added and verified in the Office 365 portal) or the predefined onmicrosoft.com domain. Access Level defines whether we only need to authenticate with Single Sign-On, whether we want to read AD objects, or whether we want to read and write to the directory. Let's check the last option for reading and writing.

 

Below Visual Studio generates an App ID URI for our application. This can be any string to identify our app within the WAAD, but it must be in URL-format, like http://mycompany/myapp1 or similar. We leave the generated name and click OK.

Now you need to login with any admin-account from that domain. In our case we use our admin@mvpdemo2014.onmicrosoft.com-account and our password. After this login VS returns to the new portal site. The authentication is already filled in. Click OK in the dialog and Visual Studio handles all the rest.

 

What's happening behind the scenes is that VS now installs the NuGet package “Active Directory Authentication Library” for the authorization. This package contains the Windows Azure AD Authentication Library for .NET (ADAL). ADAL simplifies the token handling for developers when working against the GraphAPI of WAAD. Windows Azure AD Graph provides programmatic access to Windows Azure Active Directory (AD) through REST API endpoints. The second thing the wizard did was to create a new application in our WAAD; we'll look at this later. This is what our ASP.NET MVC solution looks like in VS 2013:

 

After VS created the project for us, let's have a look into web.config. First, we got two new sections, system.identityModel and system.identityModel.services, with some data in them. AppSettings has been extended with keys like “ida:FederationMetadataLocation” and so on. The prefix “ida” stands for Identity and Authorization. These values are important for identifying our app. ida:ClientID holds our application ID and ida:Password is our secret. By submitting these values we ensure that only our app, which knows the secret, can access the WAAD – and no other app with a missing or wrong secret (don't give the secret away; you can generate a new secret if yours ever goes public). These settings are stored in the /applications node of the domain in WAAD. ida:AudienceUri and ida:Realm are the unique ID of our app (in URL format). ida:FederationMetadataLocation is the address of the XML document where we get the endpoints for login and logout. If you open the address you see a lot of information for the WS-Federation method. And we have a database connection like …|DataDirectory|\aspnet-MyPortal-<datetime>.mdf… for storing some IDs.
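For illustration, these ida:* values can also be read from code through System.Configuration. The following is a minimal sketch, assuming the key names listed above (the actual names are whatever the wizard generated for your project):

using System.Configuration;

// Minimal helper that surfaces the WAAD-related settings the wizard wrote to web.config.
public static class WaadSettings
{
    public static string ClientId
    {
        get { return ConfigurationManager.AppSettings["ida:ClientID"]; }
    }

    public static string Realm
    {
        get { return ConfigurationManager.AppSettings["ida:Realm"]; }
    }

    public static string FederationMetadataLocation
    {
        get { return ConfigurationManager.AppSettings["ida:FederationMetadataLocation"]; }
    }
}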

If you open global.asax, in Application_Start() some routines are called. The most interesting part is IdentityConfig.ConfigureIdentity();. If you jump to the definition with the F12 key, \App_Start\IdentityConfig.cs is opened. Here you can see the usage of the ida:* AppSetting keys. Back in global.asax, the procedure void WSFederationAuthenticationModule_RedirectingToIdentityProvider(…) is the part where the app does the redirection to the authentication provider. Because our web.config declares with <system.web>, <authorization>, <allow users="*" /> that only authenticated users shall get access to our portal, the ASP.NET system sees that authentication is needed and uses the information in our config to redirect to the defined provider – in our case Microsoft WAAD.

Of course all project properties are also set. Our localhost is using SSL on a new port (here 44324). This address is also saved in the WAAD in our application – it´s the return address if the authentication worked.

 

Again, the nice thing: the wizard performed all the necessary steps for us and we can simply use the functionality now. So, after looking around in the web app, simply hit F5 to run the app.

After the browser opens, the redirection takes place and the login page of WAAD follows. Depending on the speed, you see the opening of https://localhost:44324/ (our app) followed by the URL of the WAAD login page, like https://login.microsoftonline.com/login.srf?wa=wsignin1.0&wreply=https%3a%2f%2flogin.windows.net%2f... . If the user authenticates correctly with his login and password – followed by the wsfed document URL – the start page of our MVC app (Views/Home/Index.cshtml) follows. If the login was not correct, it simply ends here.

 

If you capture the HTTP traffic with a proxy tool like Fiddler, you see the different URLs and what happens here. Yes, all traffic is running over HTTPS (and some calls on port 443, which is the same secure way). Finally we end at https://localhost:44324 with HTTP status code 200 (OK) – our app.

 

So the login worked out of the box.

Another interesting part is what happens in the app. Well, it's a standard app with almost no functions, but you can click on the logged-in user name and – voilà – we get data of the user object. The app shows the display name, first name and last name.
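The data shown for the user comes from the claims in the token issued by WAAD. As a minimal sketch (assuming the standard claim types are present in the token), those values can be read in an MVC controller like this:

using System.Security.Claims;
using System.Web.Mvc;

public class ProfileController : Controller
{
    public ActionResult Index()
    {
        // With WS-Federation configured, the authenticated user is a ClaimsIdentity.
        var identity = User.Identity as ClaimsIdentity;
        if (identity == null) return new HttpUnauthorizedResult();

        var givenName = identity.FindFirst(ClaimTypes.GivenName);
        var surname = identity.FindFirst(ClaimTypes.Surname);

        ViewBag.DisplayName = identity.Name;
        ViewBag.GivenName = givenName != null ? givenName.Value : string.Empty;
        ViewBag.Surname = surname != null ? surname.Value : string.Empty;

        return View();
    }
}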

 

So, our conclusion for part 1 is that we don't need to write any code to use WAAD in our own web applications. The Visual Studio 2013 team did great work, and the wizard and ASP.NET do all the steps for us if we simply provide the information about which domain we want to use in our app to protect it with a robust, secure login mechanism.

In part 2 we´ll have a deeper look and see what happened in our new WAAD, what we can do there and how GraphAPI works!

 

About the authors

Toni Pohl has worked for nearly 20 years as an independent IT expert. Toni began his IT career in the early 1990s as a trainer. After working as a trainer, his emphasis changed to the field of consulting and developing software solutions using database systems. In 1999 Toni Pohl founded atwork information technology group along with Martina Grom. They share the business administration. He is responsible for new business development, technology, and quality assurance. Toni Pohl has been awarded in 2013 by Microsoft with the Most Valuable Professional Award (MVP) for his expertise in Client Development. Toni blogs about new technologies for the atwork-blog, in Microsoft-TechNet-Team-Austria Blog, in codefest.at, the Austrian MSDN Blog and in cloudusergroup and is writing technical articles for various magazines. You can connect with him on Twitter, Facebook, or Linkedin.


Martina Grom works as an IT consultant and is co-founder and CEO of atwork information technology. atwork is located in Austria/Europe and is specialized in the development of online solutions. Martina and Toni founded atwork in 1999, and she has worked in IT since 1995. Martina is recognized as an expert in Microsoft Office Online Services solutions and was one of the first 8 MVPs worldwide awarded in 2011 for her expertise in Office 365. She writes numerous articles and blog posts. Her passion is online and social media, cloud computing and Office 365. Martina consults companies on their way to the cloud. Martina has a master's degree in international business administration from the University of Vienna. You can connect with her on Twitter, Facebook, Linkedin or join her cloudusergroup.

 


MVP Featured App: NOAA Doppler Radar Mosaic Imagery


As a massive winter storm hits the Mid-Atlantic and Northeast regions of the United States, Windows Entertainment and Connected Home MVP Barb Bowman wants residents to know they can stay informed.

 "The National Oceanic and Atmospheric Administration (NOAA) makes their Doppler animations available but no one, as far as I know, had put together anything comprehensive to access them all as a Windows 8 app. app," said Bowman.  Until now.

Meet "NOAA Doppler Radar Mosaic Imagery."  A new and improved Windows 8 app that allows users to get a complete view of the weather, snow, rain and all. 

"I’d had the idea for this app for a long time," said Bowman, "But never got to the point of sitting down with Visual Studio and putting anything together."  With the launch of Project Sienna, Bowman realized she could easily develop the app.  She reported her first experiment with the functionality of Project Sienna on her blog before designing the first of two weather tracking apps.  Bowman, who has been recognized as an MVP 13 times, received encouragement from fellow MVPs to proceed to phase two of the app by including 155+ individual Doppler stations. 

"Finding the animated images and creating the Excel workbooks was actually the most time consuming part of the project."

You can download NOAA Doppler Radar Mosaic Imagery for free

 

Have you created a new Windows 8 app?  Email your Lead with the details and your app could be featured on the MVP Award Blog


Friday Five - January 25, 2014

Calendar Federation with an Exchange Hybrid


Editor’s note: The following post was written by Office 365 MVP Loryan Strant and Exchange Server MVP Michael Van Horenbeeck

In this blog post Exchange Server MVP Michael Van Horenbeeck and Office 365 MVP Loryan Strant will attempt to bridge the divide between the on-premises world and the cloud – and explain why it doesn't always work as we expect.

A great feature of Exchange Server 2010 and 2013 is the ability to share calendar Free/Busy information with users outside of your organisation. This is also the case with Office 365 – and in fact is even easier for users to do.

However in the situation of a hybrid between Exchange Server and Exchange Online there can be a very small caveat that can cause confusion.

Before going into this let’s take a few steps back.

What is Exchange federation?

There are a number of federation types utilised in Microsoft technologies and specifically Office 365.  Federation ultimately is the creation of a “trust relationship” between two organisations so that information can be shared between them.

Similar to the way Active Directory Federation Services shares security tokens and identity information, and Lync federation allows separate organisations to see each other's presence, setting up a federation between Exchange environments allows calendar Free/Busy information to be exchanged (pardon the pun) between them.

Ultimately this is useful for people in different organisations to see each other's calendar availability and schedule a meeting without having to correspond a great deal beforehand. We've all been in the situation of scheduling a meeting with more than one person in a different organisation – it's not easy. Exchange calendar federation fixes that!

What it’s used for and how it works

While a hybrid effectively joins Exchange Server to Exchange Online to appear as a single environment for a single business – it is important to remember that these are separate environments and that the federation is simply providing a mechanism for information to flow simply and easily between them.

Exchange federation is also used to allow separate businesses to share Free/Busy information between each other. This was covered in a previous blog by Loryan Strant in the early days of Office 365: http://blogs.msdn.com/b/mvpawardprogram/archive/2011/08/08/mvps-for-office-365-establishing-calendar-sharing-between-office-365-customers.aspx

How does Exchange federation work in a hybrid environment

As described earlier, Exchange federation is used to exchange information between two environments. Prior to Exchange Server 2010 if a company wanted to share calendar information with another company, it had to go through a series of steps to set it up. One of these steps was to exchange account information for a service account which would be used to retrieve the requested information. Because this is not always desirable, Microsoft developed a service called the “Microsoft Federation Gateway”, hereafter referred to as the MFG.

The MFG acts as an authentication broker between both environments and explains where the term “Exchange federation” comes from. Requests from one organization to the other are “authenticated” through the MFG, and therefore these requests are federated. It does not matter whether an organization wants to federate with a remote organization or with its hybrid counterpart in Office 365; the principle and how it works are the same. In fact, Exchange treats its hybrid counterpart as if it were a remote organization, which – to be technically correct – it is.

In order to be able to use this free MFG service from Microsoft, one has to set up a ‘trust’ with the MFG. Usually, this is something which has to be done manually. But in the case of a hybrid deployment, the Hybrid Configuration Wizard automatically takes care of this.

In addition to having a trust with the MFG, an organization has to set up a “relationship” with the other organization (or hybrid counterpart) by creating an Organization Relationship. This organization relationship is an object in Exchange which provides Exchange with more information on what URI to use to contact the other Exchange organization and on what information can be exchanged between the two environments.

In a hybrid deployment both the Exchange on-premises environment and Exchange Online have an organization relationship which look similar to what you see in the following image:

 

The on-premises organization has an organization relationship with information which points to Office 365 and the Exchange online organization has information about the on-premises deployment.

Now, let’s have a high-level look at how Exchange uses all these components to query Free/Busy information in a hybrid deployment. Take a look at the following image:

 

  1. User ‘Loryan’ requests Free/Busy information for Michael’s mailbox (michael@contoso.com). This request is made at the on-premises Exchange server.
  2. The Exchange server will look up michael@contoso.com and find a mail-enabled user object. This object has a targetAddress attribute which points to Michael’s mailbox in Office 365 (michael@contoso.mail.onmicrosoft.com).
  3. The Exchange server will now look up its organization relationships and verify that it has one for contoso.mail.onmicrosoft.com.
  4. Now, the Exchange server will contact the Microsoft Federation Gateway and request an authentication token for contoso.mail.onmicrosoft.com.
  5. The MFG sends back a token because contoso.com has a trust relationship with the MFG.
  6. The on-premises Exchange server now uses the information it obtained from the Organization Relationship to do an Autodiscover request for contoso.mail.onmicrosoft.com in order to retrieve the remote Exchange Web Services endpoint it should connect to. It then uses the address it received to send the Free/Busy request.
  7. Exchange Online will first check whether the MFG token which the on-premises Exchange server has sent across is valid before accepting the Free/Busy request.
  8. Exchange Online will also verify its organization relationship so that it knows what information it is allowed to return.
  9. Exchange Online queries Michael’s mailbox for the Free/Busy information.
  10. Exchange Online sends back the Free/Busy information to the on-premises organization.
  11. The on-premises Exchange server sends back the requested Free/Busy information to Loryan.

As you can see, there are quite a few ‘moving parts’ involved in requesting a remote user’s Free/Busy information. The same process is applied when an Exchange Online user queries the Free/Busy information for an on-premises user.

 

External organisational Free/Busy constraints in Exchange hybrid scenario

Now that we know how Exchange queries Free/Busy information, let’s have a look at the following scenario.

Contoso and Paradyne wish to share Free/Busy information with one another. In order to do so, they decide to set up everything that is necessary to make this work using Exchange federation. Paradyne, however, also runs a hybrid deployment. In this particular scenario, Contoso users will not be able to request Free/Busy information for users in Paradyne’s organization if those users are Exchange Online users.

Take a look at the following scenario in which Loryan – working for Contoso – tries requesting Free/Busy information for Michael whose mailbox is hosted in the Exchange Online tenant of Paradyne:

 

 

  1. User ‘Loryan’ requests Free/Busy information for Michael’s mailbox (michael@paradyne.com). This request is made at the on-premises Exchange server.
  2. Given that the Exchange server isn’t authoritative or otherwise configured for paradyne.com it will lookup whether it has an organization relationship for that domain.
  3. Now, the Exchange server will contact the Microsoft Federation Gateway and request an authentication token for paradyne.com.
  4. The MFG sends back a token because there’s a trust relationship with the MFG.
  5. The on-premises Exchange server now uses the information it obtained from the Organization Relationship to do an Autodiscover request for paradyne.com in order to retrieve the remote Exchange Web Services endpoint it should connect to. It then uses the address it received to send the Free/Busy request to.
  6. The Paradyne Exchange server will first check whether the MFG token which the Exchange on-premises has sent across is valid before accepting the Free/Busy Request.
  7. The Paradyne Exchange server will also verify its organization relationship so that it knows what information it is allowed to return.
  8. The Exchange server now looks up the recipient Michael@paradyne.com and discovers that object has a targetAddress stamped to it, which points to the cloud.
  9. The Paradyne Exchange server now sends back that information to the Exchange server at Contoso.

There’s no 10th bullet. The Contoso Exchange server now receives back information which it doesn’t know how to handle. In fact, even if Exchange knew how to handle the information (which isn’t Free/Busy information but rather a redirect), there would still be an issue because the Contoso Exchange organization doesn’t have an organization relationship with the Exchange Online tenant of Paradyne. So, as a result, this scenario is sort of broken.

Workaround

With Office 365 becoming more and more popular, it’s more likely you will encounter such a scenario sooner rather than later. As long as the remote organization is an Exchange Online-only deployment, everything will work fine. In the case where the remote organization has a hybrid deployment, there’s one thing you can do: create (additional) separate Organization Relationships to and from the remote organization’s Exchange Online tenant. This allows you to bypass the limitation we discussed earlier. However, in order to query those user’s Free/Busy information, you would also need to use their target addresses, e.g. michael@paradyne.mail.onmicrosoft.com instead of their regular email address (michael@paradyne.com).

This workaround is far from perfect, but it will at least allow you to query Free/Busy information for the remote organization’s Exchange Online users. The fact that you will need to know which users are hosted in Exchange Online and what their target addresses are is just something you’ll have to live with until Exchange gets updated with the code required to handle these ‘redirects’.

 

About the authors

Loryan Strant is the Founder of Paradyne and a high profile ambassador for Microsoft Office 365. Drawing on his technical know-how and business brains, Loryan makes a habit of solving business problems through the application of new technology.  The application of Office 365 for business is his latest success story.

Considered a thought-leader in his field, Loryan is regularly called upon by Microsoft and other leading organisations to deliver talks on best practice, cloud security and how to capitalise on cloud technology for excelled business performance. His expertise and plain-English explanations have been consolidated into a comprehensive guide for businesses that Loryan co-authored and subsequently published; “Microsoft Office 365: Exchange Online Implementation and Migration”.

A highly-respected contributor to industry magazines and blogs, Loryan’s technical wisdom is quoted regularly in Microsoft marketing materials and technical journals. In 2011, Loryan was awarded the Microsoft Office 365 MVP Status (Most Valuable Professional), recognising him as a force to be reckoned with, amongst technical communities around the world.  Follow Loryan on Twitter or check out his blog.

==========

Michael Van Horenbeeck is a Microsoft Certified Solutions Master and Exchange Server MVP from Belgium, specialized in Exchange, Office 365, Active Directory and a bit of Lync. Michael has been active in the industry for 12 years and developed a love for Exchange back in early 2000. He is a frequent blogger, member of the Belgian Unified Communications User Group ‘Pro-Exchange’ and also a regular contributor to The UC Architects podcast. In addition to that, he frequently speaks at various international conferences and writes articles for several tech websites. You can follow Michael on twitter (@mvanhorenbeeck) or through one of his blogs on michaelvh.wordpress.com and www.pro-exchange.be

Helping Victims of Typhoon Haiyan - with SharePoint MVP Jennifer Mason


On November 8, 2013, Typhoon Haiyan struck the Philippines, killing over 6,000 people and becoming the deadliest typhoon in Philippine history and one of the most powerful ever recorded. In the aftermath of the storm, SharePoint MVP Dux Raymond Sy formulated a plan for himself and fellow MVPs to assist in the disaster relief. SharePoint MVP Jennifer Mason was one of the first to respond to Dux's call for MVP involvement. We sat down with her to discuss the project and how you can help victims of Typhoon Haiyan.


MVPs, Disaster Relief and Tony Surma


Last month, we introduced you to Tony Surma, Chief Technical Officer of Microsoft's Disaster Response, on our website. We sat down with Tony to discuss MVPs' role in developing technology and code, and strengthening communities to better respond to disasters across the globe. For more information about Microsoft's disaster response, check out their website.


Friday Five - January 30, 2014

Identity in Your Own Apps with Windows Azure Active Directory - Part 2


Editor’s note: The following post was written by Office 365 MVP Martina Grom and Client Development MVP Toni Pohl. This is the second of a 4 part series. Read part 1 here.

Identity in your own apps with Windows Azure Active Directory

Part 2: What's happening in WAAD

In part 1 (link) we created an ASP.NET MVC app with authentication against an Office 365 domain. We didn't have to code anything in our website; Visual Studio 2013 handled all the steps for us, included the libraries and created the settings and code for the project. That's cool stuff.

Now let's have a look into Windows Azure Active Directory (WAAD). Can we do that? Yes, there are several ways; we'll show two scenarios, starting with the management possibilities in this article.

The easiest way is to use the Windows Azure portal. For Azure you need – of course – a valid Azure account. Depending on how you created the WAAD, you may already see this domain and be able to administer it directly in the Azure portal. If you didn't use Windows Azure but created a new Office 365 tenant, you need to add this domain to the portal. We did that, so here are the steps for adding the domain.

Log in to https://manage.windowsazure.com/ with your Azure account and go to the Active Directory module close to the bottom of the left menu. In the list of all ADs you see the already connected WAADs.

 

So, let's add our Office 365 domain here. Click Add in the bottom menu. In the dialog, choose Use existing directory from the dropdown – otherwise you can also create a new AD here. Check the "ready to be signed out" switch. It's a good idea to delete all cookies beforehand so that IE isn't confused by already existing cookies and identities.

 

Now Azure redirects to another login form. Choose Use another account here, then use your Office 365 admin account and password. In the next step, add your personal Azure account as a global administrator in the Office 365 AD. Click Next.

 

Azure prompts the user with a final message that you can now use the AD in your Azure portal with your Azure account. Click Logout now and use your existing Azure account to log in to Azure again.

When going back to the Active Directory list you see a new entry (in our case it´s AD number 9 with name “atwork”). Click on it to rename it and look into the details.

 

First we want to rename this AD in the Configure menu. Let's give this AD its real name, mvpdemo2014.onmicrosoft.com, to identify it easily. After that, click Save.

 

Now comes the interesting part. Look into Applications to see what happened here.

The Visual Studio Wizard created an app “MyPortal” with the app URL http://localhost:44324 in the WAAD. Open the app to see the app settings. Here´s the App URI and the address of the WS federation document as well as some more settings.

 

When clicking on Configure we can modify App name, URL, client ID and so on. If the app secret key is compromised, we can create a new one for 1 or 2 years. When saving the settings, a new key is generated. We can use this key to replace the old one in our app (web.config, key ida:Password).

 

So all keys we need in our app can be managed here in the Azure portal. They are stored directly in the WAAD.

We can also manage users and groups here. Change to the Users or Groups menu and add new objects to the AD. By default we have the user who created the AD – in our case the admin user admin@mvpdemo2014.onmicrosoft.com who created the Office 365 tenant – and the second admin user tp@atwork.at (a Microsoft Account which is the login for our Azure tenant), which we needed to add to access this WAAD from our Azure portal.

 

The portal offers all the standard functions, up to resetting a user's password. Simply add users and groups. It's the same procedure as in the Office 365 portal at https://portal.microsoftonline.com. You can choose whether the new user object is a brand-new one or whether to connect existing users, as in this screenshot.

 

The important information here is that if your app uses a user account for accessing WAAD (e.g. in the Visual Studio wizard when connecting a new project with Single Sign-On), this user has to be a global administrator – it cannot be a standard user. Only admins can access the GraphAPI (see part 3).

 

All changes are made instantly because they are immediately stored directly in the WAAD. You can also add custom domains in the Azure portal, for example mvpdemo2014.com, and define whether Directory Integration is allowed or not. So it's the same as in the Office 365 portal, with the same rules. Custom domains must be verified, and hybrid scenarios with ADFS or DirSync also work.

It's a relatively new feature since fall 2013 that you can add more than one AD to your Azure tenant. An interesting detail is that you cannot delete WAADs here in the Azure portal. It seems this function is still to come; otherwise it's going to be a little bit confusing in an Azure tenant with many ADs.

So we have a tool to manage our WAAD in the cloud: the Windows Azure Portal. As we mentioned at the beginning, there's a second way to view and manage our WAAD. This is covered in part 3.

 

About the authors

Toni Pohl has worked for nearly 20 years as an independent IT expert. Toni began his IT career in the early 1990s as a trainer. After working as a trainer, his emphasis changed to the field of consulting and developing software solutions using database systems. In 1999 Toni Pohl founded atwork information technology group along with Martina Grom. They share the business administration. He is responsible for new business development, technology, and quality assurance. Toni Pohl has been awarded in 2013 by Microsoft with the Most Valuable Professional Award (MVP) for his expertise in Client Development. Toni blogs about new technologies for the atwork-blog, in Microsoft-TechNet-Team-Austria Blog, in codefest.at, the Austrian MSDN Blog and in cloudusergroup and is writing technical articles for various magazines. You can connect with him on Twitter, Facebook, or Linkedin.


Martina Grom works as an IT consultant and is co-founder and CEO of atwork information technology. atwork is located in Austria/Europe and is specialized in the development of online solutions. Martina and Toni founded atwork in 1999, and she has worked in IT since 1995. Martina is recognized as an expert in Microsoft Office Online Services solutions and was one of the first 8 MVPs worldwide awarded in 2011 for her expertise in Office 365. She writes numerous articles and blog posts. Her passion is online and social media, cloud computing and Office 365. Martina consults companies on their way to the cloud. Martina has a master's degree in international business administration from the University of Vienna. You can connect with her on Twitter, Facebook, Linkedin or join her cloudusergroup.

 


Friday Five - February 7, 2014


Enable Cross-Premises Network Connectivity to Windows Azure with Windows Server 2012 R2


Editor’s note: In partnership with Microsoft Press, now celebrating their 30th year, MVPs have been contributing to an ongoing guest series on their official team blog. Today’s article, the 38th in the series, is from Enterprise Security MVP Richard Hicks.

Introduction

If you are an IT professional, no doubt you have a lab environment to perform development, pre-production testing, quality assurance (QA), and many other things I’m sure. With the introduction of Windows Azure Infrastructure-as-a-Service (IaaS), IT pros can now leverage the flexibility and near-limitless capacity of the public cloud to serve this purpose much more effectively. However, lab environments often require access to resources and data not easily migrated to the cloud. In addition, if your testing requires specialized hardware configurations (e.g. multiple network adapters, hardware security modules, physical HBAs, etc.) then moving your entire lab to the cloud won’t be possible. You can, however, extend your lab environment to include resources in Windows Azure to provide additional capacity as needed. In this article I’ll demonstrate how you can use the site-to-site VPN feature of the Routing and Remote Access (RRAS) role in Windows Server 2012 R2 to serve as a gateway to enable cross-premises network connectivity between your on-premises lab network and your Windows Azure virtual networks. 

Continue reading full article here

 

 About the author

Richard Hicks (MCP, MCSE, MCTS, and MCITP Enterprise Administrator) is a network and information security expert specializing in Microsoft technologies. As a five-time Microsoft MVP, he has traveled around the world speaking to network engineers, security administrators, and IT professionals about Microsoft security solutions. Richard has nearly 20 years of experience working in large scale corporate computing environments and has designed and deployed perimeter defense and secure remote access solutions for some of the largest companies in the world. Richard is currently the Director of Sales Engineering for Iron Networks, a Microsoft OEM partner developing secure remote access, network virtualization, and converged cloud infrastructure solutions. You can keep up with Richard at www.richardhicks.com and follow him on Twitter @richardhicks.


MVPs to Present at the Microsoft Embedded Conference in Naples


This Saturday, five MVPs will present at the Microsoft Embedded Conference in Naples, a free event focused on embedded Microsoft technologies. With session titles like "Is My Washing Machine Flirting with my Neighbor's Water Heater?" and "Windows 8 POS API, for Developing Point of Service Systems in a Simple and Quick Way," the event is sure to be packed full of entertainment, real world applications, and the deep technical knowledge that audiences have come to expect from MVPs. We'd like to wish MVPs Valter Minute, Mirco Vanini, Beppe Platania, Gianni Rosa Gallina and Marco Dal Pino "buona fortuna" on their presentations!


Is My Washing Machine Flirting with my Neighbor's Water Heater? 
By Windows Embedded MVP Valter Minute



Internet of Things: Smart Home & Smart Factory systems (Part 1 and 2)
By Windows Embedded MVP Mirco Vanini


Exploring Sense 3: An Example of "Intelligent System" for Showrooms, Trade Fairs and Conferences
By Windows Embedded MVPs Beppe Platania & Gianni Rosa Gallina


Windows 8 POS API, for Developing Point of Service Systems in a Simple and Quick Way.
By Windows Embedded MVP Marco Dal Pino


Friday Five - February 14, 2014

Identity in Your Own Apps with Windows Azure Active Directory - Part 3


Editor’s note: The following post was written by Office 365 MVP Martina Grom and Client Development MVP Toni Pohl. This is the third of a 4-part series. Read part 1 and part 2.

Identity in Your Own Apps with Windows Azure Active Directory

Part 3: See behind the scenes with GraphExplorer

In part 2 (link) we used the Windows Azure Portal to manage Windows Azure Active Directory (WAAD). An existing Office 365 AD can be connected to your Windows Azure account as a global administrator. From that point on, all objects in our AD, such as users, groups, apps and settings, can be managed in the Azure portal. Developers can use a second mechanism: the GraphAPI (http://bit.ly/1eNLLnG). This interface is used in code and delivers the same functionality as the Azure or Office 365 portal.

Before we look into the details of GraphAPI, we want to point out a very helpful tool made by the Windows Azure Active Directory product group: the GraphExplorer. You can find it online at https://graphexplorer.cloudapp.net/.

 

As you can see from its URL, GraphExplorer is itself a website running on Windows Azure. For exploring WAAD we can use a built-in demo company AD, or our own. So click Sign In and log in with your Office 365 domain administrator account. Guess what: GraphExplorer uses the same functionality as the web app we created in part 1, in combination with GraphAPI.

 

After the login, let's experiment with the already filled-out API call against our own AD. When we click the Get button, we get a JSON-formatted return for our call: the name and a collection of services for this WAAD. The request goes to https://graph.windows.net/<active directory domain>/. The API (a cloud service) reads the request and sends back the information for that method.

 

What a great tool! Again, without programming anything we can use an online tool to read the object data of any AD for which we have administrator credentials. Let's explore this a little.

/users delivers all users in our AD. With these service methods we get all the data stored in the user containers, much more than in the Azure or Office 365 portals: we can access all objects with all their public properties.

 

If we look at the user object for Martina that we created in the Azure portal in part 2, this is what we get: a user object from a collection of users, in JavaScript Object Notation (JSON) format:

{
    "odata.type": "Microsoft.WindowsAzure.ActiveDirectory.User",
    "objectType": "User",
    "objectId": "2d6d1e86-b93c-421a-a4c6-0032bc15e32a",
    "accountEnabled": true,
    "assignedLicenses": [],
    "assignedPlans": [],
    "city": null,
    "country": null,
    "department": null,
    "dirSyncEnabled": null,
    "displayName": "Martina Grom",
    "facsimileTelephoneNumber": null,
    "givenName": "Martina",
    "jobTitle": null,
    "lastDirSyncTime": null,
    "mail": null,
    "mailNickname": "martina",
    "mobile": null,
    "otherMails": [
        "mg@atwork.at"
    ],
    "passwordPolicies": "None",
    "passwordProfile": null,
    "physicalDeliveryOfficeName": null,
    "postalCode": null,
    "preferredLanguage": null,
    "provisionedPlans": [],
    "provisioningErrors": [],
    "proxyAddresses": [],
    "state": null,
    "streetAddress": null,
    "surname": "Grom",
    "telephoneNumber": null,
    "usageLocation": null,
    "userPrincipalName": "martina@mvpdemo2014.onmicrosoft.com"
},

 

The good news: this is exactly what we need to do in our own code when working against the GraphAPI: access the API with a specific request and read the returned data.
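As a rough illustration (not part of the original series), such a request could look like this in PowerShell, assuming you have already acquired an OAuth access token for the Graph API, for example with the app registered in part 1; the $accessToken variable and the api-version value are assumptions you would adapt to your own environment:

# $accessToken is assumed to hold a valid OAuth bearer token for https://graph.windows.net
$tenant  = "mvpdemo2014.onmicrosoft.com"
$headers = @{ "Authorization" = "Bearer $accessToken" }

# Same call as /users in GraphExplorer
$uri   = "https://graph.windows.net/$tenant/users?api-version=2013-04-05"
$users = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get

# Invoke-RestMethod converts the JSON result to objects; the user collection is in the 'value' property
$users.value | Select-Object displayName, userPrincipalName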

The same method works with groups. Go back in the browser and try the link for the /groups call. In our AD we have two groups: Redmond and Vienna. We also see that these groups are not synced with a local AD but live in the cloud, that they are not mail-enabled, and so on.

 

If you want to get the members of one specific group, go deeper with a call like this: use /groups/<object-Id of the group>/members, e.g. https://graph.windows.net/mvpdemo2014.onmicrosoft.com/groups/524785a0-b416-4f1b-b4b7-aa1c6a55bcaa/members.
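Using the same assumed $tenant, $headers and api-version as in the sketch above, the members request could look like this in PowerShell (the object ID is the one from our demo group):

# Members of one specific group, addressed by its objectId
$groupId = "524785a0-b416-4f1b-b4b7-aa1c6a55bcaa"
$uri     = "https://graph.windows.net/$tenant/groups/$groupId/members?api-version=2013-04-05"
$members = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get
$members.value | Select-Object displayName, userPrincipalName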

This is the complete output (the return value) for that call. We get one user object, in this case the user Martina.

 

In the Azure portal we also saw the app IDs. Of course, they can also be requested with GraphAPI. When opening the /applications link we see all apps, as in the portal before.

 

Does this look familiar to you? Sure: these are all the values we got in our web.config in part 1 when Visual Studio created our ASP.NET MVC app for us. Here is the counterpart in WAAD; all values are stored in the /applications node of our AD tree. In WAAD there is some additional metadata, including the validity period of the app secret.

Of course, the app secret is also stored here, but you can't retrieve it. It is only visible when creating a new key and is never shown in a result. You can only renew the secret, but then you have to update it in WAAD as well as in your app(s). It is also important to see that the app URI and the replyURLs are stored here. If you put your (web) app into production, the URL https://localhost:44324/ will of course change. Then you have to replace the replyURL with the new one (the easiest way is to do this in the Azure portal). As you can see, there can be more than one replyURL, but in our tests it never worked with more than one URL; the redirection always used the first entry. Maybe this behavior will change in the future, depending on the address the call comes from, which was in our opinion the idea behind this collection.

There is another node we want to mention: /servicePrincipals (SPNs). These object types can be used in applications too. The difference is that SPNs have more privileges than apps; see more here.

When using Office 365, the call to /subscribedSkus can be helpful. Here you get all assigned license plans for that tenant.

 

The service plans depend on your subscription. The Yammer_Enterprise service plan is always automatically included in the Office 365 E3 and E4 plans. When you use a 30-day test tenant, you get full access to all Office 365 features.

Our conclusion and recommendation is to have a look at the GraphExplorer online tool when working with WAAD. It is a very handy way to test requests against GraphAPI and see what results you get. For extending our own portal we'll use GraphAPI in part 4.

 

About the authors

Toni Pohl has worked for nearly 20 years as an independent IT expert. Toni began his IT career in the early 1990s as a trainer. After working as a trainer, his emphasis shifted to consulting and developing software solutions using database systems. In 1999 Toni founded the atwork information technology group together with Martina Grom; they share the business administration. He is responsible for new business development, technology, and quality assurance. In 2013 Toni was awarded the Microsoft Most Valuable Professional (MVP) Award for his expertise in Client Development. Toni blogs about new technologies on the atwork blog, the Microsoft TechNet Team Austria blog, codefest.at (the Austrian MSDN blog), and cloudusergroup, and writes technical articles for various magazines. You can connect with him on Twitter, Facebook, or LinkedIn.


Martina Grom works as an IT consultant and is co-founder and CEO of atwork information technology. atwork is located in Austria/Europe and specializes in the development of online solutions. Martina and Toni founded atwork in 1999, and she has worked in IT since 1995. Martina is recognized as an expert in Microsoft Office Online Services solutions and was one of the first eight MVPs worldwide awarded in 2011 for her expertise in Office 365. She writes numerous articles and blog posts. Her passion is online and social media, cloud computing, and Office 365. Martina consults companies on their way to the cloud. Martina has a master's degree in international business administration from the University of Vienna. You can connect with her on Twitter, Facebook, LinkedIn, or join her cloudusergroup.

 


Troubleshooting WCF Services during Runtime with WMI


Editor’s note: The following post was written by ASP.Net/IIS MVP Ido Flatow

Troubleshooting WCF Services during Runtime with WMI

One of the coolest features of WCF when it comes to troubleshooting is the WCF message logging and tracing feature. With message logging you can write all the messages your WCF service receives and returns to a log file. With tracing, you can log any trace message emitted by the WCF infrastructure, as well as traces emitted from your service code.

The issue with message logs and tracing is that you usually turn them off in production, or at the very least reduce the amount of data they output, mainly to conserve disk space and reduce the latency involved in writing the logs to disk. Usually this isn't a problem, until you find yourself needing to turn them back on, for example when you detect an issue with your service and need the log files to track down the origin of the problem.

Unfortunately, changing the configuration of your service requires restarting it, which might result in loss of data, your service becoming unavailable for a couple of seconds, and possibly the problem resolving itself on its own if the strange behavior was due to a faulty state of the service.

There is, however, a way to change the logging configuration of the service at runtime, without restarting it, with the help of the Windows Management Instrumentation (WMI) environment.

In short, WMI provides a way to view information about running services in your network. You can view a service's process information, service information, and endpoint configuration, and even change some of the service's configuration at runtime, without needing to restart the service.

Little has been written about the WMI support in WCF. The basics are documented on MSDN, including instructions on what you need to set in your configuration to make the WMI provider available. The MSDN article also provides a link to download the WMI Administrative Tools, which you can use to manage services with WMI. However, that tool requires some work on your end before getting you to the configuration you need to change, in addition to requiring you to run IE as an administrator with backwards compatibility set to IE 9, which makes the entire process a bit tedious. Instead, I found it easier to use PowerShell and write six lines of script which do the job.

The following steps demonstrate how to create a WCF service with minimal message logging and tracing configuration, start it, test it, and then use PowerShell with WMI to change the logging configuration in runtime.

  1. Open Visual Studio 2012 and create a new project using the WCF Service Application template.

After the project is created, the service code is shown. Notice that in the GetDataUsingDataContract method, an exception is thrown when the composite parameter is null.

2. In Solution Explorer, right-click the Web.config file and then click Edit WCF Configuration.

3. In the Service Configuration Editor window, click Diagnostics, and enable the WMI Provider, MessageLogging, and Tracing.

 

By default, enabling message logging will log all messages from the transport layer and any malformed messages. Enabling tracing will log all activities and any trace message with severity Warning and up (Warning, Error, and Critical). Although those settings are useful during development, in production we probably want to change them so that we get smaller log files with only the most relevant information.

4. Under MessageLogging, click the link next to Log Level, uncheck Transport Messages, and then click OK.

The above setting will only log malformed messages, which are messages that do not fit any of the service’s endpoints, and are therefore rejected by the service.

5. Under Tracing, click the link next to Trace Level, uncheck Activity Tracing, and then click OK.

The above setting will prevent every operation from being logged, except for those that output a trace message of severity Warning and up. You can read more about the different types of trace messages on MSDN: http://msdn.microsoft.com/en-us/library/ms733025(v=vs.110).aspx

By default, message logging only logs the headers of a message. To also log the body of a message, we need to change the message logging configuration. Unfortunately, we cannot change that setting in runtime with WMI, so we will set it now.

6. In the configuration tree, expand Diagnostics, click Message Logging, and set the LogEntireMessage property to True.

7. Press Ctrl+S to save the changes, close the configuration editor window, and return to Visual Studio.

The trace listener we are using buffers the output and will only write to the log files when the buffer is full. Since this is a demonstration, we would like to see the output immediately, and therefore we need to change this behavior.

8. In Solution Explorer, open the Web.config file, locate the <system.diagnostics> section, and place the following xml before the </system.diagnostics> closing tag: <trace autoflush="true"/>

Now let us run the service, test it, and check the created log files.

9. In Solution Explorer, click Service1.svc, and then press Ctrl+F5 to start the WCF Test Client without debugging.

10. In the WCF Test Client window, double-click the GetDataUsingDataContract node, and then click Invoke. Repeat this step 2-3 times.

Note: If a Security Warning dialog appears, click OK.

11. In the Request area, open the drop-down next to the composite parameter, and set it to (null).

12. Click Invoke and wait for the exception to show. Notice that the exception is general (“The server was unable to process the request due to an internal error.”) and does not provide any meaningful information about the true exception. Click Close to close the dialog.

Let us now check the content of the two log files. We should be able to see the traced exception, but the message wouldn’t have been logged.

13. Keep the WCF Test Client tool running and return to Visual Studio. Right-click the project and then click Open Folder in File Explorer.

14. In the File Explorer window, double-click the web_tracelog.svclog file. The file will open in the Service Trace Viewer tool.

15. In the Service Trace Viewer tool, click the 000000000000 activity in red, and then click the row starting with “Handling an exception”. In the Formatted tab, scroll down to view the exception information.

As you can see in the above screenshot, the trace file contains the entire exception information, including the message, and the stack trace.

Note: The configuration evaluation warning message which appears first on the list means that the service we are hosting does not have any specific configuration, and therefore is using a set of default endpoints. The two exceptions that follow are ones thrown by WCF after receiving two requests that did not match any endpoint. Those requests originated from the WCF Test Client tool when it attempted to locate the service’s metadata.

Next, we want to verify no message was logged for the above argument exception.

16. Return to the File Explorer window, select the web_messages.svclog file, and drag it to the Service Trace Viewer tool. Drop the file anywhere in the tool.

There are now two new rows for the malformed messages sent by the WCF Test Client metadata fetching. There is no logged message for the faulted service operation.

Imagine this is the state you now have in your production environment. You have a trace file that shows the service is experiencing problems, but you only see the exception. To properly debug such issues we need more information about the request itself, and any other information which might have been traced while processing the request.

To get all that information, we need to turn on activity tracing and include messages from the transport level in our logs.

If we open the Web.config file and change it manually, this would cause the Web application to restart, as discussed before. So instead, we will use WMI to change the configuration settings in runtime.

17. Keep the Service Trace Viewer tool open, and open a PowerShell window as an Administrator.

18. To get the WMI object for the service, type the following commands in the PowerShell window and press Enter:

$wmiService = Get-WmiObject Service -filter "Name='Service1'" -Namespace "root\servicemodel" -ComputerName localhost
$processId = $wmiService.ProcessId
$wmiAppDomain = Get-WmiObject AppDomainInfo -filter "ProcessId=$processId" -Namespace "root\servicemodel" -ComputerName localhost

Note: The above script assumes the name of the service is ‘Service1’. If you have changed the name of the service class, change the script and run it again. If you want to change the configuration of a remote service, replace the localhost value in the ComputerName parameter with your server name.
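Optionally, before changing anything, you may want to print the current values so you know what to restore later (a small check that is not part of the original steps; the property names are those exposed by the WCF WMI provider):

# Show the current tracing and message logging settings of the application domain
$wmiAppDomain | Select-Object Name, TraceLevel, LogMessagesAtTransportLevel, LogMalformedMessages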

19. To turn on transport layer message logging, type the following command and press Enter: $wmiAppDomain.LogMessagesAtTransportLevel = $true

20. To turn on activity tracing, type the following command and press Enter: $wmiAppDomain.TraceLevel = "Warning, ActivityTracing"

21. Lastly, to save the changes you made to the service configuration, type the following command and press Enter: $wmiAppDomain.Put()

22. Switch back to the WCF Test Client. In the Request area, open the drop-down next to the composite parameter, and set it to a new CompositeType object. Click Invoke 2-3 times to generate several successful calls to the service.

 

23. In the Request area, open the drop-down next to the composite parameter, and set it to (null).

24. Click Invoke and wait for the exception to show. Click Close to close the dialog.

25. Switch back to the Service Trace Viewer tool and press F5 to refresh the activities list.

You will notice that now there is a separate set of logs for each request handled by the service. You can read more on how to use the Service Trace Viewer tool to view traces and troubleshoot WCF services on MSDN. http://msdn.microsoft.com/en-us/library/aa751795(v=vs.110).aspx

 

26. From the activity list, select the last row in red that starts with “Process action”.

 

You will notice that now you can see the request message, the exception thrown in the service operation, and the response message, all in the same place. In addition, the set of traces is shown for each activity separately, making it easy to identify a specific request and its related traces.

27. On the right pane, select the first “Message Log Trace” row, click the Message tab, and observe the body of the message.

 

 

Now that we have the logged messages, we can select the request message and try to figure out the cause of the exception. As you can see, the composite parameter is empty (nil).

If this were a production environment, you would probably want to restore the message logging and tracing to their original settings at this point. To do this, return to the PowerShell window and re-run the commands from before with their previous values:

$wmiAppDomain.LogMessagesAtTransportLevel = $false

$wmiAppDomain.TraceLevel = "Warning"

$wmiAppDomain.Put()

 

Before we conclude, now that your service is manageable through WMI, you can use other commands to get information about the service and its components. For example, the following command will return the service endpoints’ information:
Get-WmiObject Endpoint -filter "ProcessId=$processId" -Namespace "root\servicemodel" -ComputerName localhost
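A couple of other queries against the same root\servicemodel namespace can also be handy. For example, the following sketch lists every WCF service currently activated on the machine and shows how to discover what else the provider exposes (class and property names are those of the WCF WMI provider; use Get-Member to confirm what is available on your system):

# All WCF services currently activated on this machine
Get-WmiObject Service -Namespace "root\servicemodel" -ComputerName localhost |
    Select-Object Name, ProcessId, DistinguishedName

# Discover the available properties of the application domain settings object
Get-WmiObject AppDomainInfo -Namespace "root\servicemodel" -ComputerName localhost | Get-Member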

 

 About the author


Ido is a senior architect and trainer at SELA Group, and an expert on Windows Azure and Web technologies such as WCF, ASP.NET, IIS, and Silverlight. Ido is a Microsoft ASP.NET/IIS MVP, a Microsoft Certified Trainer (MCT), and the co-author of Microsoft's official courses for WCF 4 (10263A) and Building Web Services for Windows Azure (20487B). Ido is also the co-author of the book Pro .NET Performance, and the manager of the Israeli Web Developers User Group.

You can follow Ido's work on his blog at http://blogs.microsoft.co.il/blogs/idof and on Twitter: @IdoFlatow

