Channel: The Microsoft MVP Award Program Blog

MVP Xbox Conference - Remember Your NDA


Xbox MVPs from around the globe arrived on the Microsoft campus today to kick off the 2014 MVP Xbox Conference.

For over 20 years, Microsoft teams have sat down with these technology and community leaders to give MVPs visibility into early-stage products and new releases, providing the information they need to look forward in their work with the community, and to gather valuable community feedback that helps make our products better.

Most MVPs have numerous communications with Microsoft product teams throughout their award year, but the hands-on experience and deep exchange of ideas at the Xbox Conference is something most members of the community—including Microsoft’s product teams—look forward to. To lay the groundwork for this relationship, all MVPs sign a non-disclosure agreement assuring they will not share any confidential information they may gain from Microsoft. Throughout the Xbox Conference MVPs hear the refrain, “Don’t forget your NDA!” since it’s essential to the dynamic exchange of ideas between the MVP community and Microsoft. In case MVPs need a reminder, Orrin the NDA Octopus has a message for them.


MVPs Help Make the 2014 System Center Universe a Success


Nearly 2,000 technical enthusiasts, industry insiders and first-time System Center adopters from around the globe participated in the 2014 System Center Universe (SCU) event.  SCU has deep roots in the community and brings together top tier presentations in System Center from Microsoft executives, MVPs and other experts.

With more than 30 simulcast locations and 10 hours of live-streamed content, MVPs played a crucial role in the development and success of the event.  SCU covered content ranging from Windows Server 2012 and Hyper-V to Windows Azure IaaS and Windows 8.  The popularity of the event is evident from its social media presence: #scu2014 was a trending topic on Twitter just three hours into the event.

The presentation videos from the event can be found here.  This is just the tip of the iceberg!  Prepare for more SCU:

SCU Asia March 13 http://www.systemcenteruniverse.asia/

SCU Europe September 17-19 http://www.systemcenteruniverse.ch/

SCU 2015 February 4, 2015 http://www.systemcenteruniverse.com/

 

 

Join us in congratulating the following MVP presenters!

Johan Arwidmark

 

Johan Arwidmark is the Chief Technical Architect with Knowledge Factory. He is a consultant, author and all-around geek specializing in Systems Management and Enterprise Windows Deployment Solutions. 

In addition to his consulting role, Johan presents training courses and speaks at several conferences each year, including MMS and TechEd around the world. He is also actively involved in communities like deploymentresearch.com and myitforum.com, and he has been a Microsoft Most Valuable Professional (MVP) for eight years.

Johan is known for an energetic and humorous style, tackling complex concepts using simple "Real World" scenarios and lots of live demos. His areas of expertise include: Enterprise Windows Deployment Tools and Solutions, ConfigMgr, MDT, WinPE, USMT, and WDS.

Kent Agerlund

 

Kent Agerlund works as a consultant, author, trainer and event speaker specializing in System Center solutions at Coretech. Kent frequently speaks at conferences like TechEd and MMS, and is the main author of the Mastering Configuration Manager 2012 training course and the books Mastering Configuration Manager 2012 and System Center 2012 Configuration Manager: Mastering the Fundamentals.

He is also actively involved in communities like the TechNet forums and local user groups, and has been awarded Microsoft Most Valuable Professional (MVP) for his work with Microsoft System Center Configuration Manager. On stage, Kent is known for mixing "Real World" scenarios into his presentations with lots of demos.

Cameron Fuller

 

Cameron Fuller, System Center MVP for Cloud and Datacenter Management, is a Principal Consultant for Catapult Systems and serves as their Corporate Practice Lead for System Center. He focuses on management solutions and has 20 years of infrastructure experience. Cameron coauthored Microsoft Operations Manager 2005 Unleashed (Sams, 2006), System Center Operations Manager 2007 Unleashed (Sams, 2008) and System Center Operations Manager 2007 R2 Unleashed (Sams, 2010), and was a contributor to System Center Configuration Manager 2007 Unleashed (Sams, 2009). Cameron has written for Windows IT Professional and TechNet magazines and blogs on System Center related topics. He has presented at numerous Microsoft conferences, including TechEd and MMS.

Maarten Goet

 

Maarten Goet has been a System Center Cloud and Datacenter MVP for six years and is a well-known speaker at leading industry events such as the Microsoft Management Summit, Microsoft TechEd in the US and Europe, and Microsoft TechDays in Europe and Asia. In his day job, Maarten helps enterprise customers envision and build their private clouds based on Microsoft Hyper-V and System Center. Over the last 12 months, Maarten has specialized in Windows Azure and helped customers deploy hybrid clouds. He contributed to various System Center books and was co-founder of systemcentercentral.com. He runs the Dutch System Center User Group and founded authoringfriday.com to help build a community around authoring System Center packs.

Jason Sandys

 

Jason Sandys, ConfigMgr MVP, is currently a Technology Evangelist and Principal Consultant for Catapult Systems Inc. He has nearly 20 years of experience in a wide range of technologies, environments and industries, with extensive knowledge of implementing and supporting all things SMS and Configuration Manager, beginning with SMS 2.0. Jason is also active in the online support community, is a co-author of System Center 2012 Configuration Manager Unleashed (Sams, 2012), was a contributing author to System Center Configuration Manager 2007 Unleashed (Sams, 2009), and is a frequent presenter at Microsoft TechEd, MMS, and various nationwide user groups.

Pete Zerger

Pete is a consultant, author, speaker and Microsoft MVP focusing on System Center management, private cloud and data center automation solutions. He is a frequent speaker at Microsoft conferences and has written articles for a variety of technical magazines, including Microsoft TechNet.

He was a contributing author on several books, including Opalis 6.3 Unleashed and the upcoming System Center Orchestrator 2012 Unleashed. He also contributed to Operations Manager 2012 Unleashed and the PowerShell 2.0 Bible.

Pete is also the co-founder of SystemCenterCentral.com, a popular web community providing information, news and support for System Center Technologies. In 2008, Pete founded the System Center Virtual User Group, a group dedicated to sharing System Center knowledge with users around the world.

 

Friday Five - February 28, 2014

Exchange 2013 SP1 -- Way to go Windows 2012 R2


Editor’s note: The following post was written by Exchange Server MVP Prabhat Nigam

Exchange 2013 SP1 – way to go Windows 2012 R2

On February 25, 2014, Microsoft released Exchange 2013 Service Pack 1. This was a big day for Microsoft, as the release of Exchange 2013 Service Pack 1 made it possible to use Windows 2012 R2 for Exchange 2013. Microsoft also released Exchange 2010 Service Pack 3 Update Rollup 5 and Exchange 2007 Service Pack 3 Update Rollup 13, so that Exchange 2010 and 2007 can be installed on Windows 2012 R2.

Current Exchange 2013 users can upgrade to Exchange 2013 SP1, but an in-place upgrade of Windows 2012 to Windows 2012 R2 is not supported, so those servers can't move to Windows 2012 R2.

Windows 2012 R2 has some great new features, which can be reviewed here.

The first feature that will make you smile is that the Start button is back, with many functions available through a right-click on it, which makes things easier.

Hyper-V has improved and is considered among the best virtualization software available today. Smart Paging and online disk resizing are two of its great features.

Online disk resizing: We can now change the disk size even while a VM is in use in production. This takes three simple steps: edit the disk in Hyper-V Manager, change the size of the .vhdx file, and extend the volume. Previously we were required to shut down the VM. This is an awesome change that will simplify things for system administrators.
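The three steps above can also be sketched in PowerShell. This is a minimal, hypothetical example assuming the Hyper-V PowerShell module on Windows Server 2012 R2; the path and target size are invented for illustration, and an online resize requires the .vhdx to be attached to the VM's SCSI controller.

```powershell
# Grow the data disk to 200 GB while the VM keeps running.
# The .vhdx must be on a SCSI controller for the resize to happen online.
Resize-VHD -Path 'D:\VMs\FS01\Data.vhdx' -SizeBytes 200GB
```

After the resize, extend the volume inside the guest (via Disk Management or the Resize-Partition cmdlet) so the operating system can use the new space.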

Smart Paging: With dynamic memory allocation, the following three settings allow your servers to start even when there is not enough memory to run them.

  • Minimum RAM – the minimum amount of RAM that a VM requires in order to work.

  • Startup RAM – the amount of RAM required during boot. This may well be higher than the minimum, and increasing it helps ensure a VM boots faster.

  • Maximum RAM – the maximum amount of RAM a VM is allowed to consume.
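As a rough illustration, the three values above can be set with the Hyper-V module's Set-VM cmdlet. This is a hedged sketch: the VM name 'APP01' and the sizes are invented for the example.

```powershell
# Enable dynamic memory and set the three thresholds described above.
Set-VM -Name 'APP01' -DynamicMemory `
       -MemoryMinimumBytes 1GB `
       -MemoryStartupBytes 2GB `
       -MemoryMaximumBytes 8GB
```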

 

I have highlighted a few new features of Hyper-V on my blog here.

The Windows Server 2012 R2 forest functional level is only supported with Exchange 2013 SP1 and Exchange 2010 SP3 RU5, so Exchange 2007 customers can't use the Windows Server 2012 R2 forest functional level.

Exchange 2013 Service Pack 1 is also Cumulative Update 4 and performs a schema update just like previous cumulative updates, so be prepared for the schema update. Accordingly, the next Exchange 2013 update will be Cumulative Update 5.

Exchange 2013 SP 1 installation steps are described here.

 

The release of Exchange 2013 SP1 brings many new features. Some of them are mentioned below:

 1. Windows Server 2012 R2 is now a supported operating system for Exchange 2013 when Exchange 2013 SP1 is used, which will be helpful to those planning for Exchange 2013 SP1. We can also use Windows 2012 R2 domain controllers and forests. Check the Exchange supportability matrix here.

 2. The Edge Transport role is back with Exchange 2013 SP1. Edge Transport has been used for two purposes: spam filtering and SMTP relay. So if you used Edge Transport for messaging security, you should plan to move to the Exchange 2013 Edge Transport role.

 3. Secure/Multipurpose Internet Mail Extensions (S/MIME) for message signing and encryption is back. Anyone holding out on migration because of this feature can now start the migration.

 4. MAPI over HTTP: Messaging Application Programming Interface (MAPI) over HTTP is a new protocol for Outlook connectivity. It improves connection reliability and stability by moving the transport layer to the HTTP model; if the Outlook connection breaks, the session reconnects faster. You need to run Exchange 2013 SP1 and Outlook 2013 SP1 to use this feature, and it is disabled by default. If you don't want to use it, you can continue with Outlook Anywhere, or enable both so that clients not yet running Outlook 2013 SP1 can still connect using Outlook Anywhere. This is an organization-wide setting that is enabled for the whole organization, not per server. To enable the protocol, run the command below:

Set-OrganizationConfig -MapiHttpEnabled $true
At the same time, we might need to check the registry key mentioned in this article.

5. Data Loss Prevention policy tips will now show up in Outlook Web App on desktop and mobile as well.

                    
6. Document Fingerprinting: Document Fingerprinting is a Data Loss Prevention (DLP) feature that converts a standard form into a sensitive information type, which you can use to define transport rules and DLP policies.
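As a rough sketch of how Document Fingerprinting looks in the Exchange Management Shell: the standard form is read as bytes, turned into a fingerprint, and wrapped in a data classification that DLP policies and transport rules can reference. The file path and names here are invented for illustration.

```powershell
# Read the standard form and create a fingerprint from it.
$form = Get-Content 'C:\Forms\PatentTemplate.docx' -Encoding Byte -ReadCount 0
$fingerprint = New-Fingerprint -FileData $form -Description 'Patent template'

# Publish it as a sensitive information type for DLP policies and transport rules.
New-DataClassification -Name 'Patent Documents' -Fingerprints $fingerprint `
    -Description 'Matches documents derived from the patent template'
```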

 7. Secure Sockets Layer (SSL) offloading is now supported. With this feature we can use a hardware load balancer for SSL offloading rather than offloading on the Client Access server, which consumes server resources.

 8. Exchange 2013 SP1 supports hybrid deployments with multiple Active Directory forests, i.e. deployments with Exchange servers in different Active Directory forests.

 9. Exchange Admin Center cmdlet logging: In the Exchange Admin Center, up to the last 500 executed commands will be captured for later review if we enable the cmdlet logging feature of SP1. Logging stops when the window is closed.

 10. Database Availability Group without an administrative access point:

  • This is one of the best features: it allows a database availability group (DAG) to run without a cluster IP, cluster name and cluster name object (CNO). This means:
    • No DAG IP
    • No DAG cluster name
    • No Cluster Name Object (CNO)
    • No DNS entry
  • The IP address property is required, so it is filled with 255.255.255.255.
  • Exchange now manages cluster information itself and uses the configuration from the configuration container in Active Directory rather than the local cluster database.
  • We can still create a traditional DAG. Transitioning from a traditional DAG to a DAG without an administrative access point is not supported; the only path is to create a new DAG and move the mailboxes.
  • This uses the Windows Server 2012 R2 Active Directory-detached cluster feature mentioned here, where None is specified in place of a DNS entry.
  • In a multi-datacenter expanded DAG we don't need to assign any DAG IP. This not only saves IPs but also reduces effort and the false positives in SCOM when the second datacenter's IP is not online.
  • We can only manage this DAG from the Exchange Management Shell or the Exchange Admin Center. The new DAG can't be managed from Failover Cluster Manager because it has no cluster name object.
  • We need to use Windows Server 2012 R2 with Exchange 2013 Service Pack 1 to create a Database Availability Group without an administrative access point.
  • The network team does not need to worry about a second-datacenter IP that will not be responding, and we don't need to look for a free IP address in every datacenter.
  • The following command creates the new DAG in the Exchange Management Shell:
  • The following command will create the new DAG in Exchange management shell:

New-DatabaseAvailabilityGroup -Name DAGName -DatabaseAvailabilityGroupIPAddresses ([System.Net.IPAddress]::None) -WitnessServer WitnessServerName -WitnessDirectory "PathOfWitnessDirectory"

 

                         Or

  • In the Exchange Admin Center we can create the DAG and specify 255.255.255.255 in place of the IP.

 

With all this added functionality Exchange Administrators should consider Exchange 2013 SP1 on Windows 2012 R2 to take full advantage of these great new features!

About the author

Prabhat has an MBA in Information Technology and works as a Microsoft Architect, where he helps design, implement and manage solutions for private messaging clouds, mergers, collaboration between different messaging products, and other migration and deployment projects. He also manages presales for Exchange and Directory Services for his company. Recently he was involved in an EOP migration for a customer with more than 250,000 users, and he has worked on many private cloud and Exchange migration projects. Prabhat has worked for many IT giants, where he has led global teams for Exchange and Active Directory. He began his career as a Technical Consultant for Exchange 5.5 with Microsoft PSS, and his love for Exchange never stopped, continuing through 2000/2003/2007/2010/2013. In 2013, Microsoft awarded him the Most Valuable Professional (MVP) Award for his expertise in Exchange Server. He manages the website and blog MSExchangeGuru.com (Learn Exchange The Guru Way) and the LinkedIn groups "Microsoft Exchange Server" and "Microsoft Exchange Server 2013". Prabhat often responds to queries on the TechNet forums, LinkedIn and his blog. He can be contacted via prabhat@msexchangeguru.com or on Twitter, LinkedIn and YouTube.

 About MVP Monday

The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.

Friday Five - March 7, 2014


Adding SQL Server Express 2012 Advanced Services to Existing Development Environment


Editor’s note: The following post was written by Visual C# MVP Ming Man Chan

Adding SQL Server Express 2012 Advanced Services to Existing Development Environment

This article consists of three subsections:

  • Get the SQL Express 2012 Advanced Services installation file
  • Install SQL Express 2012 Advanced Services
  • Configure SQL Express 2012 Advanced Services

Setting up a complete development environment on a laptop or desktop is not an easy task for many developers.

Let’s assume that I have a laptop with Windows 8.1 and Microsoft Office 2013 installed. I believe this is the configuration most end users have. Now, a .NET developer using C# or VB.NET will very likely do the following if their application requires a connection to a Microsoft SQL database.

1. Install Visual Studio 2013.

2. Download SQL Express 2012 from the Internet.

Over a period of time you might want to use reporting services for your application's reports. This is the time to look at SQL Server Express 2012 Reporting Services.

Get the SQL Express 2012 Advanced Services installation file

You will not easily find the download if you search for SQL Server Express 2012 Reporting Services or SQL Server Express 2012 Reporting Server in Bing or other search engines. The reason is that in the Express edition it is not called SQL Server Express 2012 Reporting Services; instead, Reporting Services is included in SQL Server Express 2012 Advanced Services.

Please do not get the wrong impression that you can use SQL Server Analysis Services (SSAS) and SQL Server Integration Services (SSIS) with SQL Server Express 2012 Advanced Services. SQL Server Express 2012 Advanced Services comes with SQL Server Reporting Services only, but it does allow you to develop SSAS and SSIS solutions if you have a remote server running SQL Server Analysis Services and SQL Server Integration Services. For more information:

To download SQL Server Express Advanced Services go to http://www.microsoft.com/en-my/download/details.aspx?id=35579

Most PCs and servers today are 64-bit. Download SQLEXPRADV_x64_ENU.exe if your operating system is 64-bit, or SQLEXPRADV_x32_ENU.exe if it is 32-bit. Please bear in mind that a 64-bit machine can still run a 32-bit operating system.
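If you are unsure which download applies, the installed OS architecture can be checked quickly from PowerShell; this one-liner is a small convenience sketch, not part of the SQL Server setup itself.

```powershell
# Reports the architecture (such as "64-bit" or "32-bit") of the installed OS.
(Get-WmiObject Win32_OperatingSystem).OSArchitecture
```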

 

For a detailed edition comparison, see: http://msdn.microsoft.com/en-us/library/cc645993.aspx

 

Install SQL Express 2012 Advanced Services

1. Double click on SQLEXPRADV_x64_ENU.exe.

2. Click New SQL Server stand-alone installation or add features to an existing installation.

 

If your physical or virtual computer is not connected to the Internet, you will get the error "SQL Server Setup could not search for updates through the Windows Update service."

3. You can ignore the error and click Next >.

From my experience, if you are installing the SQL Server engine and you ignore the update error, the SQL Server installation may fail. Somehow this is fine with Express 2012 Advanced Services.

 

 

4. Click Select All to select all the features.

5. Click Next >.

 

 

6. Select Add features to an existing instance of SQL Server 2012.

7. Click Next >.

 

8. Set the Startup Type to Automatic if you are not familiar with starting a Windows service manually.

As mentioned above, there is no SSAS or SSIS to install.

9. Click Next >.

 

10. Only one option is available, Install only. Click Next >.

 

 

SQL Server Express 2012 Advanced Services is now fully installed.

 

Configure SQL Express 2012 Advanced Services

To configure SQL Server Reporting Services run Reporting Services Configuration Manager.

1. When Reporting Services Configuration Manager starts, connect to your SSRS instance. Click Connect.

 

 

You will see the configuration page. Configuring SSRS from top to bottom is advisable.

2. Click on Service Account.

 

 

You can leave it with the default account ReportServer$EXPRESS.

3. Click on Web Service URL.

 

4. To use the Web Service URL, click Apply.

 

 

 

 

 

 

 

 

The Virtual Directory for Web Service will now be created.

5. Click on Database.

 

6. There is no Create Database option. Click Change Database.

 

 

7. Select Create a new report server database.

8. Click Next.

 

 

9. You can change the database server or keep the default database server. Click Next.

 

 

 

10. You can keep the default database name ReportServer or change to the database name that you want to use. Click Next.

 

 

 

11. Use the default Authentication Type. Click Next.

 

 

12. Once you have verified all the settings, click Next.

 

 

 

The database configuration portion is now done.

13. Click Finish.

 

Next, configure the Report Manager URL.

14. Click on Report Manager URL.

15. Click Apply.

 

At this point your report server will be working fine. The rest of the configuration is optional.

 

 

 

 

About the author

Ming Man has been a Microsoft MVP since 2006. He is a software development manager for a multinational company. With 25 years of experience in the IT field, he has developed systems using Clipper, COBOL, VB5, VB6, VB.NET, Java and C#. He has been using Visual Studio (.NET) since the beta back in 2000. He and his team have developed many projects on the .NET platform, such as SCM and HR applications. He is familiar with the N-tier design of business applications and also has database experience with MS SQL, Oracle and AS/400. Additionally, you can read Ming's blog, Channingham's blog.


Xbox MVPs

MVP Developed Coloris App Shines in Windows Store

After attending a Windows 8 UX Design Camp in Japan, MVPs Yutaka Tsumori, Yu Mitsuba and Akira Hatsune discovered a need for customizable color options when creating Windows Store apps.  The MVP group wanted to increase the color functionality of backgrounds and buttons for developers in the Windows Store.
"I felt that color was a crucial element for well-balanced store app design," said Visual Basic MVP Akira Hatsune. "And there were few applications that deployed visually nice colors."
 
Enter Coloris, a Windows Store app that enables users to more easily adjust in-app coloring and design.  Hatsune, who is no stranger to the Windows Store, having developed 10 previous apps, collaborated with Client Development MVP Yu Mitsuba and Windows Embedded MVP Yutaka Tsumori to deliver Coloris to app creators.
 

At the November 2013 MVP Summit, Hatsune, Tsumori and Mitsuba took home the top prize during the MVP Showcase, beating 41 other MVP entries.  "MVPs visited us and said, 'Yes! This is exactly what I needed!'" said Hatsune.  To see more from the MVP Showcase, click here.

When asked about the MVP Award community and the advantages of being an MVP, Hatsune said, "For me, the Microsoft MVP community is a place where we can communicate with all the geeks around the world.  Being an MVP means having a strong pipeline with Microsoft developers and other excellent MVPs."

To download or discover more about Coloris, visit the Windows Store.  


9 Countries, 24 Cities, 17 MVP Presenters - MVP ComCamp


PowerPoint MVP Yubin Cha demonstrates the functionality of PowerPoint

Nine countries, twenty-four cities and hundreds of community leaders, technology enthusiasts and MVPs will participate in the Asia Pacific MVP Community Camp starting March 17. 

Over two-dozen MVPs will present during a week-long series of events.  It all kicks off on Monday, March 17, 2014, with the first ever rendezvous of technical communities from across the region.

"MVPs have rich experience and knowledge, we're technophiles!" says ASP.NET/IIS MVP Demo Fan.  "That’s why we joined the MVP ComCamp, it covers all the hot topics so community members can learn the best use of technologies directly from MVPs."

MVP Community Camp (#MVPComCamp) is a six-day event with live webcasts and in-person seminars. MVPs from nine countries will deliver free, public webcasts providing expert-level insight into technology trends and techniques.  The in-person seminars will be available in 24 cities across nine countries with tracks providing opportunities to connect with others in the same expertise and interest.

Topics range from Microsoft Cloud OS to the latest devices & services. Throughout the week, MVPs and other technical community experts will present more than 200 webcasts and in-person seminars focusing on themes such as application development, enterprise management and SMB solutions.

 "As a speaker it is an opportunity to share my knowledge and real world experience in deploying the technologies with the attendees," says Hyper-V MVP Alessandro Cardoso.  

Here are all the dates, details and information you need to know!


Live Webcasts
 

March 17 - March 21, 2014

A total of 17 live webcasts featuring MVP speakers will be broadcast during ComCamp Week. These global webcasts cover a wide range of topics, from Cloud OS to application development.

Session Schedule

Start times shown in the table below are UTC (Coordinated Universal Time).  You may use the Time Zone Converter to find the time based on your location.

1 Mumbai, New Delhi (UTC+5:30)

2 Jakarta, Jakarta Special Capital Region, Indonesia (UTC +7:00)

3 Kuala Lumpur, Singapore, Perth, Hong Kong (UTC+8)

4 Adelaide, Darwin (UTC+9:30)

5 Canberra, Melbourne, Sydney, Brisbane, Hobart (UTC+10)

6 New Zealand (UTC+12)

7 Seattle, Redmond. (UTC-8)

 

Day 1: Opening Day - Our Cloud OS vision

March 17, 2014

Session

Speakers

Register

9:00 am (UTC)

Windows Azure Websites: In Depth

Session Language: English

Mahesh Krishnan (Australia)

John Azariah (Australia)

Microsoft MVPs for Windows Azure

 

11:00 am (UTC)

New Enterprise Platform – Microsoft Cloud OS 

Session Language: Chinese

Jinliang Gao (China)

Microsoft MVP for System Center Cloud and Datacenter Management

 

12:00 pm (UTC)

Microsoft Cloud OS New Feature Sharing 

Session Language: Chinese

Hao Hu (China)

Microsoft MVP for System Center Cloud and Datacenter Management

 

1:00 pm (UTC)

Developing Open Source Software with Free Microsoft Development Tools only

Session Language: Korean

Jeonghyun Nam (Korea)

Microsoft MVP for Windows Azure

 

2:00 pm (UTC)

Researching and operating with Big Data using Windows Azure 

Session Language: Korean

Hongju Jung (Korea)

Microsoft MVP for Windows Azure

 

Day 2: Developers Day - Application Development

March 18, 2014

Session

Speakers

Register

11:00 am (UTC)

Advanced C# session for Store app Developers 

Session Language: Japanese

Akira Kawamata (Japan)

Microsoft MVP for Visual C#

 

12:00 pm (UTC)

Learn how to use enhanced features of Windows 8.1 to your advantage in developing Windows Store apps

Session Language: English

Akira Hatsune (Japan)

Microsoft MVP for Visual Basic

 

12:00 pm (UTC)

Windows Phone App development 

Session Language: Chinese

Ouch Liu (Taiwan)

Microsoft MVP for Client App Dev

 

1:00 pm (UTC)

Monetize your apps through the Windows Store 

Session Language: Chinese

Bill Chung (Taiwan)

Microsoft MVP for Visual C#

 

Day 3: Office Servers Day - How Microsoft Office Servers enhance our daily work

March 19, 2014

Session

Speakers

Register

6:30 am (UTC)

The Best Use of SharePoint 2013 Social Features on Enterprise Level

Session Language: English

Destin Joy (India)

Microsoft MVP for SharePoint Server

 

11:30 am (UTC)

Exchange 2013: Transport and Mail Delivery

Session Language: English

Prabhat Nigam (India)

Microsoft MVP for Exchange Server

 

Day 4: SMB Day - Running Small and Mid-business with Microsoft technologies

March 20, 2014

Session

Speakers

Register

7:00 am (UTC)

Scheduling successful time critical Projects

Session Language: English

Rod Gill (New Zealand)

Microsoft MVP for  Project

 

8:00 am (UTC)

Virtualization with Hyper-V and Windows Server 2012 R2 Essentials for Small Business

Session Language: English

Boon Tee (Australia)

Microsoft MVP for  Windows Server for Small and Medium Business

 

9:00 am (UTC)

Office 365 powering collaboration and visibility in a supply chain

Session Language: English

Ed Richard (Australia)

Microsoft MVP for  Visio

 

10:00 am (UTC)

How Can Dynamics CRM Help You?

Session Language: English

Leon Tribe (Australia)

Microsoft MVP for  Dynamics CRM

 

Day 5: All about Visual Studio 2013

March 21, 2014

Session

Speakers

Register

12:00 pm (UTC)

VS 2013 overview session (what’s new)

Session Language: English

Ming Man Chan (Malaysia)

Microsoft MVP for  Visual C#

 

1:00 pm (UTC)

VS 2013 application life cycle management

Session Language: English

Jeffery Tay (Singapore)

Microsoft MVP for  ASP.NET/IIS

 

In-Person Seminars  - March 22, 2014 

Day 6: Each seminar will include keynotes, breakout sessions, a lunch/coffee break and a post-event activity, and we will have a live streaming feed.

Approximately 200 technical sessions will be conducted throughout Asia Pacific. Local MVPs and other technical community leaders will host in-person seminars in 24 different cities across seven countries.  For English-speaking countries, please see more details below.  For others, please visit the individual registration sites for more event information. Don’t miss the opportunity to connect with peers who share your technical interests!

Australia

China (In simplified Chinese)

India

Japan (In Japanese)

Korea (In Korean)

New Zealand

Taiwan (In traditional Chinese)

 

Day 6

China

Korea

Japan

Taiwan

India

Australia

New Zealand

Beijing

Seoul

Busan

Daegu

Daejeon

Jeonju

 

Tokyo

Sapporo

Tohoku

Nagoya

Osaka

Hokuriku

Hiroshima

Okinawa

Taipei

Bangalore

Hyderabad

Sydney

Melbourne

Brisbane

Adelaide

Perth

Auckland

Friday Five - March 14, 2014


Identity in Your Own Apps with Windows Azure Active Directory - Part 4


Editor’s note: The following post was written by Office 365 MVP Martina Grom and Client Development MVP Toni Pohl. This is the conclusion of the 4 part series. Read part 1, part 2 and part 3

Identity in Your Own Apps with Windows Azure Active Directory

Part 4: Using GraphAPI

GraphExplorer from part 3 (link) is very helpful for testing requests against the Graph API (http://bit.ly/1eNLLnG) and seeing what results you get. If we want to use information from WAAD and offer user and group management in our own app, we need to use the Graph API of Windows Azure Active Directory (WAAD). Before this web API was available, WAAD could only be administered with PowerShell, which is great for automated management and scripts but not for use in apps – the overhead is too big. So we're happy that the Graph API has been available since around spring 2013. Let's extend the sample project we created in part 1 (link).

To understand how requests to the GraphAPI are made, first have a look at the method that reads information about the logged-in user. The code is in the Home controller of the project, in \Controllers\HomeController.cs.

 

Navigate to public async Task<ActionResult> UserProfile(). This is what happens when the user clicks the username. Here we show only the relevant parts of this class.

        private const string TenantIdClaimType = "http://schemas.microsoft.com/identity/claims/tenantid";

        private const string LoginUrl = "https://login.windows.net/{0}";

        private const string GraphUrl = "https://graph.windows.net";

        private const string GraphUserUrl = "https://graph.windows.net/{0}/users/{1}?api-version=2013-04-05";

        private static readonly string AppPrincipalId = ConfigurationManager.AppSettings["ida:ClientID"];

        private static readonly string AppKey = ConfigurationManager.AppSettings["ida:Password"];

 

            // …a little bit more code…

 

        [Authorize]

        public async Task<ActionResult> UserProfile()

        {

            string tenantId = ClaimsPrincipal.Current.FindFirst(TenantIdClaimType).Value;

 

            // Get a token for calling the Windows Azure Active Directory Graph

            AuthenticationContext authContext = new AuthenticationContext(String.Format(CultureInfo.InvariantCulture, LoginUrl, tenantId));

            ClientCredential credential = new ClientCredential(AppPrincipalId, AppKey);

            AuthenticationResult assertionCredential = authContext.AcquireToken(GraphUrl, credential);

            string authHeader = assertionCredential.CreateAuthorizationHeader();

            string requestUrl = String.Format(

                CultureInfo.InvariantCulture,

                GraphUserUrl,

                HttpUtility.UrlEncode(tenantId),

                HttpUtility.UrlEncode(User.Identity.Name));

 

            HttpClient client = new HttpClient();

            HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, requestUrl);

            request.Headers.TryAddWithoutValidation("Authorization", authHeader);

            HttpResponseMessage response = await client.SendAsync(request);

            string responseString = await response.Content.ReadAsStringAsync();

            UserProfile profile = JsonConvert.DeserializeObject<UserProfile>(responseString);

 

            return View(profile);

        }

First the app creates a new AuthenticationContext with the login URL and the tenant ID, https://login.windows.net/{0}. A new ClientCredential with the app ID and app secret is created. Then an AuthenticationResult is obtained for the GraphAPI URL https://graph.windows.net (which we saw in part 3 when using GraphExplorer) and the credentials of our app. With that, a new HTTP GET request is formed with the GraphAPI URL for the user we want to get: https://graph.windows.net/{0}/users/{1}?api-version=2013-04-05 . The tenant ID and username are inserted. The version of the GraphAPI is also passed so that we are sure to get the expected result (2013-04-05 is the last official version used by Visual Studio). Now the request is sent. VS also created a model for our user object in \Models\HomeViewModels.cs which we can use for storing the user data. It has only a few properties: DisplayName, GivenName and Surname.
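Under the hood, AcquireToken and the Graph call above are plain HTTPS requests. As a rough sketch (in Python for brevity; the tenant, client ID, secret and username below are placeholders, not real values), the client-credentials token request and the Graph user URL are formed like this:

```python
# Hedged sketch of the raw OAuth2 client-credentials exchange behind
# AcquireToken, and of the Graph user URL used by the sample app.
from urllib.parse import quote, urlencode

LOGIN_URL = "https://login.windows.net/{0}/oauth2/token"
GRAPH_USER_URL = "https://graph.windows.net/{0}/users/{1}?api-version=2013-04-05"

def token_request(tenant_id, client_id, client_secret):
    """Return (url, form_body) for the client-credentials token request."""
    url = LOGIN_URL.format(tenant_id)
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://graph.windows.net",
    })
    return url, body

def graph_user_url(tenant_id, user_principal_name):
    """Return the Graph URL for a single user, matching GraphUserUrl above."""
    return GRAPH_USER_URL.format(quote(tenant_id), quote(user_principal_name))
```

The token returned by the POST is then sent as the Authorization header of the GET, which is exactly what CreateAuthorizationHeader and TryAddWithoutValidation do in the C# code.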

 

The result of our HTTP request is JSON, as seen before in part 3. This data has to be deserialized into our UserProfile, which happens in the line UserProfile profile = JsonConvert.DeserializeObject<UserProfile>(responseString); .
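The mapping from that JSON into the model can be sketched like this (a hypothetical sample payload; the real Graph response contains many more properties, and – as with JsonConvert.DeserializeObject – anything the model does not declare is simply ignored):

```python
import json

# Hypothetical Graph-style JSON payload (placeholder values).
response_string = json.dumps({
    "displayName": "Martina Grom",
    "givenName": "Martina",
    "surname": "Grom",
    "city": "Vienna",            # ignored: not declared on the model below
})

class UserProfile:
    # Only the properties the article's model declares at this point.
    FIELDS = ("displayName", "givenName", "surname")

    def __init__(self, data):
        # Pick out declared fields; silently drop everything else.
        for f in self.FIELDS:
            setattr(self, f, data.get(f))

profile = UserProfile(json.loads(response_string))
```

Adding City and ObjectId to the model, as done below, is all it takes for those extra JSON properties to show up too.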

The final return delivers the data to the view in \Views\Home\Userprofile.cshtml, where the UserProfile is visualized.

 

To show how simple it is to extend these objects, change the UserProfile model in \Models\HomeViewModels.cs. We add two properties, City and ObjectId.

namespace MyPortal.Models

{

    public class UserProfile

    {

        public string DisplayName { get; set; }

        public string GivenName { get; set; }

        public string Surname { get; set; }

        public string City { get; set; }

        public string ObjectId { get; set; }

    }

}

And of course we add these properties in \Views\Home\Userprofile.cshtml. Because a <table> isn't really necessary for our key/value pairs, we simply replace the whole HTML <table> with this <div> block:

<div class="span12">

    <p>DisplayName: @Model.DisplayName</p>

    <p>GivenName: @Model.GivenName</p>

    <p>SurName: @Model.Surname</p>

    <p>City: @Model.City</p>

    <p>ObjectId: @Model.ObjectId</p>

</div>

Run the app with F5, log in and click the username. You will see the extended user object with City and ObjectId – it's working.

 

Going further, we could also code the other HTTP requests GraphExplorer shows us. But doing this – and creating models for all objects, including error handling – is a lot of work. So let's take the simple route and use a project from MSDN: download the MVC Sample App for Windows Azure Active Directory Graph project from http://bit.ly/19mOwhY .

 

Unzip the package and open it in Visual Studio. Modify web.config and replace the first three keys with the values from your own ASP.NET MVC sample app (part 1).

<appSettings>

  <add key="TenantDomainName" value="mvpdemo2014.onmicrosoft.com" />

  <add key="AppPrincipalId" value="f72129f4-8f1c-4fe5-a13b-16449032a4bf" />

  <add key="Password" value="MwFIu4…" />

 

Now run the app with F5. You should see the app start page. Click the link User Management. This sample app enables user and group management with the GraphAPI. We can see all users in our WAAD and can display, edit and change their properties.

 

There are methods for the Create, Update and Delete (CRUD) operations.

 

The best way to see how it works is to run the app and analyze the code-behind to understand how the requests are made and handled. This sample works and can be included in your own project. The key element is the “DirectoryService” for accessing the GraphAPI – see the code in UserController.cs:

private DirectoryDataService directoryService;

public DirectoryDataService DirectoryService

{

    get

    {

        if (directoryService == null)

        {

            directoryService = MVCGraphServiceHelper.CreateDirectoryDataService(this.HttpContext.Session);

        }

        return directoryService;

    }

}

To add this functionality we need to integrate the GraphHelper solution. We can get the whole GraphHelper solution from http://bit.ly/1auDTs3. The project is small and contains the complete library.

 

After unzipping, we can add the GraphHelper project to our own solution, add a reference to it, and we're almost ready to go.

 

And add the reference to your project:

 

Try to compile the solution; it should build without errors.

The next steps are to copy and adapt the desired functions (controllers, models and views) from MVCDirectoryGraphSample. Create a directory Helpers, copy the two class files from the sample into your project and adapt all namespaces to yours (e.g. MyPortal). Repeat these steps until your own project compiles and runs again. This is the copy, paste and modify part. At the end you should have your own WAAD portal.

It's a good idea to use the MVCDirectoryGraphSample project, since it simplifies the whole access. If you look, for example, into DirectoryServiceReference.cs, you will see what we mean: there is a lot of code and schema handling for accessing the GraphAPI.

 

Using these functions is then simple. To create a new user, we create a new user object, add it with AddTousers(user) to the directory service and call the SaveChanges() method.

user.userPrincipalName = string.Format(CultureInfo.InvariantCulture, "{0}@{1}", emailAlias, selectedDomain);

user.mailNickname = emailAlias;

DirectoryService.AddTousers(user);

DirectoryService.SaveChanges();

return RedirectToAction("Index");

To update a user, the user object is looked up and its properties are updated in a helper function, CopyPropertyValuesFromViewObject(user, refreshedUser);. Then UpdateObject(refreshedUser) and SaveChanges() are called.

User refreshedUser = DirectoryService.users.Where(it => (it.objectId == user.objectId)).SingleOrDefault();

refreshedUser.userPrincipalName = string.Format(CultureInfo.InvariantCulture, "{0}@{1}", emailAlias, selectedDomain);

refreshedUser.mailNickname = emailAlias;

CopyPropertyValuesFromViewObject(user, refreshedUser);

DirectoryService.UpdateObject(refreshedUser);

DirectoryService.SaveChanges(SaveChangesOptions.PatchOnUpdate);

These are short code blocks and easy to use within our MVC app.
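At the HTTP level, the create and update flows above correspond to a POST and a PATCH against the Graph endpoint. A hedged sketch of the request shapes (URLs and bodies only; the tenant, objectId and property names are placeholders reflecting our understanding of the 2013-04-05 Graph schema):

```python
import json
from urllib.parse import quote

GRAPH = "https://graph.windows.net"
API = "api-version=2013-04-05"

def create_user_request(tenant, upn, mail_nickname, display_name, password):
    """POST request for creating a user with a minimal set of properties."""
    url = f"{GRAPH}/{quote(tenant)}/users?{API}"
    body = json.dumps({
        "accountEnabled": True,
        "userPrincipalName": upn,
        "mailNickname": mail_nickname,
        "displayName": display_name,
        "passwordProfile": {"password": password,
                            "forceChangePasswordNextLogin": True},
    })
    return "POST", url, body

def update_user_request(tenant, object_id, changed_properties):
    """PATCH only the changed properties (the PatchOnUpdate behaviour)."""
    url = f"{GRAPH}/{quote(tenant)}/users/{quote(object_id)}?{API}"
    return "PATCH", url, json.dumps(changed_properties)
```

SaveChanges(SaveChangesOptions.PatchOnUpdate) in the sample corresponds to the PATCH case: only the modified properties travel over the wire, not the whole user object.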

We encourage you to download the sample app and experiment with the GraphAPI. In our opinion, authorization is the next “big thing” for delivering enterprise apps with a Single Sign-On mechanism. We hope our small series conveys the idea of using Windows Azure Active Directory in your own (enterprise) apps and that you find it useful!

 

About the authors

Toni Pohl has worked for nearly 20 years as an independent IT expert. Toni began his IT career in the early 1990s as a trainer. After working as a trainer, his emphasis changed to the field of consulting and developing software solutions using database systems. In 1999 Toni Pohl founded atwork information technology group along with Martina Grom. They share the business administration. He is responsible for new business development, technology, and quality assurance. Toni Pohl has been awarded in 2013 by Microsoft with the Most Valuable Professional Award (MVP) for his expertise in Client Development. Toni blogs about new technologies for the atwork-blog, in Microsoft-TechNet-Team-Austria Blog, in codefest.at, the Austrian MSDN Blog and in cloudusergroup and is writing technical articles for various magazines. You can connect with him on Twitter, Facebook, or Linkedin.



Martina Grom works as an IT consultant and is co-founder and CEO of atwork information technology. atwork is located in Austria/Europe and is specialized in the development of online solutions. Martina and Toni founded atwork in 1999, and she has worked in IT since 1995. Martina is recognized as an expert in Microsoft Office Online Services solutions and was one of the first eight MVPs worldwide awarded in 2011 for her expertise in Office 365. She writes numerous articles and blog posts. Her passion is online and social media, cloud computing and Office 365. Martina consults companies on their way to the cloud. Martina has a master's degree in international business administration from the University of Vienna. You can connect with her on Twitter, Facebook, Linkedin or join her cloudusergroup

About MVP Monday

The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.

Side Loading Deployment of Windows Store Apps in Enterprises – Step by Step


Editor’s note: The following post was written by Microsoft Integration MVP Damir Dobric

Side Loading Deployment of Windows Store Apps in Enterprises – Step by Step

If you want to deploy Windows Store apps in enterprises, Windows 8 provides a concept called “side loading”, which is described in this article.
Basically, when preparing side loading, you have to build the package, sign it and deploy it. Please note, there are several ways to side-load an app.
Right now, deployment can be done by executing PowerShell scripts, by using System Center Configuration Manager, or by distributing the app on the Windows image itself.

There are many articles which describe all of these procedures, but surprisingly all of them target deployment with a developer license key. That means you will typically run Visual Studio, create the package and finally run the auto-generated PowerShell script.

All this looks intuitive and very easy. Basically, you don't have to do anything; Visual Studio takes care of everything. Period.

Unfortunately, the deployment described there is only good for trying out side loading. For this purpose a so-called code-signing key is generated, which cannot be used in a real production environment: it runs under a developer license. A developer license created by Visual Studio is usually valid for 30 days; if it is created directly through the store, it is valid for 90 days.

In this article I will describe all the steps required for a production environment.

To deploy your application in an enterprise, you need to follow these steps:

1. Prepare your machine
2. Create an enterprise signing certificate
3. Enroll the certificate
4. Deploy the app

1. Prepare your machine

Before you deploy the app, you need to do some preparation on the machine where the app is to be installed. This procedure is exactly the same as when deploying with a self-signed key for testing purposes.
Side loading works in general on Windows Server 2012 and Windows 8 Enterprise editions if the following is satisfied:

a) The machine is domain joined.
b) The group policy is set to Allow trusted apps to install.

To setup group policy open Group Policy editor setting located at
Local Computer Policy | Computer Configuration | Administrative Templates | Windows Components | App Package Deployment. 
Navigate to Allow all trusted apps to install and enable it.

This can also be done by setting the following registry key:

HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows\Appx\AllowAllTrustedApps = 1
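For scripted preparation of many machines, the same value can be set programmatically. A minimal sketch using Python's Windows-only winreg module (assumes it is run elevated on the target machine; the function name is our own):

```python
# Hypothetical sketch: set the "Allow all trusted apps to install" policy
# value from code. winreg exists only on Windows; run with admin rights.
import winreg

KEY_PATH = r"Software\Policies\Microsoft\Windows\Appx"

def allow_trusted_apps():
    # Create (or open) the policy key under HKLM and set the DWORD value to 1.
    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        winreg.SetValueEx(key, "AllowAllTrustedApps", 0, winreg.REG_DWORD, 1)

if __name__ == "__main__":
    allow_trusted_apps()
```

In a domain, the Group Policy setting described above is of course the cleaner way to roll this out.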

For side loading on Windows 8 Pro, Windows RT, or Windows 8 Enterprise, you need to

a) Activate the product key.
b) Set the group policy to Allow trusted apps to install.

As you can see, side loading on Windows 8 Pro and Windows RT works only if the product key is activated.
Don't be surprised: side loading on Windows 8 is enabled either by domain joining or by key activation.

2. Create Enterprise Code Signing Key

One of the most important steps – in fact the major prerequisite for side loading – is creating the code-signing key. Unfortunately this step is the most difficult one and it is not well documented (almost not at all).
One possible solution would be to buy a code-signing key. Unfortunately, the certificate with a signing key for Windows Store apps is slightly different from a common code-signing certificate. More about this later in this article.
Most articles which reference side loading describe how to create a self-signed key. In fact this key is created automatically by Visual Studio.

After a lot of trying I found a way to make all this happen. It is not very easy and it involves multiple steps. I'm sorry for this – please forgive me, it was not my idea to design it this way :)
To create the code-signing key (and the certificate which will envelope the key), you have to have a corporate Certificate Authority (CA) installed. This article does not cover the installation of a CA.
Once it is installed, there is an old, very old ASP web application which is used to request certificates. The following pictures show several steps in the certificate enrollment process.

 

 

 

 

 

Go to Create and Submit certificate:

 

Unfortunately this application has a few issues. First, it does not provide a code-signing template by default, which you will need to deploy the app. Second, it does not support Windows Server 2008 Enterprise certificate templates.

It can only deal with Windows Server 2003 Enterprise certificate templates.
But never mind – I decided to try the 2003 certificate templates. The enrollment process worked fine, but unfortunately the app verification during side loading failed over and over again.
For this reason I concluded that this application is not useful for issuing code-signing certificates for Windows Store apps. If somebody knows how to make this work with this application, I would love to see comments here.

But don't worry, there is a better way to complete this story successfully.
Open MMC and the snap-in called “Certificate Templates” as shown in the picture below.

 

Click the node shown in the picture on the right and the list of installed templates will appear.

 

 

Next, find the Code Signing certificate template in the list of templates.

Unfortunately, the Code Signing certificate template which you see in the list in the picture below is not useful for Windows Store apps. This template is probably what other certificate authorities use when you purchase a certificate from a third party.

In other words, if you use the default code-signing template, side loading will fail.

The reason is that a certificate for Windows Store apps requires a few more attributes than the default code-signing template provides.

To fix this issue, right-click the template and choose “Duplicate” to create a copy of the existing template. As you might guess, we will extend the existing template.


 

In the dialog on the right, you need to select Windows Server 2008 Enterprise as shown.
This is exactly the step which is not supported by the web application shown above.
Remember, in the web application we had to select a Win2003 template.

In the next step, you have to make a few changes in the template you created from the default one. Please be careful and perform exactly the steps described.

 

 

 
If you make a mistake while making these changes, it will be very difficult to figure out why side loading does not work. But don't worry – you cannot damage anything in the CA, because you are working with a copy of the default template.

Next, select the Subject Name tab and click “Supply in the request”. This forces the developer to provide the subject name of the certificate when the certificate is added to the Visual Studio solution.

 

The subject supplied in the last step becomes the Common Name attribute, also known as CN, which must match the publisher of the app.

 

As you might know, the publisher can (and should) be associated with the store where the app will usually be uploaded. This is shown in the picture on the right.

The Common Name (Subject) must equal the Publisher Name shown in the package manifest.

To make them the same, you will have to enter the publisher name as the subject in the certificate request (explained below).

 


The next picture shows where the publisher name will appear in the certificate.

 

Note that you don't have to associate your app with a store account to make side loading happen, but it is good practice to do so even if you don't want to publish the application through the store.
It is worth doing because the publisher will then be set to your publicly known organization name.
If you do not associate the app with a store account, the package will contain your user name as the publisher, as shown in the next picture.

 
Such a certificate would not look very professional when doing roll-outs in big enterprises, which is why it is better to set up an organizational account. Even if you use a store account
and associate your app with an organizational store account, the side loading process remains the same – no additional steps or tweaks have to be done.

Next, you need to allow the private key to be exported. If you don't, you will not be able to export the PFX (the code-signing certificate) with the private key.
That means you will not be able to sign the package.

 

 

 

In the next step you should give the template a descriptive name.

Later, after the certificate is created, the template name will appear in the certificate as shown in the picture below on the right.

 

 

Optionally you can set the expiration date; the default is one year.


Now you have to enable the Basic Constraints extension. This is what most code-signing certificates do not include (at least at the time of writing this article).
If you purchase such a certificate from a third party, make sure this extension is included.
To add this extension, just click the check-box.

 

 

 

After applying, the template will appear in the list of all templates.

 

Next, refresh the list and select your new template.

 

Finally, select it and choose “Enable”.

 

This is all you need to do to create the certificate template. Now we can finally start with enrollment.

3. Certificate Enrollment


Next, you have to enroll the certificate for your app. To do that, open the certificate management tool (MMC with the user certificates snap-in) on any machine joined to the domain.

 

 

In the MMC, select “Request New Certificate”.

 

On the following dialog click “Next”.

 

 

Next, expand the list of available templates:

 

Select the certificate template to be used for your new certificate.


The yellow warning is not an issue. Remember, we chose “Supply in the request” for the Subject Name; this is what the message is trying to explain. Ignore it and proceed with “Enroll”.

   

Next, you can select a friendly name for your certificate.

 

The certificate Friendly Name and Description are attributes which become a fixed part of your certificate after it is enrolled:

 

As an almost-final step, you have to enter the Subject Name of the certificate. This should be the name copied from the app package, as described above.

 

You will probably not believe me, but this is almost all you need to do for enrollment. Sorry :(


The last step is to export the certificate with the private key. Select your certificate and click “Export”, then select “Yes, export the private key”.
This is important, because without the private key the certificate can only be used to verify a signature, not to create one.

  

 


Then select the PFX format and enter a protection password. This should not be a weak password.

  

 


When you later select the certificate in the app, you will have to enter this password.
Whew – that's all for enrollment.

4. Deploy the App

The last required step is to deploy the app. This step is exactly as it should be: simple. Visual Studio does all the magic.
Open the package manifest (Package.appxmanifest), click the “Choose Certificate” button and browse for the certificate file which you exported in the last enrollment step.
Note that after you select the certificate, the password has to be entered, which unlocks the key.

 

 

Once you have entered the password, you will not be asked for it anymore.
You can simply continue with development and deployment. Right-click the app project, select Store and then “Create App Packages”.

 

 

In the next dialog choose “No”, which means you are not going to publish the app in the store; you just want to create the signed package.


Now select the destination folder of the package and the architecture. If you don't know how to choose the proper architecture, select “Neutral”; it will work on all architectures.

 

The package is now created in the previously specified folder.

 

Click the URL and notice the following files:

 

If you execute (open) Add-AppDevPackage.ps1, the app will be installed.

Recap

That's it. I hope this article helps you understand the side-loading deployment process, which is obviously not easy. In my opinion this process can and must be improved and simplified.
Deploying an app must not be more complex than developing it. Right now I can imagine a lot of simple apps, made with App Studio or similar tools, which are easier to build than to deploy. Of course, you don't have to perform all the described steps every time; deployment itself is an easy process.

On the other hand, the store concept is a great example of simplicity. Unfortunately, deploying an app through the store requires integrating your enterprise processes into the Microsoft Store deployment process. In the real world this does not work, because every enterprise must be the process owner of its own deployment process, without dependency on any other organization.

Finally, many of the steps described in this article do not have to be performed by every developer, or by developers at all. Once you have the certificate template and the certificate, deployment is very easy.

 About the author


Damir Dobric is co-founder, managing director and lead architect of DAENET Corporation, which is Microsoft long term Gold Certified Partner and leading technology integrator specialized in Microsoft technologies with strong focus on Windows, .NET and Web. He is with DAENET winner of worldwide Technology Innovation Award for year 2005, German innovation award for SOA in 2008 and Microsoft Partner of the year 2010.

Working with  Microsoft on helping customers to adopt new technologies, he has well over 25 years of experience as a developer, architect, speaker and author. His focus is on practical implementation of custom solutions, which target integration of devices and services.  Damir is Windows Azure VTSP, Docent for Software Engineering on University of applied sciences in Frankfurt am Main with focus on Distributed and Cloud Computing and member of Connected Technology Advisory and Windows Azure Insiders Groups. Visit his blog or follow him on Twitter.


Global Windows Azure Bootcamp to Reach Thousands


Saturday, March 29th, MVPs and Windows Azure enthusiasts around the world will kick off the 2014 Global Windows Azure Bootcamp.  In just one day, over 137 cities in 57 countries will host thousands of technology enthusiasts looking to augment their Windows Azure knowledge.  From first-time users to full-time experts, all are welcome, and each bootcamp will use the Windows Azure Training Kit, which can be downloaded for free, as the road map for its session.

New to Windows Azure? No need to worry, MVPs and other experts will be on hand to answer any questions you may have.  You can also join the conversation via Twitter using #GWAB. 

In addition to the amazing technical knowledge being shared globally, organizers are taking this unique opportunity to aid in Diabetes research.  

The world-wide event will host a globally distributed lab in which attendees of the event will deploy virtual machines in Windows Azure. This will help analyze data needed for Diabetes research. The purpose is to discover how the body’s serum protein glycosylation works. Researchers want to know how high blood sugar levels present in diabetes patients affect the complex sugar production systems required for the body's health and ability to fight disease. They hope to prove the theory that when small changes in this process start occurring, the disease can progress and lead to Type 2 diabetes. The results from this work will not only help scientists understand the human diabetic state at the molecular level but also lead the way for early detection of diabetes. You can read more about this effort on the #GWAB website

As if this wasn't enough, MVP organizers will run a shared workload in Windows Azure in the hopes of setting a world record for being the biggest community event running the largest shared workload in a public cloud at the same time. 

It’s a global community, but each event is executed on the local level.  You can check out the location and sign up by visiting the Global Windows Azure Bootcamp website.  

Each location may have a slightly different agenda, so make sure to check with your local organizer to see what software prerequisites you should have ready. These may be listed on the registration page for each event as well. In almost all cases you’ll want to bring a laptop if you can (don’t forget your power cord!) so that you can participate in the hands on labs. 

Many different languages are supported on Windows Azure. If your event is focused mostly on .NET then having the following installs ready on your laptop before you arrive would be helpful.

The training kit also has labs that can be performed on a Mac with the cross-platform command line tools.

Don't forget to join the conversation on Twitter!  Use #GWAB to share the experience with the community across the globe!

MVPs Host Global Cloud OS Roadshows


 

The MVP Cloud OS Roadshow continues to inspire IT Professionals and Developers to explore the Cloud OS vision!

What started in the UK has grown to encompass more than 20 countries including Central and Eastern Europe, Germany, Spain, the Netherlands and North America. Each MVP led event delivers a series of real-world scenarios that demonstrate how to integrate the Cloud OS into the constantly changing landscape of IT Professionals and Developers. 

MVPs continue to share their Cloud OS expertise and real world experience, helping businesses to think differently about their Cloud solutions.

"Cloud OS is a new adventure for everyone and as an MVP it’s really important to be able to share the vision with our fellow technical experts,” said System Center MVP Simon Skinner.

 Check out what's happening in each of these regions below!

-      United Kingdom and Ireland

-      Spain

-      Germany

-      The Netherlands

-      United States

-      Central and Eastern Europe

-      Canada

United Kingdom and Ireland

MVPs in the UK and Ireland hit the ground running and continue to lead the way in delivering Cloud OS content to their community. The first MVP Cloud OS community week was hosted in London, where 18 UK and Irish MVPs engaged with 450 IT Professionals and Developers in the region.

Shortly after the successful week in London, MVPs and the Cloud OS Community User Group produced the MVP Cloud OS Relay, hosting events in 9 different locations across the UK.

Cloud OS events in the UK and Ireland have been so popular that MVPs are bringing the roadshow to the community in the comfort of their own home.  On March 31, MVPs will launch a week of online sessions dedicated to educating the community about Cloud OS technology and integration.  Topics include:

  • Transform the Datacenter
  • Empower People Centric IT

Online sessions will be hosted throughout the week, and participating MVPs will host a live Q&A to ensure that all of your questions are answered. You can sign up for these sessions here

 

Canada

The Canadian Cloud OS MVP Roadshow is coming off extremely successful events in Calgary, Vancouver and Montreal, and two more events are planned for April in Toronto and Ottawa. Nine Canadian MVPs have delivered content in the five-city tour, which is expected to reach over 350 IT Professionals and Developers.

 As one attendee put it: “I didn’t realize the capabilities of Windows Azure! This opens up options.”

Tremendous planning and execution from the following Canadian MVPs:

Spain

MVPs hosted events in Valladolid, Valencia and Murcia. The format of the sessions offered attendees a high level of technical content, along with hands-on demonstrations. 150 IT Professionals and Developers attended the events and more are planned.

Central and Eastern Europe

MVP Damir Dizdarevic travelled to Sarajevo and Belgrade to speak to nearly 200 attendees about our Cloud OS vision. Damir said, “So far this year I’ve delivered three Cloud OS Roadshow events, mostly focused on Windows Server 2012 R2 and Windows Azure. Local community members in both Bosnia and Serbia showed huge interest in these events, so we organized repeat events to host all the people who wanted to come. Even now, dozens of people are on the waiting lists, so I sincerely hope I will be able to organize and deliver more Cloud OS events during the year. These events are a great idea for providing people with up-to-date information on Microsoft technologies and strategies.”

Germany

In Germany, 8 MVPs took to the stage at one of the biggest IT exhibitions (CeBIT) to deliver a number of outstanding sessions talking about our Cloud Solutions.  This activity was preceded by an ‘Insight into any data’ community meeting including 10 MVPs talking to 350 IT Professionals and Developers about SQL Server. 

An attendee at the event commented, “I think it’s important [to have this kind of event] as IT gets more and more complicated. The ‘community’ character of the Roadshow is the most beneficial aspect to us participants, as in this way we get to know each other and talk about the problems that each of us is having with the products: they are similar, but not identical.”


See some of the German MVPs in action

United States

With a slew of events in North America, MVPs have been building Cloud OS roadshows from the ground up.  We had a chance to catch up with Directory Services MVPs Sean Deuby and Brian Desmond, here’s what they had to say about the roadshows.

(Please visit the site to view this video)

Netherlands

Six MVPs worked diligently to provide answers, examples and enthusiasm during the Netherlands Cloud OS Roadshow. Topics of discussion ranged from “Extending Your Datacenter with Virtualization and Networking” to “Why Cloud Matters for Modern Business Applications.”

“I really value the knowledge that was shared by the MVPs….security and service delivery are really issues for my company,” commented one Cloud OS Roadshow attendee. 

More Cloud OS Roadshows are being planned across the globe.  MVPs were key in identifying the need for this type of community event. "For most IT professionals, relying on a single product discipline is becoming a thing of the past,” said SQL Server MVP Tony Rogerson.  “The economic reality over the past few years has meant more needs to be done by fewer people – never before has so much been done by so few for so many! The concept of a job role with SQL Server as a single discipline is becoming outdated; it’s all about the Data Platform forming a polyglot across various technologies, both database and product."

To see the global impact of the Cloud OS Roadshow community events, check out this article on the MVP Award website.

All upcoming MVP Cloud OS Roadshow events are listed below:

February
February 10-12 – SQL Server Konferenz 2014, Darmstadt, Germany
February 18 – Sarajevo, Bosnia-Herzegovina
February 25 – Calgary, Canada
February 28 – Valencia, Spain
February – Valladolid, Spain
February – Belgrade, Serbia
February – Sarajevo, Bosnia-Herzegovina

March
March 3 – Munich, Germany
March 6 – Vancouver, Canada
March 7 – Amsterdam, The Netherlands
March 7 – Cologne, Germany
March 7 – Copenhagen, Denmark
March 10-15 – CeBIT, Hannover, Germany
March 14 – Murcia, Spain
March 17 – Gothenburg, Sweden
March 17 – Sarajevo, Bosnia-Herzegovina
March 17-21 – Cloud OS Online Week (UK/Ireland)
March 18 – Nigeria
March 18 – Kenya
March 18 – Copenhagen, Denmark (2nd event)
March 20 – Montreal, Canada
March 24 – Aarhus, Denmark
March 25 – Côte d’Ivoire
March 25 – Angola
March 28 – Rome, Italy

April
April 2 – Helsinki, Finland
April 2 – Ottawa, Canada
April 3 – Milan, Italy
April 16 – Warsaw, Poland
April 16 – Toronto, Canada
April 22 – Madrid, Spain
April 28 – Edinburgh, UK
April 29 – Sunderland, UK
April 30 – Reading, UK
April TBD – Oslo, Norway

May
May 1 – Birmingham, UK
May 2 – London, UK
May 3 – Istanbul, Turkey
May 4 – Bradford/Leeds, UK

 

If you have any questions, please contact us on Twitter @MVPAward

Work Management service in SharePoint 2013: A Short Overview for Developers


 Editor’s note: The following post was written by SharePoint MVP Adis Jugo

Work Management service in SharePoint 2013: a short overview for developers

We all know the dilemma of having our tasks scattered all over the place: some are in Exchange (Outlook), some are in SharePoint task lists, and some are in Project Server. Taking care of all those tasks can be a challenge. That’s why Microsoft introduced a new service application in SharePoint Server 2013 and SharePoint Online, called the “Work Management Service Application” (WMA).

This service is colloquially known as the “task aggregation service” and, for end users, it is manifested through the Tasks menu in their personal sites. Users can find all their SharePoint, Project and Exchange tasks (if opted in), aggregated and sorted, under a unified “My Tasks” experience. They can sort their tasks, “pin” them (mark them as important), and mark tasks as completed - directly in the aggregated task view. Special views of “Active tasks”, “Important and upcoming tasks” and “Completed tasks” are accessible within a click. All task updates are synced back to their originating systems – the task providers.

 

 

Providers and refreshes

Under the hood, the whole system is based on the model of “Task providers”. Task providers are systems from which the WMA can aggregate tasks:

 

At the moment, three task providers are supported: Microsoft SharePoint, Microsoft Exchange and Microsoft Project Server. This means that users will see tasks from those three sources in their “My Tasks” area. Adding your own task providers is not supported at the moment. The providers are actually very different underlying systems, and there is no easy, “unique” way to do the aggregation and synchronization. Microsoft was planning to include an official API and methodology for adding your own providers, but it never made its way into the production bits, and it seems to have slipped off the agenda. A pity, since there is great demand for it – that is actually always the first question asked after users learn of the existence of this service application.

 

WMA initiates the task sync with providers through so-called “provider refreshes”. In WMA lingo, a refresh is an initiation of the task aggregation (sync) process. For SharePoint and Microsoft Project, provider refreshes are triggered on demand. That means there is no background process refreshing the tasks for users at regular time intervals. A refresh is triggered at the very moment a user accesses her My Tasks section. A small spinning circle indicates that a refresh is in progress. After the refresh is completed, tasks are “reloaded”, and the user gets to see her tasks. There is an artificial limit of one refresh every five minutes, so as not to jeopardize overall SharePoint server performance. This limit can be changed with PowerShell commands for SharePoint 2013 (there is no GUI for Work Management Application management). For SharePoint Online in Office 365, the limit of one refresh per five minutes cannot be changed.
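As a sketch of how that limit can be changed on-premises: to the best of my knowledge this is done with the Work Management service application cmdlets from the SharePoint 2013 Management Shell, but verify the cmdlet and parameter names in your own farm before relying on this:

```powershell
# Find the Work Management service application and lower the provider
# refresh throttle from the default 5 minutes to 1 minute (on-premises only).
$wma = Get-SPServiceApplication | Where-Object { $_.TypeName -like "Work Management*" }

Set-SPWorkManagementServiceApplication -Identity $wma `
    -MinimumTimeBetweenProviderRefreshes (New-TimeSpan -Minutes 1)
```

Lowering the throttle too far can put real load on the farm, since every visit to “My Tasks” can then trigger a full aggregation pass.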

Provider refresh for Exchange tasks is somewhat different, and it is based on a background timer job (“Work Management Synchronize with Exchange”) which actually does the sync. The job is initially set to run every minute.

Tasks are internally cached in a hidden list in the user’s My Site. That means that all the tasks which a provider refresh discovers, and eventually updates, are stored locally in the WmaAggregatorList_User list in the My Site.

Locations

Providers look in so-called “Locations” to get updates for existing tasks and to discover new tasks. Locations are different and specific to each provider. For SharePoint 2013 as a task provider, for example, locations are task lists in different SharePoint sites.

How does a provider know where to look to find the task updates for the current user? There is a collection of so-called “hints”, which tells the provider where to go and look. In the case of SharePoint, there is an event receiver on each tasks list (the new “Tasks with timeline”, content type 171) which, whenever a task for a user is added, stores that list in the hints collection. During the provider refresh, the SharePoint provider looks for tasks in that list and, if any are found, caches both the tasks and the list. In that process, the task list becomes a known “location”, and the user’s tasks that belong to that list are displayed under that “location” in My Tasks.

There are also tasks without a location, so-called “personal tasks”. The “My Tasks” page inside the user’s My Site is meant to be a place where users can organize their personal tasks, which do not belong to any of the task lists in SharePoint sites or Projects. Those tasks are stored only in the “WmaAggregatorList_User” list and, if Exchange sync is configured, they are pushed to Exchange and are therefore accessible in Outlook, on mobile devices, etc.

All tasks without a location can be “promoted” to location tasks, which means they can be “pushed” to a SharePoint list or to Project Server. Once such a task gets a location and is promoted, its location cannot be changed anymore (tasks cannot be moved across locations).

Now, this was just an introduction to the WMA concepts and lingo. For anyone who wants to know more about how WMA works under the hood, there is a great white paper written by Marc Boyer from Microsoft, which covers all installation and configuration aspects of the Work Management Service (bit.ly/1ioO2Zz).

Let’s now see what can be done with WMA from a developer’s perspective.

What is in it for developers?

Even if there isn’t much developer documentation out there, WMA has one of the best and cleanest APIs in the SharePoint world, with equally good SSOM and CSOM implementations. Now, this is something that SharePoint developers have been asking all along – that CSOM APIs get more power – and Microsoft is delivering on its promise. A drawback here is that there are no REST endpoints for WMA API, so you are stuck with CSOM on the client side, for better or worse.

SSOM side API can be found in

Microsoft.Office.Server.WorkManagement.dll (in GAC)

CSOM side API can be found in

Microsoft.SharePoint.WorkManagement.Client.dll (in Hive15/ISAPI)

Since both development models are parallel, the examples below will be in SSOM, but they can be translated 1:1 to the CSOM side (the last example in this post will be CSOM).

Namespaces and classes

UserSettingsManager is the first thing you will want to initialize when working with task aggregation. It contains all of the user-related WMA information, like Locations, important Locations, timeline settings (how the timeline looks for a user), etc.

Microsoft.Office.Server.WorkManagement.UserSettingsManager usm = new UserSettingsManager(context);


//Get all locations
LocationClientCollection allLocations = usm.GetAllLocations();

UserOrderedSessionManager is used to manage WMA sessions for a user. You will need to pass an SPServiceContext (for SSOM) or a ClientContext (for CSOM) to the UserOrderedSessionManager constructor to initialize a session manager.

//Get the session manager and create a session
Microsoft.Office.Server.WorkManagement.UserOrderedSessionManager osm = new UserOrderedSessionManager(context);

Once you have your session manager, you can create a new WMA session. There are two kinds of sessions you can create: UserOrderedSession and LocationOrientedUserOrderedSession. They are for the most part the same, except that the LocationOrientedUserOrderedSession contains location info within the tasks; this is the type of session you will usually want to use.

 

LocationOrientedUserOrderedSession uos = osm.CreateLocationOrientedSession();

This session object will be your main tool for executing task- and provider-refresh-related actions. You will be able to create, edit, and delete tasks, start a provider refresh or an Exchange sync, and do a whole lot of other cool things with tasks. Let’s take a look at some of them.

Querying the tasks

When you have a session object, you can retrieve all the tasks for a user, or filter the tasks to get only those that you need. There are seven possible filters that you can apply when querying the tasks:

  • CustomAttributeFilter
  • FieldFilter
  • KeywordFilter
  • LastModifiedDateRangeFilter
  • LocationFilter
  • PinnedFilter
  • PrivacyFilter

Most of them are self-explanatory. You can filter tasks by custom attributes (which you can freely assign to each task, although only programmatically), by standard fields, by keywords, by last-modified date, by location, by whether the task is pinned (important), and by task privacy. Of course, the filters can be freely combined.

It is often a requirement to get only those of a user’s tasks that are stored in one SharePoint site and its subsites. Using the locations collection and a location filter, this becomes very easy to implement:

// Create a new task query

Microsoft.Office.Server.WorkManagement.TaskQuery tq = new TaskQuery();

//Get all locations which are under a specified URL
IEnumerable<Location> myLocations = allLocations.Where(a => a.Url.Contains("http://demo/sites/subsite"));
Location[] locationKeys = myLocations.ToArray();

// Create location filter
LocationFilter locationFilter = new LocationFilter() { IncludeUncategorizedTasks = false, LocationKeys = locationKeys };

tq.LocationFilter = locationFilter;

// Read filtered tasks from the task session
LocationGroupClientCollection tcc = uos.ReadTasks(tq);

 

This way, we get all of a user’s SharePoint tasks that are stored in any task list in the http://demo/sites/subsite site or in one of its subsites.

Creating a task

To create a task, we will use the session object created above. When we create a task through the Work Management Application, it is initially always a personal task, stored only in the task cache list in the user’s My Site. Even if we are able to set a task location when creating a task, that task is still only a personal task.

There are two overloads of the CreateTask() method, one with and one without a location. By setting the location to null, we can create a task without a location even with the location-aware overload:

TaskWriteResult taskWriteResult = uos.CreateTask("My task name", "My task description", "03/24/2014", "03/27/2014", isCompleted, isPinned, "", locationId);

Task myNewTask = taskWriteResult.Result;

With those two lines, we create a task with a name, a description, a start date (localized string), a due date (localized string), booleans that define whether the task is completed and pinned, a task edit link (this would be interesting if we had our own task providers), and the location id (a nullable integer).

As mentioned above, this task is still a personal task, even if there is a location info associated with it.

To “promote” this task to its location (basically, to push it to the associated SharePoint list, Project, etc.), we will use one of the two promotion overloads. If location information has already been associated with the task, this will suffice:

taskWriteResult = uos.PromotePersonalTaskToProviderTask(myNewTask.Id);

If there was no location information, we can set it during the task promotion:

taskWriteResult = uos.PromotePersonalTaskToProviderTaskInLocation(myNewTask.Id, locationId);

Now, this task can be found in its new location as well, and the location information cannot be changed anymore.

More task operations

Once we have a task, there are numerous task-related methods on the session object. We can, for example, pin and unpin tasks to the timeline (the equivalent of “important” tasks in Outlook, or high-priority tasks in other systems):

taskWriteResult = uos.PinTask(myTask.Id);
taskWriteResult = uos.RemovePinOnTask(myTask.Id);

Editing tasks is done via the UpdateTaskWithLocalizedValue method. There is an enumeration, WritableTaskField, that defines which task fields can be updated. This is important, since tasks come from different providers, which might have different task properties (SharePoint tasks are not the same as Exchange tasks); the writable properties are basically the least common denominator that can be found in each task provider. Those are:

  • Title
  • Description
  • StartDate
  • DueDate
  • IsCompleted
  • EditUrl

Please note the IsCompleted property here: by updating it, you can set the task status to “completed” (100% task progress in some providers) or not completed (0% progress).

This line of code would update the task description:

uos.UpdateTaskWithLocalizedValue(myNewTask.Id, WritableTaskField.Description, "New Description");

This line of code sets the task to the “completed” status:

uos.UpdateTaskWithLocalizedValue(myNewTask.Id, WritableTaskField.IsCompleted, "true");

Deleting a task is also quite simple, even if the word “deleting” might not be the right expression here. The DeleteTask method actually removes the task from your My Site timeline:

TaskWriteResult twr = uos.DeleteTask(myNewTask.Id);

For personal tasks, which live only inside the timeline, this is fine – they are really deleted. But provider tasks are not – only their cached copy is removed from the timeline, and the tasks can still be found in the respective SharePoint lists and Project tasks. If you need to delete them from there, you will need to call the respective provider-specific APIs.
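For the SharePoint provider, that simply means deleting the item from its originating task list. A minimal sketch using the plain SharePoint SSOM (this is not part of the WMA API; the site URL, list title and item id here are hypothetical):

```csharp
// Delete the underlying task item from its originating SharePoint list.
// Unlike DeleteTask, this permanently removes the task at the provider.
using (SPSite site = new SPSite("http://demo/sites/subsite"))  // hypothetical site
using (SPWeb web = site.OpenWeb())
{
    SPList taskList = web.Lists["Tasks"];            // hypothetical list title
    SPListItem taskItem = taskList.GetItemById(42);  // hypothetical item id
    taskItem.Delete();
}
```

On the next provider refresh, the cached copy of the deleted task disappears from the timeline as well.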

Working with provider refreshes

As mentioned above, a provider refresh is the process of retrieving and caching tasks for a user. It is done on demand, with no background job processing the tasks: a refresh starts the moment a user accesses the “My Tasks” area. There is an artificial limit of five minutes between provider refreshes, which can be changed through PowerShell for SharePoint on-premises.

A provider refresh can also be initiated programmatically, from the task session:

CreateRefreshJobResult refreshResult = uos.BeginCacheRefresh();

The resulting CreateRefreshJobResult will tell us if the provider refresh process has started successfully (it will probably not yet be completed). If the provider refresh process could not be started for some reason (mostly because of the imposed five-minute limit), CreateRefreshJobResult will hold the error information and its correlation id.

If the refresh was started successfully, CreateRefreshJobResult will contain the id of the newly created provider refresh process, which can be used to check the provider refresh job status. You can check the general status, the status for each task provider separately, and the update status for each task location separately.

This short code snippet explains it all:

RefreshResult rr = uos.GetRefreshStatus(jobId);

Console.WriteLine("Refresh state: " + rr.AggregatorRefreshState.ToString());
Console.WriteLine("Correlation id: " + rr.CorrelationId);
Console.WriteLine("Refresh finished: " + rr.RefreshFinished.ToShortDateString() + " " + rr.RefreshFinished.ToShortTimeString());

Console.WriteLine("");
Console.WriteLine("Provider statuses:");
Console.WriteLine("------------------");

foreach (ProviderRefreshStatus prs in rr.ProviderStatuses)
{
    Console.WriteLine("Provider key: " + prs.ProviderKey);
    Console.WriteLine("Provider name: " + prs.ProviderLocalizedName);
    Console.WriteLine("Root location id: " + prs.RootLocationId.ToString());
    Console.WriteLine("Provider refresh started: " + prs.RefreshStarted.ToShortDateString() + " " + prs.RefreshStarted.ToShortTimeString());
    Console.WriteLine("Provider refresh finished: " + prs.RefreshFinished.ToShortDateString() + " " + prs.RefreshFinished.ToShortTimeString());
    Console.WriteLine("");
}

Console.WriteLine("");
Console.WriteLine("Location update results:");
Console.WriteLine("------------------------");

foreach (LocationUpdateResult lur in rr.TaskChangesByLocation)
{
    Location loc = allLocations.Where(a => a.Id == lur.RootLocationId).FirstOrDefault();

    Console.WriteLine("Location: " + lur.RootLocationId);
    Console.WriteLine("Added: " + lur.AddedCount.ToString());
    Console.WriteLine("Active added: " + lur.ActiveAddedCount.ToString());
    Console.WriteLine("Removed: " + lur.RemovedCount.ToString());
    Console.WriteLine("Updated: " + lur.UpdatedCount.ToString());
}

The result from this code snippet will look like the screenshot below. We can see that only one provider is active (SharePoint, in this case) and that 3 tasks have been updated in the location with id 3:

 

You can also retrieve the provider refresh history for a user, with all provider refreshes from some point in time until now.

These few lines of code will enable you to check the refresh status for the past 7 days and to analyze potential problems – each unsuccessful provider refresh carries, in its refresh status, the correlation id needed for further analysis.

RefreshHistory rh = uos.GetRefreshHistory(DateTime.Now.AddDays(-7));

foreach (RefreshResult rr in rh.Refreshes)
{
    // Check refresh status
}

And on the client side?

I have already mentioned that the CSOM side has been implemented with almost the same methods as the server side. To finish this blog post with a goodie, I will create a Windows 8 Store app with a Windows Runtime Component that references the Work Management API (“Microsoft.SharePoint.WorkManagement.Client.dll” from Hive15/ISAPI) and fetches the task data.

You will, of course, need a ClientContext to create the UserSettingsManager (instead of the SPServiceContext that was used in the server-side model).

ClientContext context = new ClientContext("http://server");

//user settings manager
Microsoft.SharePoint.WorkManagement.Client.UserSettingsManager usm = new Microsoft.SharePoint.WorkManagement.Client.UserSettingsManager(context);

//get all locations from usm
Microsoft.SharePoint.WorkManagement.Client.LocationClientCollection locations = usm.GetAllLocations();
context.Load(locations);

//user ordered session manager
Microsoft.SharePoint.WorkManagement.Client.UserOrderedSessionManager osm = new Microsoft.SharePoint.WorkManagement.Client.UserOrderedSessionManager(context);

//location oriented session
Microsoft.SharePoint.WorkManagement.Client.LocationOrientedUserOrderedSession uos = osm.CreateLocationOrientedSession();

//task query
Microsoft.SharePoint.WorkManagement.Client.TaskQuery tq = new Microsoft.SharePoint.WorkManagement.Client.TaskQuery(context);

//read tasks
Microsoft.SharePoint.WorkManagement.Client.LocationGroupClientCollection tcc = uos.ReadTasks(tq);

//batching done, execute on the client
context.Load(tcc);
context.ExecuteQuery();

//iterate through results
List<Microsoft.SharePoint.WorkManagement.Client.Task> allTasks = tcc.ToList<Microsoft.SharePoint.WorkManagement.Client.Task>();

List<SpTask> tasks = new List<SpTask>();

foreach (Microsoft.SharePoint.WorkManagement.Client.Task task in allTasks)
{
    //iterate through results, create an IEnumerable that can be bound to the View
}

 

The task data will be bound to an items page in the C# Windows Store app. This will work with both SharePoint on premise, and with Office 365. The result could look like:

 

I hope this short walkthrough of the Work Management Service Application development model has given you a brief overview of the things that can be done. I feel that WMA is somewhat neglected in SharePoint 2013 and Office 365, although it offers some great user scenarios.

You can find out more about the Work Management Service application development model at my blog http://adisjugo.com, or, if you have specific questions, feel free to drop me a direct message on Twitter at @adisjugo.

 

About the author

Adis Jugo is a software architect with over 20 years of experience, and a Microsoft MVP for SharePoint Server. He first met SharePoint (and Microsoft CMS Server) back in 2002, and since 2006 his focus has shifted completely towards architecture and development of solutions based on SharePoint technology. Adis works as Head of Development for deroso Solutions, a global consulting company headquartered in Germany. He is an internationally recognized speaker with over 10 years of speaking experience at various Microsoft, community and SharePoint conferences worldwide, where he is often ranked among the top speakers. Adis is also one of the initiators and organizers of the annual SharePoint and Project Conference Adriatics.

You can contact Adis on Twitter at @adisjugo and visit his SharePoint architecture, development and governance blog at http://www.adisjugo.com.

The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.


New and Renewed MVPs Announced!


 

Today, 966 exemplary community leaders around the world were notified that they have received the MVP Award! These individuals were chosen because they have demonstrated their deep commitment to helping others make the most of their technology, voluntarily sharing their passion and real-world knowledge of Microsoft products with the community.

 

While there are more than 100 million social and technical community members, only a small portion are selected to be recognized as MVPs. Each year, around 4,000 MVPs are honored. They are nominated by Microsoft, other community individuals, or in some cases themselves. Candidates are rigorously evaluated for their technical expertise, community leadership, and voluntary community contributions for the previous year. They come from more than 90 countries, speak over 40 different languages, and are awarded in more than 90 Microsoft technologies. Together, they answer more than 10 million questions a year!

 

MVPs are recognized each quarter for this annual award, which continues to grow and evolve to reflect the development of Microsoft technologies.

 

Congratulations to the new MVPs, and welcome back to renewed MVPs. We are very excited to recognize your amazing accomplishments!

For more information or to nominate someone, go to MVP.Microsoft.com

 

SP24: SharePoint, MVPs and the World in 24 Hours


 What do you accomplish in 24 hours?

If you're SharePoint Server MVPs Vlad Catrinescu and Matthias Einig, you chase the sun from New Zealand to Hawaii and bring the world right along with you.  Starting April 16 at 10pm GMT in New Zealand, they'll do just that. 

"We were looking at new ways of organizing something for the community that has never been done before," said Catrinescu. The two MVPs joined forces with other community leaders and began the initial brainstorming that would result in SP24 - A 24-hour global SharePoint conference.

"SP24 is an original online conference in a way that it has never been done before," said Catrinescu.  "Not only is it accessible for free by anyone wishing to learn SharePoint, but most of the speakers were chosen by the community through voting!"  The conference now boasts 24 MVP speakers from across the globe, 48 live sessions covering business and technical applications and an impressive 4,000 enthusiasts have already registered! The conference kicks off with a keynote address from SharePoint Senior Product Marketing Manager, Bill Baer and continues with an impressive and exciting 24-hour agenda

The online conference, which is built on SharePoint 2013, has two tracks: one that appeals to business users and another that targets developers and administrators. The conference and sessions are free to the public, and Einig and Catrinescu assure users that "SP24 has sessions for all levels of SharePoint, from end-user sessions that explain the very basics, to level 400 sessions from which any expert can learn."

Register here to join SP24 

Check out the SP24 trailer

(Please visit the site to view this video)

 

 

 

PowerPoint and Excel: Perfect Partners for Dynamic Tables and Dashboards


Editor’s note: The following post was written by PowerPoint MVP Glenna Shaw

PowerPoint and Excel: Perfect Partners for Dynamic Tables and Dashboards

PowerPoint 2010 and PowerPoint 2013 introduced improvements to a lot of features, but sadly PowerPoint Tables was not one of them (and I can say the same for Word Tables). In both apps, pretty much the only thing you can do with tables is add rows and columns, type in the information, and either apply a pre-existing style or laboriously create your own style, which you can’t even save as a template.

So what’s the alternative? In PowerPoint there is a little-talked-about feature, under the Insert, Tables section of the toolbar, called Insert Spreadsheet.

 

Unlike PowerPoint and Word, Excel Tables has a slew of very cool features. With Excel Tables you can:

 

  • Sort
  • Filter
  • Remove Duplicates
  • Connect to external data
  • Create a real Total Row (unlike PowerPoint’s Total Row, which only formats the last row)
  • Create a variety of indicators (known as Conditional Formatting)
  • Add Sparklines (tiny one row charts)
  • Create a Custom Style and save it as a template
  • and more

Editing Excel Spreadsheets in PowerPoint

Before going further, I want to point out that there are two methods for editing spreadsheets in PowerPoint. To choose a method, right-click the spreadsheet on your slide, click Worksheet Object, and choose Edit or Open. The default method (Edit) opens an “in place” Excel window on the slide and replaces the PowerPoint toolbar with the Excel toolbar; you click outside the “in place” Excel window to return to PowerPoint. The second method (Open) opens the spreadsheet in a separate Excel window; you close the Excel window to return to PowerPoint. Both methods have their merits, depending on what you’re doing with the spreadsheet.

Embedded Excel Tables

For example, say you have a PowerPoint table for your sales team. You’ve sorted the table to reflect highest to lowest sales by team member for the last month, and you’ve added stars to reflect the performance of your sales team. But what happens when the next month rolls around? In PowerPoint your only option is to add a column, retype the entire table, and move around or redo your star shapes. If you’d used an Excel table instead, all you’d have to do is add the column of data and sort on the new column. Plus, you could use the Conditional Formatting features for your performance indicators, as well as other features. The only caveat is that you must format your table in Excel, although I don’t consider this a downside since Excel’s table formatting features are better than PowerPoint’s.

PowerPoint Table

 

Excel Table

 

Note: As my fellow PowerPoint MVP Nolan Haims has reported, the Sparklines (shown in the last column) are very pixelated in PowerPoint. We’re hoping Microsoft is sufficiently motivated to fix that.

Linked Excel Tables

Linking tables is a little trickier. When you copy/paste link a table into PowerPoint, it doesn’t recognize the table name. Instead, it will map the link only to the selected cells. This is fine if your table never adds columns or rows, but if your table grows or shrinks, the table on your PowerPoint slide will not grow/shrink with it. You can get around this by using a Named Range, which PowerPoint does recognize. The difference is subtle, but important. Basically, it means you need to select your range of data, name it, and copy/paste link it into PowerPoint BEFORE you format it as a table in Excel. After you’ve done this, you can format the data as a Table in Excel and, using the Name Manager in Excel, tell it your Named Range = Table Name.

Excel Name Manager

 

If you just try to copy/paste link a Table into PowerPoint, the link will say something like A2:H6 instead of Sales (or whatever name you chose) and the table will not dynamically update. I have no idea why PowerPoint recognizes a Named Range but not a Table, but I’m hoping Microsoft fixes that snafu in the next iteration.

PowerPoint Links

 

My fellow MVP, Steve Rindsberg, has been telling me for ages that PowerPoint recognizes a Named Range, but it wasn’t until I read this article that I was able to figure out how to make both Named Ranges and Tables work together. It doesn’t quite cover the whole story because, for the link to work, you must copy/paste link the range as a Named Range before you turn it into a Table. But it did help me realize a Named Range could refer to a Table.

If you already have your cells formatted as a Table, simply convert it to a Named Range, copy/paste link it to PowerPoint, and format it as a Table again. Then use Name Manager to point your Named Range at the Table.

I realize this is confusing, so to recap:

  • A Named Range is a designated group of cells that PowerPoint recognizes and will add/remove rows/columns automatically
  • A Table is a designated group of cells that supports Excel Table features but PowerPoint doesn’t recognize and won’t add/remove rows/columns automatically
  • A Named Range can refer to a Table by name, in effect making the Table a Named Range
  • You must copy/paste link as a Named Range first or PowerPoint won’t pick up the Name as the link
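If you’re curious why the Name Manager trick works, it helps to peek inside the file format. An .xlsx file is a zip package, and a Named Range is just a definedName entry in xl/workbook.xml; pointing the name at a Table stores the Table’s name as its reference. The stdlib Python sketch below writes and reads such an entry (the Sales/SalesTable names are invented, and this is only a bare fragment, not a workbook Excel could actually open):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

NS = "http://schemas.openxmlformats.org/spreadsheetml/2006/main"

# A minimal workbook.xml fragment with a defined name "Sales" whose
# reference is the Table name "SalesTable" -- the Name Manager trick.
workbook_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    f'<workbook xmlns="{NS}">'
    '<definedNames>'
    '<definedName name="Sales">SalesTable</definedName>'
    '</definedNames>'
    '</workbook>'
)

# An .xlsx is just a zip package; write the fragment where Excel keeps it.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("xl/workbook.xml", workbook_xml)

# Read it back and list the defined names, as Name Manager would.
with zipfile.ZipFile(buf) as z:
    root = ET.fromstring(z.read("xl/workbook.xml"))
names = {dn.get("name"): dn.text for dn in root.iter(f"{{{NS}}}definedName")}

print(names)  # -> {'Sales': 'SalesTable'}
```

Because the name resolves to the Table, and the Table tracks its own rows and columns, anything linked through the name (PowerPoint included) picks up added rows and columns for free.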

If that’s still not clear, don’t fret. My example file has step by step directions. Download the PPTExcelTables.zip file and unzip all three files to the same folder on your hard drive.

Embedded and/or Linked Excel Dashboards

In a previous article on the MVP Award blog, I covered how to create dashboards primarily using PowerPoint’s interactivity features. And while this is very cool, Excel has some smoking hot interactivity features of its own if you’re only talking dashboards. Through Pivot Tables, Pivot Charts and Slicers you can create some pretty amazing interactive visuals. On the downside, a dashboard in Excel doesn’t resize to full screen automatically and just doesn’t look as nice when presented on the big screen. So can you have the best of both worlds? Can you have a functioning Excel dashboard in a PowerPoint slideshow? The answer is yes, but it’s tricky. The tricky part isn’t getting the dashboard onto a slide; that’s fairly straightforward. The tricky part is actually using the dashboard during a running presentation.

To create an embedded Excel dashboard in a PowerPoint slide, use the Insert Spreadsheet feature and create a dashboard in the Excel window. You may find it easier to insert the spreadsheet, click outside the spreadsheet and then right click on the spreadsheet and choose Worksheet Object, Open. This will open a full Excel window instead of the “in place” Excel window. I find this easier to use, especially when working on dashboards which use multiple worksheets.

If you need to learn how to create dashboards in Excel, check out Chandoo.org; it’s one of my favorite Excel resources. The key thing to remember is to uncheck Gridlines in the Excel worksheet. If you want, you can also format the cells in your spreadsheet to have the same background as your slide. Since I use a white background on my slides, and the spreadsheet defaults to a white background, I can skip this step. Note: save yourself time and effort by using the same theme for your presentation and your spreadsheet.

If you want a linked dashboard, simply create a dashboard in an Excel file, select the desired range of cells, make it a Named Range and copy/paste link to your PowerPoint slide. If your dashboard changes size, make sure to allow for the largest size when selecting the range of cells. Keep in mind this is for display on the big screen, so design your dashboard accordingly. Simple dashboards on multiple slides will work better than one complex dashboard on a single slide.
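Conceptually, the PivotTable-plus-Slicer pair behind such a dashboard is just “aggregate, with an optional filter.” This small stdlib Python model (with invented teams, reps and amounts) shows what a Slicer click does to the numbers:

```python
from collections import defaultdict

# Invented sales data standing in for the dashboard's source table.
sales = [
    {"team": "East", "rep": "Ana",  "amount": 120},
    {"team": "East", "rep": "Ben",  "amount": 90},
    {"team": "West", "rep": "Cleo", "amount": 150},
    {"team": "West", "rep": "Ana",  "amount": 40},
]

def pivot(rows, slicer_team=None):
    """Sum amount per rep, honoring the Slicer selection (None = all)."""
    totals = defaultdict(int)
    for row in rows:
        if slicer_team is None or row["team"] == slicer_team:
            totals[row["rep"]] += row["amount"]
    return dict(totals)

print(pivot(sales))                      # -> {'Ana': 160, 'Ben': 90, 'Cleo': 150}
print(pivot(sales, slicer_team="East"))  # -> {'Ana': 120, 'Ben': 90}
```

Clicking a Slicer button in the popped-up Excel window is the equivalent of changing slicer_team: Excel recomputes the pivot and the linked slide redraws.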

Simple Excel Dashboard

 

Finally, set up the Excel dashboard to open in an Excel window when you run the presentation. To do this, click on the spreadsheet object and either use animations to set the OLE verb to Open with previous (this will pop up the Excel dashboard automatically) or use Insert, Actions, Object Action, Open (Excel dashboard will pop up when you click on it). Again, I’ve included step by step instructions in the example file. Download the PPTExcelTables.zip file and unzip all three files to the same folder on your hard drive.

Now comes the trick, and I have to thank my fellow MVP, Chantal Bossé, for figuring this out. Set up the projector as a second monitor and set up your slideshow to run on the second monitor. There’s an article here that tells you how. You can choose to use Presenter View or not, the important thing is that the slideshow runs on the second monitor and the Excel window pops up on the primary monitor. You’ll want to do a test run to make sure the Excel window pops up on the primary monitor and isn’t seen by the audience.

Excel Window on top of Presenter View

 

During your slideshow, click the Slicer buttons in the Excel window to change the Team or Rep (in this example) and the slide will update in the slideshow in real time. Since your audience never sees the Excel window, it appears as if the dashboard updates seamlessly. A word of caution: do not close the Excel window. I saw some odd results in Presenter View when I closed it, and closing it isn’t necessary; it will close automatically as soon as you move to a different slide.

Now that you know how, I hope you enjoy showing off your new skills by combining PowerPoint and Excel for some truly amazing effects. The example file is available for download from here. Download the PPTExcelTables.zip file and unzip all three files to the same folder for it to work correctly.

 

About the author

Glenna Shaw is a Most Valuable Professional (MVP) for PowerPoint and the owner of the PPT Magic and the Visualology.Net sites. She is a Project Management Professional (PMP) and holds certificates in Knowledge Management, Accessible Information Technology, Graphic Design and Professional Technical Writing. Follow her on Twitter.

About MVP Monday

The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.

Setting Internet Explorer Trusted Site Settings via Group Policy Object in Windows Server 2012 R2


Editor’s Note: In partnership with Microsoft Press, MVPs have been contributing to an ongoing guest series on their official team blog. Today’s article is from Windows Expert – IT Pro MVP Philippe Levesque and is the 39th in the series.

Today I will talk about setting up trusted sites via Group Policy Objects (GPOs) in Windows Server 2012 R2.

It seems like an easy topic, but if you have never done this before, it is important to do it the correct way to avoid problems down the road.

The first method is fairly straightforward.

First Method: Internet Explorer Maintenance. Wait, where is it in Windows Server 2012 R2? (See Appendix B: Replacements for Internet Explorer Maintenance, or, for further reading, Missing Internet Explorer Maintenance settings for Internet Explorer 11.)

 

(Figure from: How Internet Explorer Maintenance Extension Works)
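The full article walks through the replacements in detail. As a quick sketch of where this ends up (the policy name and path below are from the standard Group Policy Editor; the example.com sites are placeholders, not from the article):

```text
Policy:  Site to Zone Assignment List
Path:    Computer Configuration (or User Configuration)
         > Administrative Templates > Windows Components > Internet Explorer
         > Internet Control Panel > Security Page

Entries (value name = site, value = zone number):
    https://intranet.example.com   ->  2
    https://portal.example.com     ->  2

Zone numbers: 1 = Local intranet, 2 = Trusted sites,
              3 = Internet, 4 = Restricted sites
```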

 

To read the full article, click here

 

 

About the author

Philippe is a system administrator with over 10 years of field experience. He holds his MCP, MCTS: Exchange 2007, Citrix Certified Administrator XenApp & XenDesktop, HP Accredited Platform Specialist (HP APS) and many other certifications. These days he mostly manages and configures private cloud setups. He loves to help out on the TechNet Forums (he is a moderator on the Windows Server General Forum and the TechNet Wiki Discussion forum, including the French TechNet Wiki Discussion). Philippe loves the TechNet Wiki and works to promote it. He is a member of the TechNet Wiki Community Council and International Council, blogs at http://blogs.technet.com/b/wikininjas/, and moderates the French Translation Widget for the TechNet Wiki.


MVP Community Open Days - Berlin


 

Nearly 100 MVPs met at the Microsoft offices in Berlin to learn, share ideas and celebrate.  The two-day Community Open Days event boasted its highest number of attendees ever and included eight MVP presentations.  "In my eyes, this event was a great success," said Client Development MVP Marco Richardson.  "Additional to the high-quality presentations by other MVPs and Microsoft employees, it was just the warmth of the community (MVP and CLIP), which impressed me."

The Berlin Open Days showcased 17 sessions that ranged in topics from IT Pro and Developer to Consumer sessions. 

“For me, personally, networking was a great part," said Windows Phone Development MVP Peter Nowak.   "The sessions have been cool as well, but getting in contact with other MVPs is something you can’t deliver online. A lot of great ideas are born on such occasions.”

Office 365 MVP Kerstin Rachfahl, who also presented during the event, agrees. "It is great to get input in different competencies and sometimes you notice – hey, this helps you too."

Congratulations to the Community Open Days organizers, participants and presenters! 

Topics and sessions delivered by MVPs:

  • Private vs Public Cloud: Carsten Rachfahl (Hyper-V MVP) and Kerstin Rachfahl (Office 365 MVP)
  • Internet of Things - From Sensor to Azure Cloud: Stefan Hoppe (Windows Embedded MVP)
  • Video and Podcasting: Hans Brender (SharePoint MVP) and Michael Greth (SharePoint MVP)
  • Office Apps New Trends: Senaj Lelic (Visio MVP)
  • The Awesome Windows Platform Including 3D Printing and Windows Kinect: Lars Keller (Client Development MVP)

