
Automating Visual Studio Team Foundation Server (TFS) 2013 Backlog


Editor’s note: The following post was written by Visual Studio ALM MVP Mohamed Radwan

Automating Visual Studio Team Foundation Server (TFS) 2013 Backlog

In this blog post I will show a PowerShell script that automates a comprehensive customization of the TFS 2013 backlog.

The script performs the following (a short PowerShell sketch of the approach follows this list):

  • Export Categories.xml, ProcessConfig.xml, Bug.xml as CustomBug.xml, and Feature.xml as Objective.xml
  • Rename the Bug to "Custom Bug" in CustomBug.xml
  • Rename the Feature to Objective in Objective.xml
  • Change the Link Control option for the WI (Work-Item) layout in Objective.xml to display Epics
  • Edit Categories.xml to change the Feature category to the Epic category
  • Add the Custom Bug WI (Work-Item) to the Requirement category
  • Create a new custom category for the new Portfolio level
  • Add the Objective as the default WI (Work-Item) for that custom category
  • Clone the Portfolio section in ProcessConfig.xml
  • Change the new cloned portfolio to use the new custom category
  • Rename the items in the second portfolio from Feature to Epic
  • Add a new color section for the new WI (Work-Item) Epic and rename the Feature to Objective
  • Add the AssignedTo column to the Add Panel
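As a rough illustration of the approach (the full script is available via the download link below), here is a minimal PowerShell sketch that exports the Bug work item type, renames it to "Custom Bug", and imports it back with witadmin; the collection and project values are placeholders:

# Placeholder values - point these at your own collection and project
$collection = "http://tfs:8080/tfs/DefaultCollection"
$project    = "MyProject"
$out        = "C:\Exported"

# Export the Bug work item type definition as CustomBug.xml
witadmin exportwitd /collection:$collection /p:$project /n:Bug /f:"$out\CustomBug.xml"

# Rename the work item type by editing the exported XML
[xml]$bug = Get-Content "$out\CustomBug.xml"
$bug.WITD.WORKITEMTYPE.name = "Custom Bug"
$bug.Save("$out\CustomBug.xml")

# Import the modified definition back into the team project
witadmin importwitd /collection:$collection /p:$project /f:"$out\CustomBug.xml"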

To work with the script, you will need to perform the following steps:

  1. Create a new project
  2. Create a folder named "Exported" on the C drive ("C:\Exported")
  3. Rename the project and the collection variables in the PowerShell script
  4. Run the script


Download Script

Go to the exported folder "C:\Exported" and compare the files before and after using a comparison tool such as WinMerge, as follows:

 

In the Categories.xml file we will find that the script performed the following:

  • Change the Default Work Item Type for "Microsoft.FeatureCategory" from Feature to Epic

 

  • Add the Custom Bug to "Microsoft.RequirementCategory"
  • Create a new custom category "Radwan.ObjectiveCategory" and set its Default Work Item Type to Objective (see the sketch below)
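As an illustration, the Categories.xml edits can be scripted along these lines (a hedged sketch; the element and attribute names follow the standard TFS 2013 Categories.xml schema):

[xml]$cats = Get-Content "C:\Exported\Categories.xml"

# Add the Custom Bug work item type to the Requirement category
$req = $cats.CATEGORIES.CATEGORY | Where-Object { $_.refname -eq "Microsoft.RequirementCategory" }
$wit = $cats.CreateElement("WORKITEMTYPE", $cats.DocumentElement.NamespaceURI)
$wit.SetAttribute("name", "Custom Bug")
$req.AppendChild($wit) | Out-Null

$cats.Save("C:\Exported\Categories.xml")

The modified file is then imported with witadmin importcategories.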

 

In the ProcessConfig.xml file we will find that the script performed the following:

  • Add the column AssignedTo to the Add Panel

 

  • Add a new Portfolio section by cloning the existing one and changing the following (sketched below):
    • Use the newly created custom category "Radwan.ObjectiveCategory"
    • Change the plural name to Objectives and the singular name to Objective
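Cloning the portfolio section can be sketched in the same way (again hedged; the attribute names follow the TFS 2013 process configuration schema):

[xml]$pc = Get-Content "C:\Exported\ProcessConfig.xml"

# Clone the existing portfolio backlog and point the copy at the new category
$backlogs = $pc.SelectSingleNode("//PortfolioBacklogs")
$clone = $backlogs.FirstChild.CloneNode($true)
$clone.SetAttribute("category", "Radwan.ObjectiveCategory")
$clone.SetAttribute("pluralName", "Objectives")
$clone.SetAttribute("singularName", "Objective")
$backlogs.AppendChild($clone) | Out-Null

$pc.Save("C:\Exported\ProcessConfig.xml")

The modified file is then imported with witadmin importprocessconfig.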

 

  • In the Work Item Colors section:
    • Rename the Feature to Objective
    • Add a new color section for the Epic

 

The previous configuration is reflected in the project as follows:

  • We will have 3 new WIs (Work-Items):
    • Custom Bug
    • Epic
    • Objective

 

 

  • A new Portfolio level has been added, and the Feature has been renamed to Epic with a new color (green)

 

The Custom Bug has been added to the Requirements Backlog in the Add Panel

  • The AssignedTo column has been added to the columns in the Add Panel

About the author

 

Mohamed Radwan is a Visual Studio ALM MVP and Senior ALM Consultant focusing on providing solutions for the various roles involved in software development and delivery, enabling them to build better software using Agile methodologies and Microsoft Visual Studio ALM tools and technologies.

Mohamed excels in software practices and automation, with 12+ years of professional experience spanning the different stages and phases of the software development lifecycle. Follow him on Twitter.

 

 About MVP Monday

The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.

 

 


MVPs Participate in //Learn/ - Global Community Webcast


Over 50 MVPs are joining forces on April 24, 2014 for the //learn/ Global Community Webcast to present nearly 70 sessions. This is a unique opportunity for app developer enthusiasts around the globe to meet and share best practices, tips and tricks and participate in deep-dive sessions presented by MVPs.  

"A live webcast about Windows Phone development, in 10+ languages, with experts and MVPs speaking... this is the first time ever! It is really exciting to be part of it," said Windows Phone Development MVP Cyril Cathala.

Sessions will cover the latest features and technologies for Windows Phone, PCs and tablets as MVPs focus on making each session "for the community and by the community."  Sessions will be presented in multiple languages across the globe in an effort to provide content to the ever-expanding global app development community. 

"It is very important to share my passion about Windows Phone platform with other developers," said Client Development MVP Michael Samarin.  "And demonstrate to the development community that there is another great platform out there."

First-time app developers and seasoned veterans alike will find content readily available and applicable for every level of expertise. Sessions will cover basic concepts for developing Universal Windows apps (one app for all form factors: PC, tablet and phone), Windows Phone 8.1 apps, and XAML, and will provide attendees the opportunity to chat live with MVPs. 

"This is a huge opportunity for every developer, no matter if they already have experience with the Windows platform or they're just mobile enthusiasts," said Windows Phone Development MVP Matteo Pagani.

Register Now!

//build/, //learn/ and //publish/


Editor's note: The following post was written by Windows Phone Development MVP Peter Nowak

If you were one of the lucky ones to attend //build, or followed the announcements of //build from home, a ton of new announcements might have caught your eye. Especially looking into Windows Phone 8.1, not only consumers but also developers get a lot of new functionality to enhance their apps, or to create new ones that span Windows and Windows Phone simultaneously.

For Windows Phone, //build was just the starting point to get new bits to play with. As the dust settles a bit, it is time to take a look at what has really been made available in this package. Do you know everything about it already? Did you know that the new emulator features enhanced functionality to test geofencing properly, or that you can roam credentials among devices in a secure way? These were topics that were covered only briefly at //build, but there is another event coming up for you: //learn.

//learn is an online event by Microsoft, powered by the MVP community, to deliver the content you might need for developing great apps for Windows Phone and beyond. If you know the famous Jumpstart series by Microsoft for several products, then you already know how a lot of the sessions will be structured. But there is more - the sessions will be delivered in 8 different languages: Chinese, English, German, French, Portuguese, Spanish, Italian and Russian. These events also include local start times so that the information gets delivered as smoothly as possible.

The exciting thing is that the base for this event was created by Windows Phone Development MVPs, who wanted to deliver content independent of a venue. Having the chance to test-drive this new concept with the WPDev Fusion - New Year Edition back in January led to //learn! And even the WPDev Fusion event was not the root - it all started last year in October with Windows Phone Week, a worldwide initiative of Windows Phone Development MVPs that brought 17 worldwide events with over 1500 attendees closer to developing apps for Windows Phone.

Register yourself for April 24th to be a part of //learn here:

Chinese Simplified: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=2052&GroupID=ChineseS

Chinese Traditional: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1028&GroupID=ChineseT

English: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1033&GroupID=english

French: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1036&GroupID=french

German: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1031&GroupID=german

Italian: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1040&GroupID=Italian

Russian: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1049&GroupID=russian

Portuguese: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1046&GroupID=portuguese

Spanish: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1034&GroupID=spanish

 

Getting the tools and getting the education are only two parts of a trinity. This is where //publish comes in. //publish is a hackathon with a shifted focus: instead of starting a new app from scratch (which you still can do), the idea here is to finish an already started app and get it published to the Store. And the base idea is great: how often have you started creating an app that never got published because you found a problem you couldn't solve, or where you'd need professional advice regarding best practices to get back on track? This is the strength of //publish.

If you are interested in attending a //publish event on May 16th and 17th, check the website publishwindows.com for further information and to find a venue close to you. The event is supported by MVPs and Nokia Developer Champions as well. With this, I hope you will have a lot of fun creating apps and learning how easy it is.

 

Understanding the Windows Server Failover Cluster Quorum in Windows Server 2012 R2


 Editor’s note: In partnership with Microsoft Press, now celebrating their 30th year, MVPs have been contributing to an ongoing guest series on their official team blog. Today’s article is from Cluster MVP David Bermingham which is the 40th in the series.

 

Understanding the Windows Server Failover Cluster Quorum in Windows Server 2012 R2

Before we get started with all the great new cluster quorum features in Windows Server 2012 R2, we should take a moment to understand what the quorum does and how we got to where we are today. Rob Hindman describes quorum best in his blog post:

“The quorum configuration in a failover cluster determines the number of failures that the cluster can sustain while still remaining online.”

Prior to Windows Server 2003, there was only one quorum type, Disk Only. This quorum type is still available today, but it is not recommended, as the quorum disk is a single point of failure. In Windows Server 2003, Microsoft introduced the Majority Node Set (MNS) quorum. This was an improvement, as it eliminated the Disk Only quorum as a single point of failure in the cluster. However, it did have its limitations. As implied by its name, Majority Node Set must have a majority of nodes to form a quorum and stay online, so this quorum model is not ideal for a two-node cluster, where the failure of one node would leave only one node remaining. One out of two is not a majority, so the remaining node would go offline. Continue reading the full article here
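As a practical aside, Windows Server 2012 R2 lets you inspect and change the quorum configuration with the FailoverClusters PowerShell module; a minimal sketch (the witness share path is a placeholder):

Import-Module FailoverClusters

# Inspect the current quorum configuration
Get-ClusterQuorum

# For a two-node cluster, add a file share witness so that one node
# plus the witness still forms a majority after a single node failure
Set-ClusterQuorum -NodeAndFileShareMajority "\\fileserver\ClusterWitness"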

 About the author

Dave Bermingham is recognized within the technology community as a high availability expert and has been honored by his peers by being awarded as a Microsoft MVP in Clustering since 2010. Dave's work as director of Technical Evangelism at SIOS has him focused on evangelizing Microsoft high availability and disaster recovery solutions as well as providing hands-on support, training and professional services for cluster implementations. Dave holds numerous technical certifications and draws from over twenty years of experience in IT, including work in the finance, healthcare and education fields, to help organizations design solutions to meet their high availability and disaster recovery needs. Read David's blog or follow him on Twitter.


 

Friday Five - April 28, 2014

Creating Unit Test for Projects Using Microsoft’s Entity Framework


Editor’s note: The following post was written by Visual C# MVP Ming Man Chan

Creating Unit Tests for Projects Using Microsoft's Entity Framework

This article consists of three subsections:

  • Create an ADO.NET Entity Data Model in a Console Application
  • Create the Unit Test Project to reproduce the error.
  • Fix the error by adding the Entity Framework assembly and connection string to the Unit Test Project.

When creating a unit test method, you might hit the following error:

Test method UnitTestProject1.UnitTest1.TestMethod1 threw exception: System.InvalidOperationException: No connection string named 'NorthwindEntities' could be found in the application config file.

The error mentions NorthwindEntities because I am using the sample database; in your case it may be any xxxxxxEntities.

Create an ADO.NET Entity Data Model in a Console Application

 

Let us simulate the problem by using a Console Application; this also applies to other types of projects, such as Web and Windows client applications. We will be using Visual Studio 2013 for this article. The Entity Framework version this article uses is 6.0.

  1. Choose File -> New -> Project
  2. Select Console Application from the project templates.

 

 3. Right-click the console project.

4. Select Add -> New Item

 

5. Select Data then ADO.NET Entity Data Model.

6. Type in the Name for example, NWModel.edmx.

7. Click on Add button.

 

8. Click on Next > button.

 

 9. Click on Which data connection should your application use to connect to the database? (combo box) in Entity Data Model Wizard.

 

10. Click on New Connection... button in Entity Data Model Wizard.

 

11. Type the server name in "Connection Properties", for example .\SQLEXPRESS

 

12. Click on Open button in "Connection Properties".

 

13. In this sample, select northwind in the list.

14. Click on OK button in "Connection Properties".

 

15. Click on Next > button in Entity Data Model Wizard.

 

16. Click on "Tables (tree item)" in Entity Data Model Wizard.

17. Click on dbo (tree item) in Entity Data Model Wizard.

18. Select the Products table.

19. Click on Finish button in Entity Data Model Wizard.

The ADO.NET Entity Model is now created.

20. Click on Build menu item to build your project.

Replace the code in the Program.cs file with the following.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace ConsoleApplication1
{
    public class Program
    {
        public static void AddProduct()
        {
            // NORTHWNDEntities is the context class generated by the Entity Data Model wizard
            NORTHWNDEntities ctx = new NORTHWNDEntities();
            Product product = new Product();

            product.CategoryID = 1;
            product.ProductName = "toy";

            ctx.Products.Add(product);
            ctx.SaveChanges();
        }

        static void Main(string[] args)
        {
            AddProduct();
        }
    }
}

AddProduct is hardcoded for testing purposes. In real life you might pass the CategoryID and ProductName as arguments.

Create the Unit Test Project to reproduce the error

 

  1. Right-click the ConsoleApplication1 solution.
  2. Left click on Add.

 

3. Left click on New Project...

 

 4. Left click on OK (button) in Add New Project.

 

5. Right click Reference in UnitTestProject1 project.

6. Left click on Add Reference...  

 

 

 7. Click on ConsoleApplication1 (dataitem) in Reference Manager under Solution -> Project.

8. Click on OK (button) in Reference Manager.

 

 

ConsoleApplication1 is now added as a reference for UnitTestProject1.

Replace the UnitTest1.cs file with the following code.

using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace UnitTestProject1
{
    [TestClass]
    public class UnitTest1
    {
        [TestMethod]
        public void TestMethod1()
        {
            ConsoleApplication1.Program.AddProduct();
        }
    }
}

Right-click inside TestMethod1, then select Run Tests.

 

Now you will get the error.

Test method UnitTestProject1.UnitTest1.TestMethod1 threw exception: System.InvalidOperationException: No connection string named 'NORTHWNDEntities' could be found in the application config file.

 

Fix the error by adding the Entity Framework assembly and connection string to the Unit Test Project

Well, we could manually create an App.config file, but that is not going to be easy. The easier way is to add the Entity Data Model again by following steps 4 through 20 in the section Create an ADO.NET Entity Data Model in a Console Application.

You must then delete the .edmx and other files that were created by the wizard, except the App.config file.

You can now run the unit test again. You should see it turn green this time.
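Alternatively, if you would rather not re-run the wizard, the <connectionStrings> section can be copied from the console project's App.config into the test project's App.config with a small script. A hedged PowerShell sketch, assuming hypothetical folder names under the solution root and that the test project already has an App.config (the Entity Framework assembly reference still has to be added separately, e.g. via NuGet):

# Hypothetical paths - adjust them to your solution layout
$src = "C:\Projects\ConsoleApplication1\ConsoleApplication1\App.config"
$dst = "C:\Projects\ConsoleApplication1\UnitTestProject1\App.config"

[xml]$srcCfg = Get-Content $src
[xml]$dstCfg = Get-Content $dst

# Copy the <connectionStrings> section across
# (the <configSections>/<entityFramework> elements may need the same treatment)
$node = $dstCfg.ImportNode($srcCfg.configuration.connectionStrings, $true)
$dstCfg.configuration.AppendChild($node) | Out-Null
$dstCfg.Save($dst)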

 

About the author

 

Ming Man has been a Microsoft MVP since 2006. He is a software development manager for a multinational company. With 25 years of experience in the IT field, he has developed systems using Clipper, COBOL, VB5, VB6, VB.NET, Java and C#. He has been using Visual Studio (.NET) since the beta back in 2000. He and his team have developed many projects on the .NET platform, such as SCM and HR based applications. He is familiar with the N-tier design of business applications and is also an expert with database experience in MS SQL, Oracle and AS 400. Additionally, you can read Ming's blog.


 

Battle of the Lowlands


Two countries enter, only one is left standing.  

Belgium versus The Netherlands in a knock out, no holds barred, fight to the bitter end!

Well, not exactly.  The Battle of the Lowlands used code in place of combat, apps instead of artillery.  Participants worked alone or on teams to win achievements for their country based on the apps they were creating. Points were given for predefined goals such as including NFC, live tiles, using sensors, etc. At the end of the day, event organizers calculated the average number of achievement points per team.  The country with the highest average won bragging rights as the first ever winner of the Battle of the Lowlands. 

Belgian Client Development MVPs Glenn Versweyveld and Nico Vermeir came up with the idea of a hackathon that pitted the two countries against each other.  They reached out to Dutch Windows Phone Development MVPs Joost van Schaik and Tom Verhoeff and Hardware Interaction Design and Development MVP Dennis Vroegop to establish the rules of engagement for the battle.  A live video feed allowed both countries to keep a watchful eye on each other during the competition as participants tweeted photos of their efforts.

Windows Phone Development MVP Joost van Schaik created a video of the event - check it out!


 

Congratulations to the Dutch team on their win in the hard-fought Battle of the Lowlands!

 

PowerShell Summit North America 2014


"All in all, one of the great strengths of PowerShell are our MVP's."

- Senior Programming Manager Ed Wilson


 

"The shell is incredibly deep and complex, and general conferences just can't explore the nooks and crannies," said PowerShell MVP Don Jones.  

With new features in PowerShell v4 like PowerShell Desired State Configuration (DSC), MVP organizers saw the need for a community event that covered the specific functionality of the "deep and complex" nature of PowerShell. Over a dozen MVPs participated and presented during the conference, which saw nearly 200 participants in the sessions. 
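For readers who have not yet seen DSC, a minimal configuration looks roughly like this (a generic illustration, not material from the summit; the node name is a placeholder):

Configuration WebServer
{
    Node "SERVER01"
    {
        # Declare the desired state: the IIS role must be present
        WindowsFeature IIS
        {
            Ensure = "Present"
            Name   = "Web-Server"
        }
    }
}

# Running the configuration compiles it to a .mof file under .\WebServer,
# which Start-DscConfiguration can then push to the node
WebServer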

"These types of events bring together experts in the field with those who are using the products on a regular basis," said PowerShell Senior Programming Manager Ed Wilson. "MVP's are crucial to the PowerShell team, and to Microsoft as a whole."

Everyone knows that MVPs are technophiles with their fingers on the pulse of innovation and new trends, but what really sets them apart is their passion for community. "The attendees usually do not have the chance to have a one on one with people like Jeffrey Snover and Lee Holmes," said MVP Teresa Wilson. "Even people like [MVPs] Don Jones, Jeff Hicks, Jason Helmick and so on, who are out teaching and presenting as part of their jobs."

When asked about the value of MVPs to the PowerShell team, Microsoft Program Manager John Slack didn't skip a beat, "MVPs create a direct line between the team and our customers, and have been critical to the success of PowerShell." 

Congratulations on a successful event and to each of the MVP participants and presenters!

Mike Pfeiffer - Exchange Server
Jim Christopher - PowerShell
Jason Helmick - PowerShell
Steven Murawski - PowerShell
Aleksandar Nikolic - PowerShell
Trevor Sullivan - PowerShell
Tome Tanasovski - PowerShell
Teresa Wilson - PowerShell
Sean Kearney - PowerShell
Don Jones - PowerShell
Adam Driscoll - PowerShell
Jeff Hicks - PowerShell
Richard Siddaway - PowerShell

Check out these links for additional information and upcoming community events!

PowerShell Summit North America

 PowerShell Summit  Europe

Additional Community Events

 


Being a Mentor in the Imagine Cup


Editor’s note: The following post was written by Client Development MVP Bruno Sonnino

Being a mentor in the Imagine Cup

In July 2012, I had the pleasure of mentoring Team Virtual Dreams, winner of the Windows 8 Metro Challenge and the Azure Challenge at Imagine Cup 2012, the Microsoft-sponsored technology world cup for students. As a Microsoft MVP, mentoring a finalist team is a unique experience. However, what is the Imagine Cup, and what does it mean to mentor a team?

What is the Imagine Cup?

The Imagine Cup is a competition sponsored by Microsoft, now in its 12th edition. It has many competitions in the technology area: software development, IT and digital media (until 2011 there were also video and photo competitions). Each year there's a citizenship theme, besides competitions for business innovation and game development. With this kind of theme, the competition stimulates students to find solutions for many world problems: health care, education and the environment are some of them.

The project must be developed by high school, college, grad or post-grad students. The students are completely responsible for their projects and they own them (Microsoft doesn't own anything after the competition - many teams, after the competition, become new startups with great success in marketing their projects). Click here to continue reading the full article

SQL Server 2014 Performance Enhancements


Editor’s note: In partnership with Microsoft Press, now celebrating their 30th year, MVPs have been contributing to an ongoing guest series on their official team blog. Today’s article is from SQL Server MVP Shehap El Nagar, which is the 41st in the series. 

SQL Server 2014 Performance Enhancements

Database performance has become an important subject for any database administrator, database analyst and IT director. That is why Microsoft has focused on the performance factor in SQL Server 2014, achieving 10x-30x improvement without touching your code whatsoever. They did that through In-Memory OLTP with no page buffer: you can create a memory-optimized table and add it to a memory-optimized filegroup, which is translated into .dll entry points. Moreover, its data is more resilient; in other words, the data remains intact after a server crash, as it is also resident on disk. Awesome! I don't have enough words to say how significantly it improved an OLTP transaction that inserts and deletes 1 million records: from 32 seconds to just 2 seconds! In addition, you no longer need to worry about heavy delete processes used for archiving, because records are not actually deleted but just marked as deleted, and a garbage collection process takes place asynchronously to clean up the deleted data without affecting live transactions. Additionally, no changes are needed from developers at the code level, contrary to one of the major myths about In-Memory OLTP technology.

This dream of performance was achievable through In-Memory OLTP and other important projects targeted at rebuilding the architecture of the SQL engine…. Read the full article here 

 

About the author

Shehap El Nagar is an MVP, MCTS and MCITP for SQL Server and the founder of SQL Server Performance Tuning, the largest SQL community in the Middle East, along with its Facebook group of more than 8000 members. He is currently the database manager for the Ministry of Higher Education of KSA and a well-known independent SQL Server expert in the Middle East. Shehap has more than 60 blog articles in English and Arabic that touch on trickier SQL Server subjects. He is also the first SQL author on MSDN Arabia and a frequent speaker at SQL Saturday events worldwide, in Peru, Italy and Australia, as well as at other local events. He has more than 90 video tutorials, has given many private sessions for .NET developers and database administrators, and is an active participant in the Microsoft TechNet Forums for SQL Server.


 

 

MVP Open Days - Argentina


Editor's note:  The following post was written by Community Program Manager Erika Vilches

Best practices, tips and tricks, and deep-dive technical discussions were the focal point of the 2014 Argentina Open Days event held in Buenos Aires. At this event, both MVPs and influencers get to spend a day inside a Microsoft subsidiary's core, having the opportunity to meet other experts working with the same products and share their passion and knowledge with the communities. They also have the opportunity to talk with and get to know the Microsoft employees who drive the product adoption strategies in their country, giving both the opportunity to work together in the near future towards a common goal: helping the technology communities and end users get the most out of their Microsoft products. 

MVPs from around South America presented on a variety of topics, from ASP.NET MVP Gonzalo Eduardo Pérez's "Tips and Tricks - Technical Community Recognition" to Windows Expert - IT Pro MVP Daniel Rodrigo Vargas Imbachi's "Train the Speaker" and Software Packaging, Deployment & Servicing MVP Jesús Octavio Rodríguez de Santiago's "Using our Experience and Knowledge to Help Non-Profits".

"It's an excellent opportunity to strengthen our network of Microsoft technology professionals, and the chance to interact with our MVP lead and local DPE team about how we can build and enhance our community," said Windows Phone Development MVP Ivan Toledo.

Felicitaciones to the organizers and MVPs who participated.  Looking forward to seeing you all soon!

2014 MVP Excel Summit - The Netherlands


Editor's note:  The following post was written by Business Program Manager William Jansen

Microsoft Excel. Just a spreadsheet application? Simple graphing tools? Pivot tables or a series of grids or cells?  I think not, and I believe the Microsoft Excel MVPs would agree!  Power Queries, PowerView and Excel’s Function Library were just a few of the topics delivered by and for the Excel MVPs during the 2014 Excel MVP Summit that took place on May 15th and 16th at Microsoft Netherlands. 

"We have all contributed to share our knowledge and we learned a lot about these exchanges," said Excel MVP Jan Karel Pieterse.

Excel MVPs from as far away as Australia, Canada, USA and the planet Zortran (no, not really) flew in for this Excellent event! Closer to home, MVPs from Belgium, Bulgaria, France, Malta, Netherlands and the United Kingdom were here for two-days of Excel fun and adventure!

Kicking off Day One was Canadian MVP Ken Puls. Ken delivered a gripping session on ‘Power Query to Power Pivot: Building BI Solutions from Text Files.’ His session was followed by UK MVP Charles Williams, who presented ‘Extending Excel’s Function Library.’ The post-lunch sessions kicked off with Dutch MVP Jan Karel Pieterse presenting ‘Creating a Treeview in Excel.’ Day One finished with ‘Dictator Applications; Problems and Solutions,’ presented by UK MVP Bill Manville.

"It was great to meet some of the MVPs I probably wouldn't have had a chance to meet otherwise, and I picked up some useful information as well," said Excel MVP Liam Bastick.

Day Two sessions started with ‘Building and Selling Excel Add-ins’ by MVP Charles Williams, followed by Senior PM Sam Radakovitz’s session on ‘Excel Team Investments.’ After another satisfying lunch, Ken Puls retook the stage with ‘Date Intelligent Measures, Building a PowerPivot Time Machine,’ and finally it was up to Frédéric Le Guen, whose idea sparked the event, to finish up with ‘PowerView and PowerQuery.’

 

 

Aligning Skills with Real World Business Benefits


Editor’s note: The following post was written by SharePoint MVP Steve Smith

This article deals with the subject of ‘Aligning skills with real world business benefits’ and why continued investment in technology skillsets is even more important today than it ever was. We will look at the importance of practical training alongside real world skills, aligning it all with qualifications, and how a company, as well as an individual or team, would benefit in both the short and long term after going through this process.

After many years in the education space, especially around Microsoft products, there is no doubt that the products themselves have evolved into much more complex platforms. The knowledge and skills that we developed in the nineties and early 2000s certainly provided a solid foundation for the core skills needed in today's world. It is also more important than ever to have our end users trained and supported on these products; historically, IT would deploy a product such as email and our users would get little if any formal training on how to use it. As the product range has evolved, so has the benefit to the users and the business - but only if they all know how to get the most from it.

But what if you are fairly new to this brave new world of Microsoft technologies? What skills am I talking about, and why are they still so important?

To read the full post, click here

SQL Server 2014 Backup Encryption


Editor’s note: The following post was written by SQL Server MVP Nicolas Souquet

Introduction

With the very large amounts of data companies store today, databases are very likely to contain critical and confidential data, making database backup encryption an essential database engine feature. It is easily foreseeable that such a feature will soon be required to pass accreditation audits.

Raising the security bar through releases, SQL Server 2014 now makes it possible to encrypt database backups natively. It has been a long-awaited feature, as the options to cover this need were rather cumbersome:

-          Transparent Data Encryption, released with SQL Server 2008, but it implies some CPU overhead, and the backup file compression ratio is then very low;

-          Disk encryption, such as BitLocker, but ciphering large disk volumes is a long operation;

-          Third-party software.

In this paper, we will explain how backup encryption works, before walking through the steps needed to encrypt database backups with T-SQL and with Maintenance Plans. We will finally detail the actions required to restore an encrypted database backup. Backing up to a file and to Azure will both be addressed.

How does backup encryption work?

There is no encryption without keys, and backup encryption is no exception: we first need to create a Database Master Key (DMK) of the master system database, and a certificate. The DMK is a symmetric key, and is unique to each database in each SQL Server instance: we cannot restore an encrypted backup file on a distinct or re-installed SQL Server instance without the master system database DMK (or an encrypted database without its DMK).

The Database Master Key is encrypted by the AES 256 algorithm, using the Service Master Key, which is also a symmetric key. It is encrypted based on the SQL Server service account credentials and the Windows Data Protection API machine key. The Service Master Key is unique per SQL Server instance, and created during SQL Server installation. It is stored in the master system database and in the user database, so as to enable its automatic decryption (cf. sys.symmetric_keys system view).

Finally, a certificate contains a public key that is digitally signed, and may contain a private key. This private key is protected by the DMK. While SQL Server can generate IETF X.509v3-compliant certificates, it also allows the use of certificates generated by third parties (cf. sys.certificates system view).


We can summarize the encryption hierarchy levels with the following diagram:

The backup process works on a data page basis: it copies data pages from the data files into the backup file. Since SQL Server 2008, we can compress database backups. This feature is supported by an algorithm similar to the ones behind file compression software: it factorizes patterns of data found in the data pages. Whether the backup data pages are compressed or not, SQL Server 2014 is able to encrypt these pages with the AES 128, AES 192, AES 256 or Triple DES algorithms.

Encrypting a database backup in T-SQL

In this example, we will use the Contoso demo database. The code needed to take an encrypted database backup follows the steps described in the previous section. First of all, we must back up the service master key:

USE master

GO

 

-- Saving the service master key

-- Ideally, the resulting file should be stored on a distinct,

-- secure machine with restricted access

BACKUP SERVICE MASTER KEY

TO FILE = 'E:\SQLServerBackup\SQL2014_service_master_key.key'

ENCRYPTION BY PASSWORD = 'a$tr0n9#!P@$$w0r2_f0rDBb@ckupEncryption';

GO

 

We then have to create the master key, which again is straightforward. The password used to encrypt the master key can be different from the one used to encrypt the service master key backup file.

-- Creating a new database master key

CREATE MASTER KEY

ENCRYPTION BY PASSWORD = 'a$tr0n9#!P@$$w0r2_f0rDBb@ckupEncryption';

GO

 

-- Saving the database master key

-- Ideally, the resulting file should be stored on a distinct,

-- secure machine with restricted access

BACKUP MASTER KEY

TO FILE = 'E:\SQLServerBackup\SQL2014_Contoso_master_key.key'

ENCRYPTION BY PASSWORD = 'a$tr0n9#!P@$$w0r2_f0rDBb@ckupEncryption';

GO

 

We can confirm the creation of the master key by examining the sys.symmetric_keys system view: it now shows one, named ##MS_DatabaseMasterKey##. Next, we need to create the certificate, which is achieved with a single instruction:

 

USE master

GO

 

-- Creating the certificate

CREATE CERTIFICATE Contoso_BackupEncryptionWithSQLServer2014

WITH SUBJECT = 'SQL Server 2014 Backup Encryption demo with Contoso';

GO

 

Similarly, interrogating the sys.certificates system view in the master database context will now return a supplementary row, and the column pvt_key_encryption_type_desc indicates ENCRYPTED_BY_MASTER_KEY.

 

As with every key, we want to save the certificate: this requires specifying a private key file, which clearly implies that one does not work without the other. Here again, a password is required to encrypt the private key file, and optionally a different one can be defined for the decryption.

 

-- Saving the certificate

-- Ideally, the resulting file should be stored on a distinct,

-- secure machine with restricted access

BACKUP CERTIFICATE Contoso_BackupEncryptionWithSQLServer2014

TO FILE = 'E:\SQLServerBackup\SQL2014_Contoso_certificate.cer'

WITH PRIVATE KEY

        (

                FILE = 'E:\SQLServerBackup\SQL2014_Contoso_certificate_private_key.key'

                , ENCRYPTION BY PASSWORD = 'a$tr0n9#!P@$$w0r2_f0rDBb@ckupEncryption'

        );

GO

 

At this point, the groundwork is laid for taking encrypted backups. We have four files that we need to keep in a secure and distinct storage area:

 

 

Backing up a database and ciphering the resulting file only requires us to choose an encryption algorithm and to specify the certificate we want to use:

-- Backing up the ContosoRetailDW demo database, with encryption

BACKUP DATABASE ContosoRetailDW

TO DISK = 'E:\SQLServerBackup\ContosoRetailDW\ContosoRetailDW_FULL_ENCRYPTED.bak'

WITH INIT, CHECKSUM, COMPRESSION, STATS = 1

        , ENCRYPTION

(

ALGORITHM = AES_256

, SERVER CERTIFICATE = Contoso_BackupEncryptionWithSQLServer2014

)

 

Restoring a database from an encrypted backup file

 

Before we walk through the restoration process, we must keep in mind that SQL Server 2014 introduces native backup encryption. Thus, restoring a database from a natively encrypted backup file on a version of SQL Server earlier than SQL Server 2014 is not supported.

Restoring the database on the same SQL Server 2014 instance

Restoring a database from an encrypted backup file on the same SQL Server 2014 instance as the one on which its backup was taken works as usual: all the keys and the certificate are already registered in the master database. Consequently, they are opened automatically when needed for decryption.

RESTORE DATABASE ContosoRetailDW_RestoredFromEncryptedBackupFile

FROM DISK = 'E:\SQLServerBackup\ContosoRetailDW\ContosoRetailDW_FULL_ENCRYPTED.bak'

WITH MOVE 'ContosoRetailDW2.0' TO 'E:\Contoso\ContosoRetailDW_EncrypteBackup_data.mdf'

, MOVE 'ContosoRetailDW2.0_log' TO 'E:\Contoso\ContosoRetailDW_EncrypteBackup_log.ldf'

, STATS = 1

 

Restoring a database from an encrypted backup file on another SQL Server 2014 instance

This operation requires us to:

  • Restore the Database Master Key from its backup file
  • Create the Certificate from its backup file, which involves the private key file.

The restoration process is as follows:

USE master

GO

 

-- Restoring the master key on the target SQL Server instance

-- from its backup file

RESTORE MASTER KEY

FROM FILE = 'E:\SQLServerBackup\SQL2014_Contoso_master_key.key'

-- the password that was used to encrypt the master key in the source SQL Server instance

DECRYPTION BY PASSWORD = 'a$tr0n9#!P@$$w0r2_f0rDBb@ckupEncryption'

-- the password with which we want to encrypt it on the target SQL Server instance

-- the service key is different on the source and the target SQL Server instances

ENCRYPTION BY PASSWORD = '@n0therStrongP@$$w0r2!';

 

-- Since the master key is not registered in the master database

-- We need to open it in order to decrypt it

-- It stays opened for the session duration

OPEN MASTER KEY

DECRYPTION BY PASSWORD = '@n0therStrongP@$$w0r2!'

 

-- Restoring the certificate by the private key

-- the password is the one we used to encrypt it on the source SQL Server instance

CREATE CERTIFICATE Contoso_BackupEncryptionWithSQLServer2014

FROM FILE = 'E:\SQLServerBackup\SQL2014_Contoso_certificate.cer'

WITH PRIVATE KEY

        (

                FILE = 'E:\SQLServerBackup\SQL2014_Contoso_certificate_private_key.key'

                , DECRYPTION BY PASSWORD = 'a$tr0n9#!P@$$w0r2_f0rDBb@ckupEncryption'

        )

 

-- Getting the list of files in the backup file

-- The instruction is identical for a non-encrypted database

RESTORE FILELISTONLY

FROM DISK = 'E:\SQLServerBackup\ContosoRetailDW\ContosoRetailDW_FULL_ENCRYPTED.bak'

 

-- Finally restoring the database

-- The instruction is identical for a non-encrypted database

RESTORE DATABASE ContosoRetailDW

FROM DISK = 'E:\SQLServerBackup\ContosoRetailDW\ContosoRetailDW_FULL_ENCRYPTED.bak'

WITH MOVE 'ContosoRetailDW2.0' TO 'D:\SQL Server\ContosoRetailDW_EncryptBackup_data.mdf'

, MOVE 'ContosoRetailDW2.0_log' TO 'D:\SQL Server\ContosoRetailDW_EncryptBackup_log.ldf'

, STATS = 1

 

-- Closing the master key

CLOSE MASTER KEY

 

When automating restores, we may not want to restore and open the database master key each time.

This is achievable by running the following command, after having restored and opened the master key in the same session:

 

ALTER MASTER KEY REGENERATE

WITH ENCRYPTION BY PASSWORD = 'aN0th€r$tr0n9#!P@$$w0r2_f0rDBb@ckupEncryption'

Taking encrypted backups with Maintenance Plans


The dialog window used to customize maintenance plan backup tasks has been revamped in SQL Server 2014, and reveals the option to take encrypted backups. It lets you pick the encryption algorithm among the four available.

 

However, there is not yet a way to manually create a certificate or an asymmetric key from this interface, or from SQL Server Management Studio Object Explorer: it has to be created following the steps described earlier.

Backing up and restoring in Azure

SQL Server 2014 makes it possible to back up databases to Microsoft's cloud platform, Azure (more exactly, since SQL Server 2012 SP1 CU4). This nice feature now enables companies:

-          to store their backups outside their data center, covering the case in which it would face a complete outage;

-          to save their data in a geographically remote location at a low cost compared to investing in a secondary data center.

Prerequisites

In order to back up a database in Azure, we need to have the following (a PowerShell provisioning sketch follows the list):

-          An Azure subscription

-          Access to Azure by the Azure Portal (or via PowerShell)

-          An Azure Storage Account created

-          A storage container created. In this example, it is named “backup”.
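These prerequisites can also be provisioned from PowerShell with the Azure module of that era; a hedged sketch (subscription name, storage account name and location are placeholders):

# Sign in and select the subscription (2014-era Azure PowerShell cmdlets)
Add-AzureAccount
Select-AzureSubscription -SubscriptionName "MySubscription"

# Create the storage account and the "backup" container
New-AzureStorageAccount -StorageAccountName "accountname" -Location "West Europe"
$key = (Get-AzureStorageKey -StorageAccountName "accountname").Primary
$ctx = New-AzureStorageContext -StorageAccountName "accountname" -StorageAccountKey $key
New-AzureStorageContainer -Name "backup" -Context $ctx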

Instructions to save a database to Azure

As the documentation reveals, instead of specifying the TAPE or DISK clause of the BACKUP instruction, we use URL, which makes the implementation of this feature remarkably simple.

Prior to this, we need to create a Credential: it is a security element that contains a user name and password for a user to access resources external to SQL Server. In our case, the Credential will contain the name of the storage account, and the primary key to this storage account.

In the screenshot below, we find the name of the storage account under the NAME column:

 

Clicking the MANAGE ACCESS KEYS button brings up a popup window, in which we find the storage primary access key. So, to create the credential with which we will gain access to Azure storage, we write:

USE master;

 

CREATE CREDENTIAL SQLServer2014EncryptedBackupInTheCloud

WITH    IDENTITY = 'account_name'

        , SECRET = 'AzureStoragePrimaryAccessKey_ItIsNormallyALongString'

Before issuing the BACKUP instruction, we need to get the URL that we will specify. We can get it by successively clicking on the storage account name, and on the CONTAINERS tab:

 

This is the occasion to confirm the name of the storage container: "backup", which is also visible in the URL.

Finally, the instruction is:

BACKUP DATABASE ContosoRetailDW

TO URL = 'http://account_name.blob.core.windows.net/backup/ContosoRetailDW.bak'

WITH INIT, COMPRESSION, CHECKSUM, STATS = 1

        , CREDENTIAL = 'SQLServer2014EncryptedBackupInTheCloud'

        , ENCRYPTION

(

ALGORITHM = AES_256

, SERVER CERTIFICATE = Contoso_BackupEncryptionWithSQLServer2014

)

 

Once the backup is complete, we can see the file in the container:

 

 

Restoring a database from Azure

Similar to the restoration instruction sequence we have seen for file backups, it is required to have:

-          the database master key and the certificate restored;

-          the credential in place; if necessary, we can recreate it based on the data gathered from the Azure portal website.

The command batch is then:

OPEN MASTER KEY

DECRYPTION BY PASSWORD = '@n0therStrongP@$$w0r2!'

 

RESTORE DATABASE ContosoRetailDW

FROM URL = 'http://account_name.blob.core.windows.net/backup/ContosoRetailDW.bak'

WITH CREDENTIAL = 'SQLServer2014EncryptedBackupInTheCloud'

Performance Considerations

SQL Server 2014 offers four algorithms to encrypt backup files, and ciphering is a CPU-intensive activity. It is also possible that the various algorithms have a different impact on the size of the resulting backup files.

To measure the impact of the various algorithms on CPU usage, we took a Performance Monitor trace with only the Processor / % Processor Time counter, and took a backup of the same database with each of the four algorithms, plus a non-encrypted backup. All backups were compressed. The batch of commands is:

BACKUP DATABASE [ContosoRetailDW]

TO DISK = 'E:\SQLServerBackup\ContosoRetailDW_NO_ENCRYPTION.bak'

WITH INIT, COMPRESSION, CHECKSUM

 

WAITFOR DELAY '00:00:10'

 

BACKUP DATABASE [ContosoRetailDW]

TO DISK = 'E:\SQLServerBackup\ContosoRetailDW_AES_128.bak'

WITH INIT, COMPRESSION, CHECKSUM , ENCRYPTION

(

ALGORITHM = AES_128

, SERVER CERTIFICATE = Contoso_BackupEncryptionWithSQLServer2014

)

 

 

WAITFOR DELAY '00:00:10'

 

BACKUP DATABASE [ContosoRetailDW]

TO DISK = 'E:\SQLServerBackup\ContosoRetailDW_AES_192.bak'

WITH INIT, COMPRESSION, CHECKSUM , ENCRYPTION

(

ALGORITHM = AES_192

, SERVER CERTIFICATE = Contoso_BackupEncryptionWithSQLServer2014

)

 

WAITFOR DELAY '00:00:10'

 

BACKUP DATABASE [ContosoRetailDW]

TO DISK = 'E:\SQLServerBackup\ContosoRetailDW_AES_256.bak'

WITH INIT, COMPRESSION, CHECKSUM , ENCRYPTION

(

ALGORITHM = AES_256

, SERVER CERTIFICATE = Contoso_BackupEncryptionWithSQLServer2014

)

 

WAITFOR DELAY '00:00:10'

 

BACKUP DATABASE [ContosoRetailDW]

TO DISK = 'E:\SQLServerBackup\ContosoRetailDW_TRIPLE_DES_3KEY.bak'

WITH INIT, COMPRESSION, CHECKSUM , ENCRYPTION

(

ALGORITHM = TRIPLE_DES_3KEY

, SERVER CERTIFICATE = Contoso_BackupEncryptionWithSQLServer2014

)

 

We wait 10 seconds between backups so that the CPU usage shows distinct bumps on the graph for each backup execution. Below is the commented Perfmon trace graph. It shows that while the AES algorithms have little impact on CPU usage compared to each other and to a backup taken with no encryption, the TRIPLE DES algorithm uses 25-30% more CPU than the AES ones.

 

 

After having integrated the Perfmon trace file into a table, and associated the CPU usage with the encryption algorithm used based on the time, we computed the median of the CPU usage, which produces the result below:
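If the trace was captured to a .blg file, the import and a quick median can also be computed in PowerShell; a hedged sketch (the file path is a placeholder, and the per-algorithm grouping by time window is left out for brevity):

# Load the Perfmon trace and keep only the % Processor Time samples
$samples = Import-Counter -Path "C:\PerfLogs\BackupEncryption.blg" |
    ForEach-Object { $_.CounterSamples } |
    Where-Object { $_.Path -like "*% Processor Time*" }

# Median CPU usage over the capture
$values = $samples.CookedValue | Sort-Object
$median = $values[[int][math]::Floor(($values.Count - 1) / 2)]
"Median CPU: {0:N1} %" -f $median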

 

 

 

 

 

Finally, we can look at the backup duration and the backup file size. A few columns have been added to the system tables in the msdb system database to expose backup encryption:

-          dbo.backupset

  • key_algorithm
  • encryptor_thumbprint
  • encryptor_type

-          dbo.backupmediaset

  • is_encrypted

Below is a query that retrieves the backup history, specifying the columns from the dbo.backupset table listed above:

;WITH

        CTE AS

        (

                SELECT  database_name

                        , last_backup_start_date_time

                        , physical_device_name

                        , backup_size / (1024 * 1024) AS backup_size_MB

                        , compressed_backup_size / (1024 * 1024)  AS compressed_backup_size_MB

                        , CAST(backup_size / compressed_backup_size AS decimal(5,2)) AS compression_ratio

                        , key_algorithm

                        , encryptor_thumbprint

                        , encryptor_type

                FROM    (

                                SELECT          S.database_name

                                                , MAX(S.backup_start_date) AS last_backup_start_date_time

                                                , MF.physical_device_name

                                                , S.backup_size

                                                , S.compressed_backup_size

                                                , S.key_algorithm

                                                , S.encryptor_thumbprint

                                                , S.encryptor_type

                                FROM            msdb.dbo.backupset AS S

                                INNER JOIN      msdb.dbo.backupmediafamily AS MF

                                                        ON S.media_set_id = MF.media_set_id

                                GROUP BY                S.database_name

                                                , S.type

                                                , MF.physical_device_name

                                                , S.backup_size

                                                , S.compressed_backup_size

                                                , S.key_algorithm

                                                , S.encryptor_thumbprint

                                                , S.encryptor_type

                        ) AS BH

        )

SELECT          D.name AS database_name

                , C.last_backup_start_date_time

                , C.physical_device_name

                , CAST(C.backup_size_MB AS decimal(15,2)) AS backup_size_MB

                , CAST(C.compressed_backup_size_MB AS decimal(15,2)) AS compressed_backup_size_MB

                , compression_ratio

                , C.key_algorithm

                , C.encryptor_thumbprint

                , C.encryptor_type

FROM            sys.databases AS D

INNER JOIN      CTE AS C

                        ON D.name = C.database_name

ORDER BY                C.last_backup_start_date_time

 

 

We obtain the following result, which clearly demonstrates that the encryption has no impact on the compression of the backup file size:

 

 

About the author

Nicolas Souquet is a Bangkok-based SQL Server database architect, writer and speaker. He has centered his career on SQL Server database modeling and performance tuning. With 10 years of industry experience, he has worked on all versions of SQL Server, ranging from 2000 to 2014, and has been continuously involved in several data-intensive OLTP environments. He was awarded SQL Server MVP in 2009, 2012 and 2013 for publishing articles on his blog (http://blog.developpez.com/elsuket/) and leading SQL Server conferences in Bangkok. He is now gaining interest in Business Intelligence implementations with SQL Server. Finally, he has participated in writing a book on SQL Server 2014 in French with two other SQL Server MVPs, Frédéric Brouard and David Barbarin, which will be released in July 2014.



Friday Five - May 30, 2014


Techorama 2014


Editor's note: The following post was written by Community Program Manager William Jansen

 

Three men and a vision. A vision of a new type of conference. The result: Techorama.

Techorama is the brain-child of MVPs Gill Cleeren, Pieter Gheysens and Kevin DeRudder. These three community-driven techies felt it was time to bring a new view on technology to Belgium. Techorama delivered deep-dive developer sessions, numerous integration scenarios and a wide-range of best practice advice, both from world-renowned speakers and local community rock stars. 

 

Not only did Techorama sell out well ahead of the event, but more than 600 people attended a total of 75 sessions presented by 50 speakers. MVPs Maarten Balliauw and Mike Martin were the dedicated track owners during this two-day event. 

Notably, 32 of the speakers at Techorama were MVPs from around the globe, including:

  • Belgium: Kevin Dockx, Mike Martin, Sam Vanhoutte, Xavier Decoster, Yves Goeleven
  • Canada: Richard Campbell
  • Denmark: Mark Seemann
  • Germany: Christian Wenz, Neno Loje
  • Netherlands: Fons Sonnemans, Marcel de Vries, Maurice de Beijer, Patriek van Dorp
  • Norway: Bjoern Rapp
  • Portugal: Tiago Pascoal
  • Romania: Tiberiu Covaci
  • Spain: David Rodriguez
  • Switzerland: Laurent Bugnion
  • Sweden: Alan Smith, Chris Klug and Magnus Martensson
  • UK: Andy Cross, Richard Fennel, Mark Rendle
  • USA: Doug Finke, Grant Fritchey, Jon Galloway, Lynn Langit, Mike Wood, Mike Taulty, Nik Molnar, Phil Japikse, Wallace McClure

I caught up with the visionaries during the event, and was able to ask them a few questions:

Gill Cleeren, what was the inspiration for creating Techorama? "Having organized Community Day for 7 years in a row, I knew that developers in Belgium needed a community conference," he explained, adding, "Because Microsoft is no longer organizing TechDays in Belgium, a real opportunity presented itself - an event for Belgian developers. An event for new ideas, the opportunity to learn from the masters and of course to connect with peers. This inspired me to start Techorama," he concluded.

Kevin DeRudder, what sets Techorama apart from other community events? "Techorama is a conference organized by three techies, who are deep into the technologies they love," Kevin explained. "We are aware of what developers are doing and where their interests lie. Together with a few other specialized MVPs, we crafted an agenda full of useful sessions, in a way that attendees should be able to apply the information in their jobs," Kevin happily added.

Pieter Gheysens, what was your personal highlight of the event? "It may sound a bit boastful – but at the end of the second day keynote, we received a standing ovation from the attendees!" Pieter enthusiastically told me. "For me, that was a confirmation that we had done a good thing!" 

A full list of speakers can be found here: http://www.techorama.be/speakers/

A full list of sponsors can be found here: www.techorama.be/sponsors

Photos and videos from the event can be found here: www.flickr.com/techorama

 

Plans for next year’s Techorama are already underway; the dates will be May 12-13, 2015.

 

Verbatim feedback from attendees: https://twitter.com/search?q=techorama&src=typd

TechDays - San Francisco


Learn IT today, use IT tomorrow

June 5-6

With more than 200 events under its belt, covering topics such as Windows Server, Exchange Server, IIS, Small Business Server (SBS), automating Windows workstation deployments, server/network security, Active Directory troubleshooting, Active Directory disaster recovery, Windows Server clustering, network troubleshooting, and server virtualization...it's no wonder TechDays has become a West Coast staple in the IT Pro and Dev communities.

"The point is to educate people.  Keep them current with Microsoft products and technologies and make sure they are able to solve whatever business problems they come up with," said MVP and event organizer Doug Spindler.  Following the success of TechEd, MEC and other large-scale IT events, TechDays allows attendees to continue conversations about new technologies and techniques. 

"The reason that we do this is because Microsoft has so much new technology, it's hard for an IT Pro to keep current," said Spindler. "What solutions has Microsoft come up with that IT pros don't know about that will actually solve a business need that they have?"

MVPs play a crucial role in the success of TechDays. Here is a list of the MVPs who will be presenting sessions:

Jason Helmick

Jason is a 25-year IT veteran and Senior Technologist at Concentrated Technology. He’s an avid supporter of the PowerShell community as a board member and CFO of PowerShell.Org, and a Windows PowerShell MVP. He is the author of “Learn Windows IIS in a Month of Lunches”, a contributing author to “PowerShell Deep Dives”, and a columnist for TechNet and TechTarget Magazine and other industry publications.

Jessica Deen

For over 10 years, Jessica has worked as an IT consultant or systems administrator in various corporate and enterprise environments, catering to end users and IT professionals in the San Francisco Bay Area. She is currently a Systems Integration Engineer for SPK and Associates in the Bay Area. Jessica specializes in a wide variety of areas, including managing social engagement via platforms such as forums, blogs, and Twitter, web content delivery, CMS web design, firewalls, DHCP, DNS, and both the Windows client and Apple client. She holds 3 CompTIA certifications and 4 Apple certifications, is a two-time Microsoft Most Valuable Professional for Windows Client and, as of 2013, holds a FEMA certification from the U.S. Department of Homeland Security, which recognizes her leadership and influence abilities during times of crisis and emergency. In her free time she enjoys riding her 2005 Triumph Bonneville T100, bike riding with friends and participating in UFC training at her local gym.

Steve Evans

Steve Evans has been doing DevOps since before the term DevOps was invented. He is a Pluralsight author, five-time Microsoft Most Valuable Professional (MVP), and technical speaker at various industry events. In his spare time he manages a DevOps team at a Silicon Valley biotech focused on improving the lives of cancer patients. For over 15 years Steve has focused on making technology better for businesses by bridging the gap between IT and development teams.

You can follow his technical blog at http://www.LoudSteve.com or find him on Twitter at @TheLoudSteve.

Cliff Galiher

Nestled in the north Rockies, Cliff works with small businesses in western Montana.  He has carved out a reputation for helping the local business community and non-profit sector take advantage of cutting-edge technology while working within the smaller budget that these local organizations have.

Stephen Foskett

Stephen Foskett is an active participant in the world of enterprise information technology, currently focusing on enterprise storage, server virtualization, networking, and cloud computing. He organizes the popular Tech Field Day event series for Gestalt IT and runs Foskett Services. A long-time voice in the storage industry, Stephen has authored numerous articles for industry publications, and is a popular presenter at industry events. His contributions to the enterprise IT community have earned him recognition as both a Microsoft MVP and VMware vExpert. He can be found online at TechFieldDay.com, blog.FoskettS.net, and on Twitter at @SFoskett.

Richard Hicks

Richard Hicks is a network security specialist and Microsoft Most Valuable Professional (MVP) in Forefront protection technologies. He has designed and deployed edge security and remote access solutions for small and mid-sized businesses, military, government, and Fortune 500 companies around the world. With nearly two decades of experience in the information technology field, Richard holds many certifications including MCP, MCSE, MCTS, and MCITP-Enterprise Administrator. In addition, Richard is a contributing author for popular technology web sites TechRepublic.com and ISAserver.org. You can find his blog at http://tmgblog.richardhicks.com/

Darren Mar-Elia

Darren Mar-Elia (aka “GPOGUY”, aka “CGPO”) is President and CTO of SDM Software, Inc. (www.sdmsoftware.com), a Windows management products company focusing on Group Policy solutions. Prior to SDM, he worked at DesktopStandard, which was acquired by Microsoft, as Sr. Director of Product Engineering. Prior to joining DesktopStandard he was CTO for Quest Software’s Windows Management products. Darren has 20+ years of experience in systems and network administration design and architecture. Before Quest, he worked as director of Windows architecture and planning for Charles Schwab & Co., Inc. In that capacity he was technical lead for the company’s Windows NT & 2000 design and migration efforts.

Darren maintains a popular Windows Group Policy resource site at www.gpoguy.com

Darren has been a contributing editor for Windows IT Pro Magazine since 1997. He has written and contributed to 11 different books on Windows including, most recently, the “Windows Group Policy Guide”, published in Summer 2005 by Microsoft Press. Additional titles include “The Definitive Guide to Windows 2000 Administration,” “The Definitive Guide to Windows 2000 Group Policy” and “The Tips & Tricks Guide to Group Policy,” all published online by Realtimepublishers.com.

 

Kirk Munro

Kirk Munro is a Technical Product Manager at Provance Technologies, where he is helping build the next generation of Provance’s flagship IT Asset Management product. He is also a 7-time recipient of the Microsoft Most Valuable Professional (MVP) award for his involvement in the PowerShell community. For the past 8 years, Kirk has focused almost all of his time on PowerShell and PowerShell solutions, including managing popular products such as PowerGUI, PowerWF and PowerSE. It is through this work that he became known as the world’s first self-proclaimed Poshoholic. Outside of work these days Kirk is returning to his software developer roots, learning mobile technologies like Xamarin and Ruby on Rails, and taking courses on Coursera or edX whenever he can make the time to do so.

Aleksandar Nikolic

Aleksandar Nikolic

Aleksandar Nikolic is a Microsoft MVP for Windows PowerShell and one of the earliest adopters of Windows PowerShell. Aleksandar is a frequent speaker at conferences (Microsoft Sinergija, PowerShell Deep Dive, NYC Techstravaganza, KulenDayz) and participates regularly in IT Pro/PowerShell user groups worldwide. He is also a co-founder and editor of the PowerShell Magazine (http://powershellmagazine.com). You can find him on Twitter: https://twitter.com/alexandair

Doug Spindler

Douglas R. Spindler grew up in the San Francisco Bay Area and Silicon Valley. He wrote his first computer program when he was 12 years old, on a Teletype machine with paper tape and a 300 baud modem connected to the University of California at Berkeley. After writing a program too long for tape, he began using punch cards and then cassette tapes on an IBM-PC. Doug attended the University of California and received a degree in Molecular-Cellular and Developmental Biology while still maintaining an interest in computers and technology. While in college Doug became involved in technology user groups and later founded Pacific IT Professionals, or Pac IT Pros. Interest in the Pac IT Pros community continues to grow; Doug’s monthly Pac IT Pros meetings are simulcast live to 9 other IT Pro community groups on the west coast. Microsoft recognized Doug’s work in the IT Pro community and invited him to become a member of Microsoft’s IT Pro Council, which led to him supporting IT Pro communities across the US, South America, Europe and India. Doug is the creator of TechDays, technical training events “where IT Pros learn IT today, and deploy IT tomorrow”. Doug has worked as a technology consultant for the past twenty-five years. He has designed solutions for the US Department of Energy, UCSF Medical Center, Kaiser Permanente, Lawrence National Laboratory, GTE, and several financial institutions. Doug also teaches Microsoft technology classes at a Bay Area college, is an author and lecturer, and on occasion can be found on the local evening TV news broadcast commenting on technology. Doug has earned several Microsoft certifications, including MCSE 2003, MCITP Server 2008, and Microsoft Certified Trainer. Doug is a four-time winner of Microsoft’s prestigious MVP award and was selected by Windows IT Pro Magazine as its first IT Pro Hero. Doug lives in the San Francisco Bay Area with his four kids and 12 Microsoft Servers.

Friday Five - June 6, 2014

Referring to Content Control Using C#


Editor’s note: The following post was written by Visual C# MVP Ming Man Chan

Referring to Content Control using C#

Microsoft Office developers who worked with versions before Office 2010 will be familiar with ActiveX controls. Since Office 2010, Microsoft Office has shipped with Content Controls, which are referenced differently from ActiveX controls. This article will show you how to refer to them.

This article has 3 sections that illustrate how to refer to a Content Control using C#:

  • Create an Office project using the Word template.
  • Add the Content Control to the page.
  • Refer to the Content Control using C# and change the content of the control.

 Create an Office project using the Word template

1. Click on File -> New -> Project....

 

2. Choose Word 2013 Document under the Office/SharePoint templates.

 

3. Select the docx option in the Visual Studio Tools for Office Project Wizard, leave the other settings as default, then click OK.

 

Add the Content Control to the page.

1. Type something in the Word document, for example, “Here is Content Control”.

 

2. Click the Developer tab in the Word document, then click Plain Text Content Control and click below the text you just typed.

 

If you do not see the Developer tab, follow How to: Show the Developer Tab on the Ribbon (http://msdn.microsoft.com/en-us/library/bb608625.aspx).

Refer to the Content Control using C# and change the content of the control

 

1. Click on the Properties tab on the left of Visual Studio.

2. Click on the Events button.

 

3. Click on ContentControlOnExit().

 

4. Type the code below in the ThisDocument_ContentControlOnExit event.

// The ContentControls collection is 1-based, so [1] is the first control in the document.
Microsoft.Office.Interop.Word.ContentControl cc = this.Application.ActiveDocument.ContentControls[1];

// Replace the text displayed inside the control.
cc.Range.Text = "I can run";
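
For orientation, here is a minimal sketch of what the complete handler might look like; the signature follows the VSTO ContentControlOnExit event, and the body is the same two lines as above.

private void ThisDocument_ContentControlOnExit(
    Microsoft.Office.Interop.Word.ContentControl ContentControl, ref bool Cancel)
{
    // ContentControl is the control that was just exited; this example ignores it
    // and addresses the first control in the document instead.
    Microsoft.Office.Interop.Word.ContentControl cc =
        this.Application.ActiveDocument.ContentControls[1];
    cc.Range.Text = "I can run";
}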

 

5. Run the Word application from Visual Studio by pressing F5. Click the Content Control.

 

6. Click outside the Content Control and you will see the effect of your code.

 

Reference:

How to: Add Content Controls to Word Documents (http://msdn.microsoft.com/en-us/library/bb386200.aspx)

 

About the author

 

Ming Man has been a Microsoft MVP since 2006. He is a software development manager for a multinational company. With 25 years of experience in the IT field, he has developed systems using Clipper, COBOL, VB5, VB6, VB.NET, Java and C#. He has been using Visual Studio (.NET) since the Beta back in year 2000. He and his team have developed many projects on the .NET platform, such as SCM and HR applications. He is familiar with the N-Tier design of business applications and is also an expert with database experience in MS SQL, Oracle and AS 400. Additionally, you can read Ming’s blog, Chanmingman’s blog.


Updateable Column Store Indexes in SQL Server 2014


Editor’s note: In partnership with Microsoft Press, now celebrating their 30th year, MVPs have been contributing to an ongoing guest series on their official team blog. Today’s article is from SQL Server MVP Sergio Govoni which is the 42nd in the series. 

Updateable Column Store Indexes in SQL Server 2014

Introduction

Column store indexes were released with SQL Server 2012 to optimize data warehouse workloads that follow a specific pattern: data is loaded through T-SQL scripts or through SSIS packages, perhaps several times a day (the frequency is not important, only that the available data is loaded in the same execution). At the end of the ETL process, data is read with reporting tools. Usually data is written once, then read multiple times.

SQL Server 2012 offered only the non-clustered column store index; like a traditional B-Tree non-clustered index, it was a secondary index. It differs from a traditional B-Tree index, however, because it is based on a columnar structure, while the base table remains organized by row (in a structure called a row store, saved in 8K data pages).

Column store indexes are part of Microsoft In-Memory Technologies: they use the xVelocity engine for data compression optimization, and their implementation is based on a columnar structure like that of PowerPivot and SSAS Tabular. Data in a column store index is organized by column; each memory page stores data from a single column, so each column can be accessed independently. This means the SQL Server Storage Engine will be able to fetch only the columns it needs. In addition, data is highly compressed, so more data will fit in memory and I/O operations can greatly decrease.
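
As a quick illustration of how such an index can be created from managed code, here is a minimal C# sketch (the connection string, table and index names are hypothetical) that issues the SQL Server 2014 DDL through ADO.NET:

using System;
using System.Data.SqlClient;

class CreateColumnStoreIndex
{
    static void Main()
    {
        // Hypothetical connection string; adjust the server and database names.
        const string connStr = @"Server=.\SQL2014;Database=SalesDW;Integrated Security=true";

        // In SQL Server 2014 a clustered column store index replaces the row-store
        // structure and, unlike in SQL Server 2012, remains updateable.
        const string ddl = "CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactSales ON dbo.FactSales;";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(ddl, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
            Console.WriteLine("Column store index created.");
        }
    }
}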

Column store indexes structure

Before talking about the new column store index features in SQL Server 2014, it is important to introduce three keywords: Segment, Row Group and Compression. In a column store index, a segment contains the values for one column of a particular set of rows called a row group. As you can see in the following picture, each red and gray portion is a segment. When you create a column store index, the rows in the table are divided into groups, and each row group contains about 1 million rows (the exact number of rows in each row group is 1,048,576; in other words, there are 2^20 rows in each row group). The column store transforms the internal index organization from row organization to columnar organization, and there will be one segment for each column in each row group. Column store indexes are part of Microsoft In-Memory Technologies in which data is compressed, and compression plays a very important role, so each segment is compressed and stored in a separate LOB.
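
To make those numbers concrete, a back-of-the-envelope calculation (the table size and column count below are invented for illustration) shows how many row groups and segments a table would produce:

using System;

class SegmentMath
{
    static void Main()
    {
        const long RowsPerRowGroup = 1L << 20;   // 2^20 = 1,048,576 rows per row group

        long tableRows = 10000000;               // hypothetical fact table
        int columnCount = 5;

        // Rows are divided into row groups of up to 2^20 rows each;
        // the last row group may be only partially filled.
        long rowGroups = (tableRows + RowsPerRowGroup - 1) / RowsPerRowGroup;

        // One compressed segment per column per row group.
        long segments = rowGroups * columnCount;

        Console.WriteLine($"{rowGroups} row groups, {segments} segments");
        // Prints: 10 row groups, 50 segments
    }
}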

This article does not detail the algorithms used to compress data in a column store index. At any rate, keep in mind that each segment stores the same type of data, so a segment holds homogeneous data; in this scenario the compression algorithms are more efficient than those used to compress table rows, because a row usually contains different types of data. In general, data compression can be implemented using different techniques such as the following (a toy encoding sketch in C# appears after this list):

  • Dictionary Encoding
  • Run-length Encoding
  • Bit Packing
  • Archival Compression (only in SQL Server 2014)
    • It can reduce disk space up to 27%
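
As a toy example of the second technique, the following C# sketch collapses consecutive repeats in a column segment into (value, count) pairs. It only illustrates the idea of run-length encoding; it is not SQL Server's actual algorithm.

using System;
using System.Collections.Generic;

static class RunLengthEncodingDemo
{
    // Collapse consecutive repeats of a value into (value, count) pairs.
    static List<(string Value, int Count)> Encode(IEnumerable<string> values)
    {
        var runs = new List<(string Value, int Count)>();
        foreach (var v in values)
        {
            int last = runs.Count - 1;
            if (last >= 0 && runs[last].Value == v)
                runs[last] = (v, runs[last].Count + 1);
            else
                runs.Add((v, 1));
        }
        return runs;
    }

    static void Main()
    {
        // Homogeneous, repetitive segment data is where this scheme shines.
        var segment = new[] { "IT", "IT", "IT", "HR", "HR", "IT" };
        foreach (var (value, count) in Encode(segment))
            Console.WriteLine($"{value} x {count}");
        // Prints: IT x 3, HR x 2, IT x 1
    }
}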

The techniques used by SQL Server to compress data are undocumented. The following picture shows an example of row groups and segments in a column store index. Continue reading the full article here.

About the author


Sergio Govoni has been a software developer since 1999; in 2000 he received a degree in Computer Science from an Italian university. He has worked for over 14 years at a software house that produces a multi-company ERP on the Win32 platform. Today, at the same company, he is a Program Manager who is constantly involved in several team projects, where he takes care of the architecture and the mission-critical technical details. He has been working with SQL Server since version 7.0 and has deep knowledge of implementation and maintenance of relational databases, performance tuning and problem solving. He also trains people on SQL Server and related technologies. He has been a Microsoft SQL Server MVP since 2010. You can meet him at conferences, SQL Saturdays or Microsoft events. Sergio blogs in both English and Italian. Follow him on Twitter
