The Microsoft MVP Award Program Blog

Build 2013 Recap


Editor’s note: The following post was written by Windows Phone Development MVP Mayur Tendulkar

Just a few days ago Microsoft hosted its annual developer conference, BUILD 2013, in San Francisco, CA. The conference provided a lot of information and insight into what is coming up next, and indeed the future looks bright. But more than anything, a developer working on Microsoft technologies should attend this conference for the early access and the networking. This one was no exception, with more than 5,000 developers attending at the venue and more than 60,000 following along via live streaming. The event was a huge success.

The Day 1 keynote, delivered by Steve Ballmer, was all about “rapid releases”: updating Windows 8 to 8.1 and introducing new features. To support Windows 8.1 development, Microsoft released a public preview of Visual Studio 2013 along with the next version of the .NET Framework, and made all of the Windows 8.1 and Visual Studio 2013 preview bits available for download and evaluation. The keynote was followed by various sessions on building Windows Store apps for Windows 8.1.

 

The Day 2 keynote, delivered by Satya Nadella, was exciting too; it was all about Server and Azure offerings. The key announcements were the General Availability (GA) of Windows Azure Web Sites with a full SLA and enterprise support, and the availability of Windows Azure Mobile Services with a free 20 MB SQL database and 10 free services. Visual Studio 2013 Preview supports Azure development, with built-in Server Explorer browsing of Mobile Services and a new wizard to add push notifications to Windows Store apps. The biggest news in this keynote was auto-scaling on Windows Azure, plus per-minute billing and no billing at all while a virtual machine is in the stopped/shut-down state.


From the keynote to the last session, attendees could feel the energy in the atmosphere. There were queues not just for free goodies like the Surface Pro and the Acer Iconia tablet, but for sessions as well. Hundreds of people stood in a queue outside the room for Scott Hanselman's session, and Scott greeted every one of them in line. He played some fun videos, did push-ups on stage and entertained the crowd as his session got started, and everyone enjoyed it. Anyone working on Microsoft technologies MUST attend Scott Hanselman's session LIVE, in the room. This is where you get to meet nice people like Scott, Anders and others, and learn from them.

Overall, the three-day event was a nice get-together and a chance to learn from the experience of tech pundits. The cool, windy San Francisco weather at that time of year made the event even more enjoyable. I wish the event had been hosted for at least five days, with more content and more fun. But for that, we'll have the next BUILD. :)

Till then, you can find all recorded content from BUILD 2013 here:

http://channel9.msdn.com/Events/Build/2013


Speaker Idol 2013 at TechEd North America


This year TechEd North America was held in New Orleans and featured four full days of sessions, exhibitions, certifications and networking. In all of these activities, MVPs had a prominent presence as presenters and attendees. One strong example of MVP involvement was the Speaker Idol competition, a contest in which speakers gave a five-minute presentation in front of an audience and a panel of judges. Contestants spoke on a wide variety of topics, ranging from ASP.NET to Network Monitor to PowerShell.


ASP.NET/IIS MVP Richard Campbell was the host of the event and Enterprise Security MVP Paula Januszkiewicz was one of the judges. Four of the twelve participants in the contest were MVPs. In the end, Windows Expert-IT Pro MVP Jessica DeVita, from Los Angeles, California, won first place in the competition after delivering a presentation on Office 365 migrations. Her prize will be the opportunity to present during next year's TechEd!

 

(DeVita, Campbell and Januszkiewicz pictured, from left to right.)

“It was a really fun experience,” Jessica exclaimed, adding, “5 minutes really forces you to get to the point. I learned how to connect with my audience and relax, and also learned a few things not to do, such as look back and forth between your screen and the projection screen.”

Jessica credits the MVP Award with creating new opportunities and avenues for her talent. “I've had more opportunities to give technical presentations since receiving the MVP award, and conferences provide the best opportunity to talk with other MVP's and the community as a whole.”

Other Speaker Idol MVP contestants included Virtual Machine MVP Aidan Finn, Dynamics GP MVP John Lowther, and SQL Server MVP Karen Lopez.

 

Congratulations to all the participants, and watch for Jessica DeVita on the TechEd speaking schedule next year!


MVP Featured App: Spell and Speak


Xbox MVP Dave Voyles has released a Windows Phone 8 app entitled Spell and Speak, which teaches literacy to children.

He found inspiration for the app in his girlfriend's 5-year-old son, wanting to create something for the boy's unique perspective that was both educational and simple enough for him to use. He also found it a valuable experience for learning Windows Phone development, which led him to create tutorials along the way and to provide open source code for others to learn from.

He describes the app: “Children begin by selecting one of 26 objects on screen, where each object corresponds to one letter of the alphabet. If the user selects an image of an elephant, the screen advances to a picture of the elephant along with a brief description, and then prompts the user to write the name of the object they see on screen.

If the user's text matches that of the object on screen, they're greeted with confetti falling from the top of the screen and the cheering of children in the background.”

While fairly new to the MVP community, Dave was inspired to create his app by another MVP:

“I watched Windows Phone MVP Nick Landry give his talk "What's new in Windows Phone 8 for developers" at a .NET developers meetup at an NYC movie theater, and it sparked my interest. I had just finished writing a book, UnrealScript Game Programming, which used a completely different programming language, and wanted to get back into creating something. Rather than focus on games, which was my previous forte, I wanted to craft an app.

In addition, I had just interviewed with Microsoft to become a Tech Evangelist, but they later followed up and said they wanted me to have more experience, and with Windows and Windows Phone 8 in particular. This was the perfect opportunity to improve those skills while working on something new.

The app proved beneficial to my career in a number of ways:

I became very familiar with the Speech API for Windows Phone 8, and now use a more advanced version of it for my current job. Additionally, much of the Windows Phone 8 API is similar to Windows 8, so transitioning to Windows 8 development was not too difficult.

These skills are largely what led to my next role: I was hired by Comcast soon after releasing this app, with the job of building apps for various Microsoft technologies. Spell and Speak allowed me to prove that I could create more than just games, and that I wanted to work with the latest mobile technology.”


Dave Voyles, Xbox MVP, is the coordinator for Xbox LIVE's Indie Games Uprisings. He also writes for Armless Octopus, where he works as a Managing Editor, covering Xbox LIVE indie games and the indie gaming scene. He has released one Xbox LIVE game of his own using XNA (Piz-ong), as well as projects using Unity and the Unreal Engine. He's also proficient with HTML5 and JavaScript, and has released Super Rawr-Type, a side-scrolling space shooter for the browser as well as Windows 8. He works full time as a software engineer on the Xbox team at Comcast in Philadelphia. His first book is out now: "The Advanced UnrealScript Programming Cookbook" from Packt Publishing.

Did You Miss This Top MVP Guest Post on eDiscovery in SharePoint 2013?

Editor’s note: Due to the Independence Day holiday in the US, we'd like to pause and celebrate one of our most popular guest posts from the past quarter. The following is a re-post of a guest post by SharePoint MVP Paul Olenick.

Overview of the new eDiscovery features available in SharePoint 2013

As you have no doubt heard, or seen for yourself by now, SharePoint 2013 represents massive leaps forward in many areas of the platform. One such area that doesn’t get a whole lot of attention is eDiscovery. This article is meant to serve as an introduction and overview of SharePoint 2013's new eDiscovery capabilities.

If you work in the legal or eDiscovery space, I don't have to tell you how critical it is to have powerful, accurate and efficient eDiscovery tools. But for those of you who don't know what eDiscovery is, I'll provide a brief overview.

What is eDiscovery?

eDiscovery, or electronic discovery, is the process of discovering (finding) electronically stored information that is relevant to legal matters such as litigation, audits and investigations. Though it is called eDiscovery, the process typically entails more than just the discovery. The main stages of the process are roughly:

1. Discovery – Find the relevant content

2. Preservation – Place content on legal hold to prevent data destruction

3. Collection – Collect and send relevant content to be processed

4. Processing – Prepare files to be loaded into a document review platform

5. Review – Attorneys determine which content will be provided to opposition

6. Production – Provide relevant content to opposition

The SharePoint 2013 eDiscovery functionality focuses on the first three stages.


It should also be pointed out that eDiscovery is an extraordinarily expensive process, the most expensive aspect being the fees associated with attorneys reviewing the electronic content. The better the tools we have for identifying relevant content (and weeding out anything non-essential), the less content will be presented to the lawyers, resulting in cost savings. So those first three steps that SharePoint is involved in are crucial to get right and can represent massive savings.

Now those of you who have worked with SharePoint for a long time may know that SharePoint already included eDiscovery functionality.

A Brief History of eDiscovery in SharePoint

While there were some basic eDiscovery-related features in SharePoint 2007 (such as the ability to place records on hold) a more cohesive eDiscovery story didn’t begin to emerge until the release of SharePoint 2010. With SharePoint 2010, we now had a top-tier search engine (especially for those organizations that implemented FAST Search Server 2010 for SharePoint) to help discover content. Additionally, SharePoint 2010 introduced the concept of placing and managing site-level holds, a mechanism for automatically copying eDiscovery search results to a separate repository for review and an API to develop custom solutions against these features. For more information on how SharePoint 2010 supports eDiscovery, see the following article on TechNet. http://technet.microsoft.com/en-us/library/ff453933(office.14).aspx#How

There were, however, some major limitations in SharePoint 2010’s eDiscovery solution. For example, the features mostly only applied to SharePoint content. And if a hold was placed on a site, it prevented users from continuing to work with the content. This was especially problematic when conducting internal investigations as it would alert those being investigated to the fact that they were under scrutiny. As such, those utilizing these tools in SharePoint have been eager to see what improvements were made in SharePoint 2013 eDiscovery.

Overview of eDiscovery in SharePoint 2013

The SharePoint Server 2013 eDiscovery feature set can be broken into the following components and functional areas.

· eDiscovery site template

· In-place holds (or In-place Preservation)

· eDiscovery Export

· eDiscovery APIs

eDiscovery Site Template

The eDiscovery site template provides a central location to create and manage eDiscovery cases. A case is a SharePoint site template, created as a sub web within the eDiscovery site collection, which supports the process of discovering content across the enterprise, placing legal holds on content, filtering content and exporting it for delivery. The case site template also includes lists and libraries for collaborating on cases and keeping supporting content organized and centrally located.

In-Place Holds

An in-place hold is a mechanism for placing content (SharePoint 2013 documents, list items, pages, and Exchange Server 2013 mailboxes) on legal hold while allowing users to continue working with the content and without them being made aware of the hold. If a user edits or deletes content that has been placed on in-place hold, the content is automatically moved to a special location thus preserving the state of the content as it was at the time the hold was placed. This design decision, to only replicate data when a change has been made, limits the amount of storage needed to preserve content in its original state.

In-place holds can be placed either at the site or mailbox level, or alternatively, you can use query-based preservation. With query-based preservation, you can define eDiscovery search queries and only content that matches your query will be preserved.

eDiscovery Export

SharePoint 2013 enables eDiscovery users to export the results of eDiscovery search queries so that they can then be sent for review. The export feature is capable of exporting documents (including versions for SharePoint content), list items and pages, as well as Exchange objects. The export tool also generates reports about the content, logs describing the export and an XML manifest which describes the exported content (including its metadata) in a format that complies with the Electronic Discovery Reference Model (EDRM).

eDiscovery APIs

There are a number of APIs available in SharePoint 2013 that enable customers to develop custom solutions that leverage eDiscovery functionality. I won't go into any level of detail around programmability in this article, but suffice it to say there is a model in place to create custom eDiscovery solutions. For more about SharePoint 2013 eDiscovery programming models visit the following link. http://msdn.microsoft.com/en-us/library/jj163267.aspx#SP15_eDiscoveryInSP_eDiscoveryProgrammingModel

To better illustrate how these tools and features can be used, I’m going to walk you through a typical case lifecycle from the standpoint of an eDiscovery user.

SharePoint 2013 eDiscovery Example / Walk-thru

The high-level steps involved are to create a case, place legal holds, refine and filter content, export content and eventually release any holds and close the case.


Walkthrough Scenario

For this walkthrough, I’m a member of the Litigation Support team at a company called (you guessed it) Contoso. The attorneys let me know that one of our former clients called Jamison is suing us and Contoso must present all relevant data we have to the opposition.

My first task is to create a new site for the Jamison case so I log into our SharePoint 2013 eDiscovery site. I log in using a special user ID that I only use for eDiscovery purposes. This is because in order to discover content across the enterprise, the user doing the searching must have access to everything. For obvious reasons, it is not a good idea to give a normal user account access to everything, so instead I have a separate account that I use just for eDiscovery.

When I first log into the site I see the eDiscovery Center template. This is where I go to manage existing and create new cases. On the default home page, Microsoft includes instructions on how to take advantage of the template.


After clicking “Create New Case”, I’m presented with a “New SharePoint Site” page where I can enter the name, description, URL and permissions for my new case site.

When the site has been created I'm presented with the new case site home page. The site comprises three sections.

1. The top section is used for finding and placing legal holds on content.

2. The bottom portion is used to refine and filter on the content until it is ready to be exported.

3. The left side of the page provides access to supporting lists and libraries for the case.


I'll start by clicking “new item” in the eDiscovery Sets section to create an eDiscovery set. An eDiscovery set comprises a data source (a site, mailbox or other location), optionally a filter/query, and the option of a legal hold. I add the URL of the Jamison project site in the sources area, provide a date range for the filter, select “Enable In-Place Hold” and click “Save.”


On the case home page, the In-Place Hold Status will indicate “Processing” for a time and eventually indicate “On Hold”.

When an in-place hold is set on a site, a special document library called the Preservation Hold Library is added to the site being preserved. After the hold is placed, if a user edits or deletes content in the site, a copy will be placed in the Preservation Hold Library. The hold also prevents anyone from deleting the site itself.


Now that the content is safely on legal hold, I can begin the process of filtering it down to just the content that we are legally required to provide. Remember, the more content that is sent to be processed and reviewed, the more our eDiscovery is going to cost, so it's important that we're able to filter the content effectively. With that in mind, I navigate back to my case home page and click “new item” under Search and Export.

In the New Query Item page, I provide a name for my query and I have the opportunity to add search terms and filters. The Contoso lawyers and those of the opposition have agreed that only items regarding a particular deal number (809E5C95) are relevant and have agreed that the deal number will be the only query term. So I add my query term, click search and preview the items that are returned. I can mouse over the preview items to get more details and can also use the refiners on the left hand side to filter the content down more, but in this case we have exactly what we need already.


Next I click “Export” and am presented with a number of options related to the export. Most notable is that I am able to include all versions of SharePoint content in my export.


Finally, I am given the opportunity to download the actual content from my query or just reports on the contents. In this case I click “Download Results”. The download manager loads and allows me to choose a location for the export.


The download folder includes a number of files including an export summary, a manifest (which includes all items including their metadata in a standard format), reports and logs as well as the actual content.


Where there were multiple versions of a single item, the filenames of the older versions are appended to indicate the version.

Once the case is over, I go back to the case site, click the cog and select “Case Closure”. Closing the case will remove any remaining legal holds associated with the case and prevent anyone from adding additional holds to the case.

That's a very basic walkthrough of how an organization may utilize SharePoint 2013 eDiscovery, and you can see it accomplishes what it is designed to do. But it's not all good news. As with any commercial software release, there are going to be some gaps.

Gaps in the SharePoint 2013 eDiscovery Solution

While in general I'm impressed with the eDiscovery story in SharePoint 2013, there are a few gaps to be aware of before investing in the technology.

First, in-place holds are only for SharePoint 2013 and Exchange 2013 content. But most customers are not on these new platforms yet, so how do you use SharePoint 2013 eDiscovery with content that resides in SharePoint 2010? The answer is that out of the box, you can do everything related to eDiscovery except place holds on that content. So we can search 2010 content from 2013, we can filter it down, export it with all of its versions and generate reports. We just can't place holds.

Second, when a hold is placed on a site and a user edits a document that is being preserved the original version of the document (in its state at the time the hold was placed) gets copied into the Preservation Hold Library. However, if subsequent edits are made to the same document, those additional states of the document are not captured. These types of “continuous” or “rolling” holds are necessary for some customers so it’s important for them to understand this limitation.

Lastly, there is no way out of the box to search past versions of SharePoint content. This makes sense as it would be wildly confusing to see past versions of documents showing up in your normal search results, but would be incredibly useful (even necessary for some customers) for eDiscovery purposes.

Again, there is an API exposed for developing custom SharePoint 2013 eDiscovery solutions, so the platform can certainly be extended to fill these gaps. I have it on good authority that there are partners already looking to provide solutions to these limitations.

As for the version search, this would likely be solved with a custom search connector, and here too SharePoint provides a rich framework for building custom connectors.

Also, not really a gap or limitation, but something to be aware of: the eDiscovery Download Manager requires .NET 4.5 on your client system.

Recap

So, to recap, SharePoint 2013 provides vast improvements in the eDiscovery story which includes a new eDiscovery site template, the ability to place in-place holds on SharePoint 2013 and Exchange 2013 content, an export feature to download reports and content (including versions for SharePoint content) and an API to develop custom eDiscovery solutions. And it all leverages SharePoint 2013 search which is truly a great enterprise search engine combining the best of SharePoint Search and FAST Search Server 2010 for SharePoint.

This really does represent a lot of investment and effort on Microsoft’s part and it shows. I would encourage anyone interested or involved with eDiscovery to evaluate the features. Just keep in mind the gaps mentioned above so that you’re going into it with eyes open and know, depending on your scenario, the overall solution may require some customization.

About the author

Paul Olenick (SharePoint MVP, MSFT V-TSP, MCT) is a Principal Consultant for Arcovis, where he leads SharePoint and Enterprise Search engagements for large organizations across multiple verticals including legal, life sciences, financial, utilities, retail, non-profit, and more. Paul has been dedicated exclusively to SharePoint since 2006 and to FAST Search Server 2010 for SharePoint since its beta release in 2009. He has helped dozens of clients solve business problems by leveraging SharePoint and Enterprise Search, and shares his experiences with the greater community by speaking at events, contributing to books and blogging at http://olenicksharepoint.com. Follow him on Twitter.


 

About MVP Mondays

The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.

Friday Five-July 5, 2013


Build 2013 Recap by Michael Collier


Editor’s note: The following post was written by Windows Azure MVP Michael Collier

This week I was fortunate to attend Microsoft’s BUILD event in San Francisco, CA.  BUILD is a semi-regular conference where Microsoft shares a few details on the future versions of Windows.  BUILD has essentially replaced PDC.

This was the first BUILD (or PDC) event I could attend in person.  I had a great time!  It was exciting to be around so many other passionate technologists all eager to see the future of Windows!

BUILD this year seemed to focus on Windows 8 and Windows Azure.  The keynote on day 1 was pretty much all Windows 8, and the day 2 keynote was pretty much all Windows Azure.  There was no keynote on day 3 – just lots of sessions to learn more about developing or designing apps for Windows 8 and Windows Azure.  There was very little related to Windows Phone 8 – that was kind of surprising to me.  Perhaps Microsoft has other events to focus more on their mobile strategy with Windows Phone 8.


Day 1

Wednesday, June 26th was the first day of BUILD.  It started off with a keynote headlined by Steve Ballmer and complemented by several other Microsoft executives.  This keynote session was all about sharing the refinements Microsoft is making to the upcoming Windows 8.1 release.  Microsoft certainly seems to be trying to address most of the major complaints people had with Windows 8 (yes, the Start button is making a comeback).  In my opinion, there was nothing exceptionally new or exciting coming in Windows 8.1.  Most of the features shown during the BUILD keynote were already previously announced.  There were a few things, like native 3D printing support, that were pretty cool though.  Windows 8.1 certainly looks like a welcome update to Windows 8.  I’m going to be sure to install the Preview and give it a spin!

BUILD/PDC has been known for the attendee goodies, and this year did not disappoint.  Attendees were first treated to an Acer Iconia W3, an 8″ tablet running a full version of Windows 8.  Later during the keynote, Microsoft's Julie Larson-Green returned to the stage to announce that all attendees would also receive a Surface Pro with a Type Cover!  I have a Surface RT already, but am certainly looking forward to playing with the Pro.


Day 2

Day 2 kicked off with a keynote that focused on the new capabilities in Windows Azure.  As a Windows Azure geek, this was what I was really interested in!  Satya Nadella opened the keynote by sharing some of the details on the growth of the Windows Azure platform.  As someone that has been working with Windows Azure since PDC 2008, it’s exciting to see how fast the platform has changed and grown!

I think there were five big announcements on the Windows Azure front:

  1. General Availability of Windows Azure Web Sites and Windows Azure Mobile Services: Web Sites and Mobile Services are probably two of my favorite Windows Azure services.  It’s great to see the work the team has done!
  2. Auto-scaling for Cloud Services, Virtual Machines, and Web Sites: this is a feature people have been asking for since day 1.  Now, Windows Azure finally can “check the box” on auto-scaling.  Right now auto-scaling is limited to scaling based on CPU and storage account queue depth.  I’m hoping that more metrics will be supported in the future.
  3. Service alerting for compute services (Cloud Services, VMs, Web Sites, and Mobile Services):  service administrators can now configure alerts for some of the various metrics collected for their Windows Azure services.  If a service exceeds a threshold (as specified in the rules the administrator creates), an email will be sent to the administrators.
  4. No credit card needed to activate Windows Azure benefits included with MSDN subscriptions: finally!  I can’t count how many times people complained to me about the need to use a credit card to activate Windows Azure benefits already available to them as an MSDN subscriber.  Now that is no longer the case – making it much easier for people to take advantage of those benefits.
  5. Windows Azure Active Directory enhancements: there are some pretty nice tooling updates in Visual Studio 2013 Preview that will make working with Windows Azure AD even easier.  Scott Guthrie showed some pretty slick upcoming features related to very easy SaaS integration with Windows Azure AD – for integrating very easily with services like Box.  Very cool – looking forward to playing with this more soon!

For more information on these, and a few other tasty announcements, be sure to visit Scott Guthrie’s blog and the official Windows Azure team blog.

I spent a few hours hanging out at the Windows Azure booth.  There was A LOT of interest in Windows Azure!  I was there to help people get started with the platform, answer questions, and discuss ideas on how to solve a few problems.  Great conversations!!

Microsoft also threw a pretty nice attendee party at Pier 48.  It was a great way to see some more of San Francisco and hang out with the other BUILD attendees.  There were plenty of drinks to go around, and some unique (as in good) food from local food trucks.


Day 3

The final day.  There were no major announcements this day.  It was all about attending sessions to learn more.  My favorite session was “Building Big: Lessons Learned from Windows Azure Customers” by Mark Simms and Christian Martinez.  They were very honest about how best to handle scenarios where you're targeting massive scale – think Halo 4 or Skype-sized scale.  They shared some great stories and tips!

Going Home

Overall I think BUILD 2013 was a great experience.  Being there in person, instead of watching recordings online, was certainly valuable.  The chance to interact with other BUILD attendees was fantastic as well.  While I had a great time and learned a lot, I'm very much looking forward to going home (as I write this while waiting at SFO for my red-eye flight).

MVP Featured App: Notepad Classic


Visual Studio ALM MVP Robert MacLean has released an app for Windows 8 titled Notepad Classic.


Robert was inspired to create the app primarily as a learning experience: “This app taught me a lot about how to build Windows Store apps, which enabled me to offer technology advice to various communities. There are obvious big learnings, but also smaller nuances of the platform that you learn by doing something more complex than "Hello World."  It is not all about learning new technologies.  I had to learn a lot about some fundamentals in text encodings.”

Notepad Classic also garnered great exposure for Robert, winning the TotalApps competition through Microsoft South Africa.  Notepad Classic was also a finalist in Microsoft and Stack Overflow's competition, Apptivate.  “I didn't build the app to make money or win competitions,” Robert explained, “but as a side effect of wanting to learn and share, it was really great!”

Notepad Classic is a reimagining of Windows Notepad, taking advantage of Windows 8’s features and style.  Notepad Classic also adds additional features such as spell check.

Robert MacLean received strong support from his MVP colleagues.  “My MVP experience helped a lot in the app development.  In particular, I was able to reach out for help to the same communities, forums and websites I go to to help others.”


Nestled in a typical open-plan development office at BBD, you will find Visual Studio ALM MVP Robert MacLean's desk, where this passionate technology specialist is often working on the latest and greatest technology from Microsoft. The desk is often vacant, though, as he can frequently be found presenting, demoing, networking or drinking free beer at conferences, events, user groups and communities. When not at work or at community events, he can be found coding on his many personal projects, which he hopes will bring him fame and fortune, or at least a thank you from someone who has benefited from them, or contributing to Microsoft through his involvement in the Microsoft Rangers. He often shares his thoughts, lessons learned and tools on his website or on Twitter.

MVP Award & Why Communities Rock


Editor’s note:  The following blog post was written by Dynamics CRM MVP Jukka Niiranen

Today I received the following email:

[Image: the MVP Award notification email]

Wow! Quite an honor, I must say. Not so much for the MVP badge itself but for being recognized alongside all of the brilliant minds that have received the Microsoft Dynamics CRM MVP award before me. Thanks especially to fellow MVP Gustaf Westerlund for nominating me for the award! Also, it's nice to notice that all of the sarcastic remarks I tend to make in my posts while explaining the do's and don'ts of the Dynamics CRM product have not permanently angered the folks at Microsoft into putting me on their blacklist ;)

It’s great to receive recognition from the makers of Dynamics CRM of course, but by far the most important thing is the support from all the other members of the Dynamics CRM community. That means anyone who contributes to the discussions on CRM forums, comments on blog posts, sharing of links on social media and all the other activities that help people like you and me to… you know, survive living with this thing we call CRM. In the spirit of award speeches, let me take this moment to ramble on a bit about why these things matter so much.

All the way back in 2005, when I first got exposed to Microsoft CRM (in the pre-Dynamics era) in the role of an ICT specialist evaluating alternative applications to replace an aging yet heavily utilized Lotus Notes-based CRM system for my organization, the one thing that stood out in Microsoft's product was the amount of community-contributed material that was already available at the time. Compared to the world we live in today, it was of course a tiny fraction of the vast resources we've got now, but compared to the other potential CRM vendors on our short list, it was a significant factor that made me push for choosing Microsoft CRM. Knowing that I would be responsible for administering, supporting and customizing the system further once deployed, I naturally wanted to work with a product for which I could find answers not just from the vendor but also from other users and consultants who were sharing their expertise so graciously on the Internet.

After having spent some time learning the ropes and reading through a pile of invaluable blog posts (~100 RSS feeds on my daily Dynamics CRM diet) that had helped me solve the day-to-day problems encountered when trying to mold the CRM system to meet the requirements of the users in a couple of customer organizations, I decided to put up a blog of my own to have a place to share some of the tips I had found useful. Then along came social networks like Twitter, which allowed you to discover even more great experts and content on hashtags like #MSDYNCRM. Eventually I realized there was no way for me to return to the way things were before becoming an active member of the global online community around Dynamics CRM, so the only thing left to do was to push even further and try to make the most of it – even experiment with it, if you like.

The virtuous cycle of communities is truly a powerful force. In exchange for receiving help from complete strangers with no expectation of monetary remuneration, you start to feel compelled to give back in one form or another, to pay it forward. Once you do, you begin to notice that there are others who in turn are benefiting from your actions, which makes the cycle spin faster and faster. All that shared knowledge begins to accumulate into a source for “wisdom of crowds” type phenomena, where you are no longer bound by your own cognitive capabilities; rather, you can tap into the community as an extension of your brain to solve the problems you encounter. It's no cyberpunk fiction, simply the best strategy for an information worker to stay on top of his game today and develop the skills needed tomorrow.

Most of the things I know about Dynamics CRM I have learned from the community surrounding the product. That is why I personally value the MVP Award: in essence it's all about the most important part, the community, not just the application. Therefore, my advice for anyone who's working with Dynamics CRM and is interested in getting more out of their job, as well as getting better at it, is to take the plunge and start contributing to the community. You don't have to be a CRM guru, a superstar developer or even a 24/7 social media geek to be able to add value to this common pool of knowledge and insight that keeps the Dynamics CRM product moving forward and allows all of us to better solve real-life business problems with it, thus eventually helping the world outside the community. All you need to do is proceed along these steps, one rung at a time:

  1. Explore
  2. Learn
  3. Share
  4. Contribute
  5. Rinse & repeat.

Thank you. Let’s keep rockin’ with CRM.


Jukka Niiranen is a Microsoft Dynamics CRM MVP from Finland, having worked with the system since 2005 and in the field of customer relationship management for over 10 years now. Whenever he feels like saying a thing or two about Dynamics CRM, he may post it on his blog, tweet it, save it to his CRM links or post it on Google+.


Friday Five-July 12, 2013


1. A simple jQuery Qunit-based JavaScript Unit Test Project Template

By ASP.NET/IIS MVP John Petersen – @johnvpetersen

2. Some Thoughts about Power BI

By SQL Server MVP Chris Webb – @Technitrain

3. Configuring Jumbo Frames Using PowerShell

By Virtual Machine MVP Aidan Finn – @joe_elway

4. Behavior to hide UI elements when a bound collection is empty

By Windows Phone Development MVP Joost van Schaik – @LocalJoost

5. Create a portfolio backlog hierarchy in Team Foundation Server 2013

By Visual Studio ALM MVP Martin Hinshelwood – @mrhinsh

Excel 2013 Timelines


Editor’s note: The following post was written by Excel MVP Zack Barresse

Let's explore a new feature of Excel 2013 called Timelines. In this blog post we will cover what they are, what you can do with them and how to create your very first timeline. Timelines are one of the best additions to this latest version of Office; once you create one and start using them, you'll never want to use anything else.

What are they?

Timelines are a new addition to Excel 2013. They are a kind of slicer, or visual filter, for dates. These new controls give you a great deal of flexibility when you want to filter a PivotTable by date. Filtering by dates has gotten better with every version, and in 2013 it's easier than ever. Unfortunately, you can only filter a PivotTable with these controls; they will not work on standard tables. There is no special add-in required to use them; they ship out of the box in Excel 2013.

These controls also persist into the Excel Web App (as do slicers), which means that if you view or open your file in SkyDrive you will still be able to use them. This is a great benefit when looking at the continuity of experience from desktop to web app.

You can assign a timeline to a PivotChart as well, as it’s based on a pivot (data) cache.

What do they do?

Quite simply – they filter. Timelines are the best date filters you've ever used. Traditionally there has been date filtering (i.e. clicking the filter drop-down arrow on a PivotTable to filter by year, quarter, month, day, etc.), and like slicers, timelines are great visual representations of those filtering capabilities, allowing you to easily see what date range has been filtered, as well as adding a nice aesthetic to your worksheet.

Creating a Timeline

For the remainder of this post I'm going to be using one of the many beautiful templates Microsoft has to offer (for free), the Budget for fundraiser event template, found through Excel (internet connection required) or through Microsoft's office.com. Another reason I'm using this template as my example is that it contains PivotTables and its data source has a date field: the two requirements for implementing timeline controls.


Step 1 – Start with a PivotTable
If you've downloaded the template used in this post, you'll probably notice the PivotTables (there are two on the EVENT OVERVIEW sheet) only have the DATE field in the FILTERS area. If you click the filter drop-down arrow you will see a list of all the unique items in that field in the data source.


This is a pretty standard autofilter. If you don’t want to use the template in this example, ensure your PivotTable data source has a date field and you’ll be good to go.

NOTE: All filtering you can achieve natively (as you've always done) can be done through a timeline, with one caveat: you cannot select a non-contiguous date range. You can only select a start and an end date to filter for. For non-contiguous date ranges you must use manual filtering and check the ‘Select Multiple Items’ box.

 

Step 2 – Insert a Timeline
Select any cell in a PivotTable and the PIVOTTABLE TOOLS ribbon tab appears. If you're not familiar with this, it's called a contextual tab; it won't be visible on the ribbon unless you've selected its object. Other contextual tabs are shown for tables, charts, slicers, timelines, etc.


In the Filters group of this tab you will see three controls: Insert Slicer, Insert Timeline, and Filter Connections. Slicers were introduced in Excel 2010 for PivotTables only; in Excel 2013 they are also available for use with tables. Filter Connections lets you assign which slicers apply to which PivotTables, which can be very handy if you have multiple slicers you want to use to filter multiple PivotTables that share the same data source. For today we'll focus on the Insert Timeline button.

When you click on Insert Timeline you will be presented with a dialog box showing all fields in your data source that contain a date or time value, as analyzed by Excel (no text-formatted dates allowed).

NOTE: While you can add a timeline for a time field, you will still only have the options available for filtering individual days, not any time increments within a day.


Check the box for the field you want a timeline control added for. When you click OK you will see your timeline appear.


As with slicers, when you have a timeline selected you will see a contextual ribbon tab appear giving you additional options to customize your new control.


There are only five groups of controls. As with slicers, Report Connections lets you tie a single filter control to multiple PivotTables that are based on the same source data. It will not work with multiple PivotTables that have different data sources.

 

Step 3 – Customize Your Timeline
There are various parts of these controls you should become familiar with, all of which you can customize to some extent. There are four parts to the physical structure of the control.


You can toggle each part's visibility on or off with the ‘Show’ group controls on the TIMELINE TOOLS contextual ribbon tab.

The standard ‘Arrange’ and ‘Size’ group controls accompany timeline controls as well, letting you move objects backwards and forwards and align controls as you like.

As with slicers, tables and PivotTables, you have a styles gallery group as well. If you want control over every part of how your timeline looks and feels, you have many options for formatting your controls the way you want. As with other style galleries, you have the option of setting any one of the styles as the default for the workbook by right-clicking it and choosing ‘Set As Default’. This way, when you create a new timeline it will have the style you like best already applied.

Using timelines is as easy as point and click. Looking at the control, you will see the filtered range colored, with each end containing a vertical ellipsis that you can click and drag to where you want. In addition, you can click any time segment shown in the control.


Your filtered date range will be shown in the Selection Label area. Too granular or too coarse? Change the Time Level by clicking on it; you get a drop-down to choose from years, quarters, months and days. These are not customizable.


 

VBA

Controlling these with code is possible too. While this is a whole other topic, let's briefly cover the basics. Timelines are in fact a type of slicer and can be handled as such with VBA. The VBA example below requires you to pass it a valid Slicer object. Slicers are objects which can be obtained from a PivotTable (attached to a data cache). To tell a timeline apart from a regular slicer, you can check the slicer's cache type. Here is a simple function to check what type of slicer you're dealing with:

Function TypeOfSlicer(ByVal SlicerCheck As Slicer) As String
'---------------------------------------------------------------------------------------
' Procedure : TypeOfSlicer
' Author : Zack Barresse
' Date : 7/11/2013
' Purpose : Check the type of slicer passed.
'---------------------------------------------------------------------------------------
     On Error Resume Next 
     If SlicerCheck.SlicerCacheType = xlSlicer Then TypeOfSlicer = "Slicer" 
     If SlicerCheck.SlicerCacheType = xlTimeline Then TypeOfSlicer = "Timeline" 
     If TypeOfSlicer = vbNullString Then TypeOfSlicer = "ERROR!" 
     On Error GoTo 0
End Function
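
To try the function out, here is a minimal usage sketch. It only assumes the active workbook contains at least one slicer or timeline; results print to the Immediate window:

Sub ListSlicerTypes()
    ' Walk every slicer cache in the active workbook and
    ' report each attached slicer's type via TypeOfSlicer.
    Dim cache As SlicerCache, sl As Slicer
    For Each cache In ActiveWorkbook.SlicerCaches
        For Each sl In cache.Slicers
            Debug.Print sl.Name & " is a: " & TypeOfSlicer(sl)
        Next sl
    Next cache
End Sub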

 

The SlicerCacheType will give you an enumerated constant of the type of slicer it is. Timelines have their own members in the object library.

Customizing your timeline in VBA boils down to two distinct objects: TimelineState and TimelineViewState. Most methods can be found in these two objects.

TimelineState
This is a child object of the slicer cache. You would use this to set the date range, check the start and end date currently filtered, etc.

TimelineViewState
This is a direct child object of the slicer itself. It gives you options for the control's viewing state, or how it looks on the worksheet.
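
Here is a minimal sketch of both objects in action. The cache name "NativeTimeline_DATE" follows Excel's usual default naming for a timeline on a DATE field, but that is an assumption; verify the actual name in your own workbook:

Sub AdjustTimeline()
    ' Assumes a timeline already exists for the DATE field; the cache
    ' name below is Excel's typical default and may differ for you.
    Dim cache As SlicerCache
    Set cache = ActiveWorkbook.SlicerCaches("NativeTimeline_DATE")

    ' TimelineState (child of the slicer cache): set the filtered range.
    ' Remember, only a contiguous start/end pair can be selected.
    cache.TimelineState.SetFilterDateRange "1/1/2013", "6/30/2013"

    ' TimelineViewState (child of the slicer): control how it appears.
    With cache.Slicers(1).TimelineViewState
        .Level = xlTimelineLevelMonths    ' years, quarters, months or days
    End With
End Sub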

Summary

Timeline controls are visual representations of filters, like a slicer, but specifically designed for dates.  You may copy and paste a timeline control as many times as you would like, and may tie them all to a single PivotTable (data cache).  For example, if you want a separate timeline for years and another for months, it's as easy as copy and paste.  Have multiple PivotTables based on the same data source?  No problem: connect your timeline controls through the same Report Connection, just as you would a slicer (in addition to copy and paste).  The following timelines are all tied to the same PivotTable and filter in unison.  Change one and they all update.
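
For those who prefer to wire up such a connection in code, what the Report Connections dialog does in the UI can also be done by adding a PivotTable to the timeline's cache. A minimal, hedged sketch follows; the cache, sheet and PivotTable names are placeholders for illustration, so substitute the ones in your workbook:

Sub ConnectTimelineToSecondPivot()
    ' Attach an existing timeline's cache to a second PivotTable built
    ' on the same data source, so both filter in unison.
    ' All names below are assumed placeholders; adjust to your workbook.
    Dim cache As SlicerCache
    Set cache = ActiveWorkbook.SlicerCaches("NativeTimeline_DATE")
    cache.PivotTables.AddPivotTable _
        ActiveWorkbook.Worksheets("EVENT OVERVIEW").PivotTables("PivotTable2")
End Sub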


Thanks to the Excel team for creating such beautiful slicer types!

Zack

Zack Barresse lives in Oregon with his family, where he likes to go camping, fishing, playing Xbox and spending time with his family.  His other love is Excel, which he has been using since December 2003.  In 2005 Microsoft recognized him with the MVP award, and he has been awarded every year since.  Zack keeps helping others with Excel through social media and blogs, has been a technical editor of the Missing Manual: Excel series since 2007 and is authoring his first book.  Being self-taught in Excel, he tries to help others wherever he can.  Follow him on Twitter.


TechEd Europe 2013 Recap


Editor’s note: The following post was written by Silverlight MVP Tony Champion


Well, it's the day after TechEd Europe 2013 and I have just finished up a day of sightseeing in Madrid. Let me just say that it is one beautiful city. Being a Texas boy, I can tell you we simply don't have cities with buildings this old and ornate. It is always quite interesting to see the blending of the old and the new. Sometimes it has an amazing effect and sometimes, well, maybe not so much.

The convention center, IFEMA, is a beautiful open-air concept that allows you to get a little fresh air between sessions. As with most TechEds, the facility was large enough that it could easily take 10 to 15 minutes to get from place to place. However, the outdoor strolls made those walks enjoyable enough, even when it heated up during the day. I have to admit, I heard several comments about the heat at mid-day, but it was just another summer day for this Houston boy. At night it cooled off rather nicely, so every morning was spectacular. I do have to wonder about the 20+ ft tall gummy bears that stood guard at the entrance.

I was forewarned about the late dinners in Madrid. It seems that 9 to 10pm is a great time for dinner. This seemed fairly strange, and it can mess with you a bit if you are used to eating several hours earlier in the day. However, I soon found out the sun didn't set until 10pm, so it was really just the end of the day here in Madrid, and that made a lot more sense. Go figure. Food in a different country is always interesting. We found some really fantastic food and some that I can only call “interesting”. Again, being a Texas boy, my limited Spanish is all Mexican Spanish, and it seems there are some differences between Spain and Mexico. Let's just say I learned a lot about the word tortilla and its different implementations in different parts of the world.

Spain has some amazing wines to enjoy, and we might have tried one or two. The thing I found humorous is that almost every place you went into had only one type of beer, so you could simply ask for a “cerveza” and be OK. That's a far cry from some of the large beer gardens we have in the States with over 100 types of beer on tap.

Overall, I truly enjoyed my trip to Spain and appreciate all of the patience shown to us non-Spanish speaking geeks. But this is a recap on TechEd and not a travel post. So on to the geeky stuff.

TechEd always has around a 75/25 split of IT and developers. It’s arguable that this year it was a little lighter on the developer side. However, between all of the great things going on with Azure, the Windows Phone platform, and the release of the Windows 8.1 preview at Build this week, there was a lot of excitement in the developer community. That’s not to say there wasn’t any on the IT side, but most of my interactions were with the developer community.

There was nothing new released this week, and I’ll save the new Windows 8.1 stuff for future posts. The attendees were given a summary of some of the new 8.1 features from Joe Stegman, Group Program Manager for the Windows UI Platform team. In fact, Joe hung around Ask the Experts and gave some great insight on some of the decisions and challenges that the UI team faced when dealing with feature requests.

I was asked at TechEd North America, and again this week, about the best way to get the most out of your TechEd experience. I think the answer is simple: take advantage of the conversations. Now, as a speaker, of course I want you to come to my session. In fact, if you want to stop by just to give me a glowing evaluation, that is OK too. :) But the real advantage of a conference is the people. How often do you get a chance to meet up with fellow developers from all over the world? When do you get a chance to talk with people from the team that creates your favorite product? How often can you open up your machine and troubleshoot whatever problem you are facing with some of the most experienced people in the industry? Whatever you do, don't just spend your time in the sessions; ask the presenters questions, go to the Expo, and attend events like Ask the Experts. If sessions are all you are looking to get out of a conference, then you can watch them online the following week.

And since I'm on my soapbox on the subject, let me give you one little bit of insight. The presenters, product team members, and staff spend an enormous amount of time prepping for these things. As a presenter, my first priority is to deliver comprehensive sessions on my topics. However, the thing I enjoy most is getting to have conversations with other developers on those, and quite frankly any, topics. I like to see how people are using the tools, and I always learn about new things and new challenges facing developers. The product team members like to hear feedback on their products, even the stuff that isn't the most flattering. They enjoy hearing about how people are using their products and what challenges they are facing. Believe it or not, this feedback helps to guide the direction of their products.

So get out there and jump into the conversation. It’s always a good time. It was a blast getting to be a part of TechEd Europe and I hope I get the opportunity to do it again. I have a few more conferences coming up this year, so I hope to see you all around and I look forward to speaking with each and every one of you.

Now it’s time to enjoy my last evening in Madrid. Buenas noches….


About the author


Tony Champion is a software architect with over 16 years of experience developing with Microsoft technologies. As the president of Champion DS and its lead software architect, he remains active in the latest trends and technologies, creating custom solutions on Microsoft platforms. Tony is an active participant in the community as a Microsoft MVP, international speaker, published author, and blogger. He focuses on a wide range of technologies from web solutions such as HTML5, JavaScript, and Silverlight to client platforms for Windows Phone and Windows 8. He can be found on his blog and on Twitter.

MVP Featured App: Branch Info Team Explorer (BITE) Extension


If you use the new Visual Studio Extensions for Git, you'll know exactly which branch you're working on. However, if you use Team Foundation Version Control (TFVC) you can't easily see which branch your open solution is on. With this small Team Explorer extension on the VS Gallery by Visual Studio ALM MVP Colin Dembovsky, now you can! Not only will the extension show you which branch you're working on, it will also let you instantly switch to the same solution on another branch.

Once you've installed the extension, you'll see a new link under the Pending Changes section.


Click on “Branch Info” to see the extension in action.


You'll be able to see all the branches for the current solution in the "Other Branches" dropdown. Simply click "Switch" to open the solution from a different branch.

“Being an MVP is great. Interacting with other global ALM MVPs and the MS TFS and Visual Studio Product Team on a daily basis for news, help and discussion is a huge help to me, and ultimately to my customers,” says Colin about being an MVP. “The annual MVP Summit in Seattle is also one of my favorite conferences!”

TechEd Europe 2013 – 40 EMEA MVPs as speakers


Editor’s note: The following post was written by Community Program Manager Cristina Gonzales Herrero.  Photo by Windows Azure MVP Rainer Stropek.

(Photo: the TechEd sign, of course a tile.)

TechEd Europe was once again a great success.  This year the event took place in Madrid for the first time, attracting 5,129 registered participants. The MVP Community had a very active role in the event in a various number of fields, providing support and feedback while driving excitement around Microsoft technologies.

There were 69 MVPs (40 of them from EMEA) who delivered an estimated 174 breakout sessions (out of 357) and an estimated 38 Hands-On Labs. MVPs also delivered 8 of the 13 Pre-Conference Seminars. The Expo Hall featured an area dedicated to our Information Worker, Server and Tools, and Windows technologies, where an estimated 30 MVPs fielded visitors’ technical questions as Ask-The-Experts.

The community benefited from a dedicated area, the Microsoft Community Lounge, where different Microsoft initiatives oriented toward its influencers were presented. Some of those initiatives included the Imagine Cup, the Microsoft Technical Communities Program, The Scripting Guy, PASS and, of course, the MVP Program. The MVP Program hosted a booth that served as a meeting point for MVPs and community members. One local MVP, Pep Lluis Baño, gave demos with the help of his .NET Micro Framework-programmed robots. MVPs also had the opportunity to autograph their pictures on the “Wall-of-Fame” and meet our Community Program Manager Cristina Gonzales Herrero and EMEA Regional Manager Alessandro Teglia.

MVPs had the chance to connect with Microsoft product group members present at the event through two NDA technology roundtables, conducted by Christa Anderson and Brian Keller. They were also invited to a number of other MVP-exclusive events, such as the MVP Gathering on Thursday evening, which was a great success: 70 MVPs and 8 Microsoft employees attended, on a day when several other exclusive events were also taking place.

The event was also a highlight in the social media space. The @teched_europe Twitter account gathered 6,391 followers and posted 2,911 messages, while the event hashtag, #TEE13, became a trending topic during the week.

Friday Five-July 19, 2013


1. Resize a RichTextBox to fit its contents in C#

By Visual Basic MVP Rod Stephens

2. ADFS 2.1 Mex Endpoint Errors with CRM 2011 & Windows Server 2012. Here's your fix.

By CRM MVP Christopher Cognetta – @ccognetta

3. Python Tools for Visual Studio 2

By Windows Azure MVP Jan Hentschel – @Horizon_Net

4. Setup Multiple Search Pages & Result Sources (Search Scopes) for a Site Search in SharePoint 2013

By SharePoint MVP Brandon Atkinson

5. An MVP By Any Other Name

By Windows Server for Small and Medium Business MVP Tim Barrett – @timbarrett

Excel 2013–Real World Examples of New Functions


Editor’s note: The following post was written by Excel MVP Ben Currier

Excel 2013 – Real World Examples of New Functions

Microsoft included over 50 new functions in Excel 2013, and I want to walk you through a few real-world examples of these new additions. You’ll soon see how handy these extra tools are in your ‘Excel toolbox’. I’ll cover only a select few of the new functions, so feel free to leave a comment with some of your other favorite new functions or features!

FORMULATEXT()

Until Excel 2013, there was a gap in our ability to see and use the formulas in any given cell. There has been the option of Formula Auditing mode (either via the ribbon or Ctrl + `), which lets you view all of the formulas in a spreadsheet. However, this is an all-or-nothing feature that only applies to the visualization of the spreadsheet, switching between displaying all formulas and displaying all results/values. Sometimes you want to track or assess a formula in another cell without having to select it directly or change viewing modes entirely. The FORMULATEXT function provides exactly this ability.

For example, you could very quickly check if two formulas are identical across different spreadsheets, even if their resulting values are not the same. This might be useful if you have a spreadsheet template that you’ve setup and you want to make sure that two versions of it have identical formulas. By comparing the formula text from each of the sheets, you could quickly see if any changes had been made.
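For instance, a single comparison cell can flag any drift between two copies of a template. Here is a minimal sketch, assuming the two versions live on sheets named ‘Week 1’ and ‘Week 2’ as in the example that follows:

=FORMULATEXT('Week 1'!E6)=FORMULATEXT('Week 2'!E6)

This returns TRUE when the two cells contain identical formula text, even if their calculated values differ.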

Let’s say that you keep a weekly record of daily sales for your company, with Monday through Sunday sales being tracked in cells B6:E12 of multiple spreadsheets, as seen here:

[Screenshot: weekly sales tracked in cells B6:E12]

If you were using a third ‘Sum’ spreadsheet to add up the Monday-Sunday totals from multiple sheets, you might want to make sure that the formulas in the cells are the same. Depending on how you’ve designed your workbook, knowing that the formulas haven’t changed could help ensure that the template is working as intended and that any associated formulas would reflect the correct information. In this example I use a ‘Check’ worksheet with formulas to see whether the formula text in each of the order total columns is the same. Here I’ve used the new FORMULATEXT function to compare whether the formulas are the same in column E, and I’ve also put additional FORMULATEXT functions in column G to help illustrate the formulas I’m using to compare the two sheets:

[Screenshot: the ‘Check’ worksheet comparing formula text between the two sheets]

I’m using a direct comparison by putting the = sign between the two sheet/cell references. This comparison results in TRUE because the formula text in Week 1’s cells is exactly the same as in Week 2’s. If I were to delete row 3 from above the table in one of the sheets, the function would give me FALSE on all of them, showing that the formulas no longer match and something has changed in my underlying data sheets. I could go further and add a conditional format to alert me visually when any of my values turn FALSE by using a COUNTIF statement like =COUNTIF(E6:E13,FALSE), which would always show 0 unless a FALSE sprang up in my ‘Check’ sheet. Overall, the addition of the FORMULATEXT function certainly adds to your ability to track formulas in cells and understand changes in your workbooks.

ISFORMULA()

The new ISFORMULA function joins its brethren ISBLANK, ISNONTEXT, ISNUMBER and the other information functions that check what kind of value a cell contains. In this case, the function lets us see whether or not the referenced cell contains a formula. Before this addition, checking whether a cell holds a formula required much more effort and a far more complicated formula. Using the previous example, if we wanted to track whether the weekly totals were formulas vs. something else (like hard-coded values), we could use the ISFORMULA function to check. First, here’s a glimpse of the sheet for Week 1 sales once I’ve changed cell E7 to be a value only. I’ve included the formula text in column G, which comes up with an #N/A value for the total that I changed:

[Screenshot: Week 1 sales after changing cell E7 to a value; its formula text in column G shows #N/A]

Now instead of checking out the formula text in column G, I’ll change it to an ISFORMULA function, so that you can see the result. I’ve conditional formatted the FALSE result so that it stands out. This would be another good way of keeping track of which components of your spreadsheet are formulas vs. hard-coded values (and can help you identify issues with your calculations):

[Screenshot: ISFORMULA results, with the FALSE value conditionally formatted]

Especially with the conditional formatting, the changed cell really jumps out at you, whereas before it might have been much more difficult to realize that E7 had been changed to a value. If you had gone into Tuesday’s sales and updated it from 3 to 4, the order total would not have updated accordingly, and without some kind of formula/error tracking it would likely go unnoticed. If you find yourself having trouble with how errors work in Excel, or want ideas for how to track changes in your data, see my tutorial on Error Checking & Data Monitoring.
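As a quick sketch of the pieces described above (assuming the weekly totals sit in E6:E13, as in the screenshots):

=ISFORMULA('Week 1'!E6)     TRUE while the total is still a formula, FALSE once it is hard-coded
=COUNTIF(E6:E13,FALSE)>0    a conditional-format trigger that fires as soon as any total stops being a formula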

XOR()

Logical functions in Excel work on statements that result in a TRUE or FALSE (aka Boolean) value. The AND function assesses whether all of the logical statements passed to it are TRUE, and the OR function checks whether any of them are TRUE. In Excel 2013, Microsoft has added the XOR function, an ‘Exclusive Or’ function. In the simplest version with two logical statements, XOR returns TRUE if either of the two statements is TRUE, but not both and not neither. When more logical statements are added, XOR returns TRUE if the total number of TRUE inputs is odd, and FALSE if the number of TRUE inputs is even. Here is an example which shows all of the possibilities for three logical statements being fed to the XOR function (I’ve highlighted the TRUE/FALSE values to make it a bit easier to understand):

[Screenshot: XOR results for all combinations of three logical statements]
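To see the odd/even rule in isolation, you can feed XOR literal logical values; typed into any cell, these behave as follows:

=XOR(TRUE,FALSE)         returns TRUE  (one TRUE: odd)
=XOR(TRUE,TRUE)          returns FALSE (two TRUEs: even)
=XOR(TRUE,TRUE,TRUE)     returns TRUE  (three TRUEs: odd)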

It may be a bit difficult to grasp the concept at first, or how it could be applied to a real-life scenario. The term ‘Exclusive Or’ is more frequently heard in programming or computer science in general, but here’s an example of how the XOR function could be used to assess whether an employee is working a half-day using only two logical statements:

[Screenshot: using XOR with two logical statements to flag a half-day]
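As a hedged sketch of that half-day check (the exact ranges in the screenshot may differ; here I assume the two logical statements live in C2 and D2):

=XOR(C2,D2)

This evaluates to TRUE only when exactly one of the two statements is TRUE – in other words, a half-day.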

As you can see, it only shows up as TRUE for working a half-day if exactly one of the two statements in columns C & D is true. There are much more complicated ways you can use XOR, especially in mathematics, but hopefully this gives you a taste of how the function works.

Brush up on your Functions!

I’ve gone through only three of the new functions here, but if any of these piqued your interest, I’d suggest reviewing all of the new functions added in Excel 2013, and I’d certainly suggest taking some time to review the 400+ functions that were already at your disposal before these new ones arrived. Since there are so many aspects of Excel, you might find a hidden gem that you didn’t know existed! For a listing of pre-existing functions, shortcuts, and examples of how to use them, feel free to check out my Master Workbook. Hope you’ve enjoyed the lesson and feel free to leave any comments or questions below.

About the author


Ben Currier has been working in Financial Planning and Analysis for the past 10 years. He also teaches an ongoing free online Excel course at Excel Exposure, with video tutorials and lessons, which aims to help people improve their Excel skills. He is excited about the emerging trends in online education and loves the thought of quality, free, accessible educational information being available to anyone who wants to improve their knowledge and abilities.

About MVP Mondays

The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.


TechEd 2013 North America & Europe Wrap-Up


Editor’s note: The following post was written by Lync MVP Justin Morris


During the first and last weeks of June I had the pleasure of presenting Lync breakout sessions at TechEd 2013 North America and Europe, in New Orleans and Madrid respectively. As Microsoft’s premiere event for IT professionals and developers, TechEd is kind of a big deal when it comes to tech conferences. It’s a great event for in-house IT staff and consultants alike to brush up on the latest and greatest products and meet a few of their peers. The last TechEd I attended was TechEd Australia back in 2008 in Sydney, when I was still living in Australia, so I was excited to get involved as a speaker for the first time and meet people from the community.

North America

First up was TechEd North America in New Orleans. With almost 10,000 attendees, it was a mammoth event in a huge conference space right on the Mississippi River. My Lync Server 2013 Migration and Coexistence session was on the first day, so I had the luxury of getting it done first and could concentrate on other things for the remainder of the week. I was happy with how my session went, and I threw in a few jokes at the start to break the ice a bit. Most of my time afterwards was spent at the Lync booth in the Microsoft Solutions Experience part of the TechExpo. In the booth, we were demoing Lync across a number of different scenarios: the boardroom (where the Lync Room System was on show), the mobile worker (where we had Lync 2013 running on each mobile device to show attendees), and four PCs set up to show how powerful Lync is on the desktop. The booth was the best place to speak to customers from all over North America and hear how they’re using Microsoft Lync, which was a great way to get feedback directly from people using it every day.

The Ask the Experts event on Tuesday night allowed us to answer questions in an informal setting over dinner and a few drinks. It was here that customers brought their Lync issues or design questions for us to help out with and really dive into some detail. The rest of the evenings were jam-packed too, with the Lync Users Group (hosted by Intelepeer, Audiocodes and Plantronics) and the Kemp MCM/MVP meet-up at the very cool Howlin’ Wolf bar all happening on Wednesday night. Thursday night saw the Closing Party at the Superdome. With loads of great local Creole food being served up, live music and football fun to be had, this finished up the event with a bang.

Europe

After a two-week lull, next up was TechEd Europe in sunny Madrid. I delivered my two breakout sessions (on Lync High Availability/Disaster Recovery and Migration/Coexistence) back to back on Tuesday afternoon, the first day, so I was once again able to get everything out of the way early. I had fun with my two sessions and received some great questions from the audience that were interesting to answer.

We had the Lync booth fully kitted out once again, and it was quite a contrast to hear how customers from Germany, Sweden, Denmark, France, Italy and everywhere in between have been using Microsoft Lync, and the success stories and challenges they’ve experienced.

The Country Drinks on Wednesday night were a great way to meet other IT pros from your country. Microsoft had organized a number of venues at the Casa de Campo park, where each European country was represented. After meeting some UK-based folks and eating loads of ham and cheese, we ventured out to some of the other countries’ venues to mix it up a bit. Once again the Ask the Experts event was held in the dining hall, where we could chat informally with attendees and discuss further how they’re using Lync. The conference venue itself was a great place to spend each day: an open-air avenue served as the main thoroughfare between the halls, so you could get some fresh air and sunshine between sessions.

The Windows 8.1 Preview announcement at BUILD was reiterated at TechEd Europe in Jon DeVaan’s keynote, and attendees received a copy on a USB stick so they could upgrade then and there.

I also attended the F5 Networks MCM/MVP meet up at a tapas bar in La Latina, where it was great to meet fellow MVPs (including local Madrid resident, Lync MVP Peter Diaz) and discuss traffic management across a number of products.

June was a whirlwind month of TechEd events for me, and it was excellent to meet lots of customers face to face, catch up with old acquaintances and meet other MVPs from around the world. Both events definitely reinforced for me how valuable TechEd is for everyone that comes along, and I’m looking forward to hopefully being involved again next year.

About the author


Justin Morris has 10 years of enterprise IT infrastructure experience in Australia and the UK, deploying Microsoft identity, messaging and communications platforms. He joined Modality Systems in October 2009 as a consultant designing, deploying and configuring Microsoft Unified Communications solutions in the UK. Prior to joining Modality Systems, Justin built up the Microsoft Unified Communications solution offering for a leading systems integrator in the Asia Pacific region. He specializes in designing and deploying Microsoft Lync solutions in large enterprises of 20,000 seats and more. He is active in the global Microsoft UC community via Twitter, his technical blog, the TechNet Forums and various community user groups and events.

MVP Featured App: Halfwit


Client App Dev MVP Matt Hamilton has created a desktop Twitter client for Windows 8/RT titled Halfwit.   Tweets are presented as a single column, with mentions (even from people you don't follow), direct messages and search results integrated into the stream.  Halfwit's "trial" version is exactly the same as the full version and has no limitations (with a suggested donation if you enjoy using it).

Matt explains his reasons behind creating Halfwit:  “I tried lots of different clients but none had exactly the look and feel I wanted. When Windows 8 was released, it seemed like a great opportunity to learn the WinRT platform by rewriting Halfwit as a Windows Store app.” 

Matt’s experience with the MVP Program has been quite positive in terms of making connections.   “Being an MVP is all about networking - with Microsofties, other MVPs and the broader community. Getting to know experts in relevant fields means I know who to follow to keep abreast of the development technologies I use, and of new ones that I might not have known about otherwise.”

Friday Five-July 26, 2013

Using Power Pivot and Power View for Profit Analysis


Editor’s note: The following post was written by SharePoint MVP John White

Using Power Pivot and Power View for Profit Analysis

Power Pivot and Power View allow end users to quickly analyze corporate data without having to go through a complex data warehouse or cube design up front. In this article we will walk through the process of connecting to and analyzing corporate data. We will be working with the sample Fabrikam Great Plains database from Microsoft. The Fabrikam database ships with Great Plains, but if you want to work with it, and you don’t have GP, you can download it here.

In the past, it would have been necessary to import the data into Excel, use VLOOKUPs to establish relationships between the data, and then use pivot tables and pivot charts to analyze it. We would also be subject to the data size limitations in Excel (65,536 rows in 2003, just over a million in 2007–2013). However, if we bring the data directly into the data model, we can circumvent these limitations, and with SharePoint, we can refresh the data automatically. There are three ways to do this: through the traditional data import mechanism, through Power Pivot’s data import feature, or by using the new Power Query add-in. We’ll cover the first two here.

Getting the Data

Excel Data Import

First, we can use the traditional means of Excel import: on the Data tab, choose From Other Sources and select From SQL Server.

[Screenshot: Data tab – From Other Sources – From SQL Server]

Once you select the server name and authentication, things vary a little from what you may be used to. Once you select your database, be sure to check the “Enable selection of multiple tables” option. You want to select this even if you will be working with a single table, as this is the trigger that tells Excel to use the data model instead of its own storage mechanism.

[Screenshot: the “Enable selection of multiple tables” option in the Data Connection Wizard]

After creating a connection file, you will be prompted for how you want to import the data. Every available option will pull the requested data into the model, including the “Only Create Connection” option. The “Table” option will also pull the data into the worksheet, which in most cases you will not want, as you will be subject to the native Excel limits and you’ll be storing the data twice.

It is possible to edit the connection, and build SQL queries that will help limit, sort, and transform the data, but a simpler method is to use Power Pivot’s data import feature.

Power Pivot Data Import

Although Power Pivot is part of Excel 2013, it isn’t turned on by default. If you don’t already see the Power Pivot tab, you can enable it by navigating to File – Options, selecting Add-Ins, choosing COM Add-ins from the drop-down list, and clicking the Go button. Ensure that the “Microsoft Office PowerPivot for Excel 2013” add-in is selected.

[Screenshot: enabling the PowerPivot COM add-in]

Once enabled, you can click on the Power Pivot tab, and click Manage in the ribbon to open the Power Pivot editor. From there, start by selecting “Get External Data”, “From Database” and “From SQL Server”.

[Screenshot: Get External Data – From Database – From SQL Server]

Here you enter your connection information and click Next. You may then choose to write your own query, but we will choose the “Select from a list of tables…” option. Finally, you’ll be presented with a list of tables from the data source, in this case Great Plains. We’re going to be doing a simple profit analysis, so we’re concerned with the three tables below:

Table      Purpose                    Columns
RM00101    Customer Information       CUSTNMBR, CUSTNAME, ADDRESS1, COUNTRY, CITY, STATE, ZIP
SOP30200   Invoice Header Info        DOCDATE, SOPNUMBE, CUSTNMBR
SOP30300   Invoice Line Item Detail   SOPNUMBE, QUANTITY, UNITCOST, UNITPRICE

We’ll start with the main customer table. GP isn’t known for having intuitive table names, but right away Power Pivot starts to help in this area. By specifying a Friendly Name, you make the data model much more approachable for end users. In this case, we name it “Customers”.

[Screenshot: the Table Import Wizard with Friendly Name “Customers” for RM00101]

Using the Preview & Filter button, we can select precisely the data we want to include in our model, making it much more efficient and less confusing for end users. Here you can see that we have selected CUSTNMBR and CUSTNAME – the other selected columns are off the screen.

[Screenshot: Preview & Filter with CUSTNMBR and CUSTNAME selected]

From this screen you can also sort and filter the data in any column, whether or not it is selected for inclusion, potentially making the model even more efficient. When complete, you are returned to the Table Import dialog. At any point, you can check your selections by clicking on the “Applied filters” link.

At this point, we can go ahead and repeat this process on the following two tables, naming SOP30200 “Headers” and SOP30300 “Line Items”, and selecting only the columns that we need. When complete, we click on Finish, and our data is imported.

[Screenshot: the Table Import Wizard reporting a successful import]

The Fabrikam sample data set isn’t very large, but Power Pivot can handle hundreds of millions of rows of data with almost no degradation in performance, although data load time is affected by volume. Once complete, you will be returned to the Power Pivot window.

[Screenshot: the imported tables in the Power Pivot window]

Edit the Model

One of the first things we want to do is establish the relationships between the tables. The relationships in this case are straightforward: the Line Items table relates back to the Customers table via the Headers table, using the SOPNUMBE and CUSTNMBR columns respectively. The easiest way to do this is to use the diagram view, which can be found in the Home tab on the ribbon.

Once in the diagram view, simply drag the field to relate onto the field it relates to. When done, the relationships should appear as follows. Note that the arrows in the diagram do not point to the related fields as in some other diagramming tools; they only indicate that there is a relationship between the tables.

[Screenshot: the relationships in diagram view]
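Written out explicitly, the two relationships created above are, from the many side to the one side:

Line Items[SOPNUMBE]  →  Headers[SOPNUMBE]
Headers[CUSTNMBR]     →  Customers[CUSTNMBR]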

Once the relationships are established, we can hide fields that we don’t plan on displaying to end users. In this case we can hide all of the columns in the Line Items table (we’ll add new columns shortly) and the SOPNUMBE and CUSTNMBR fields. We do this by right-clicking on the column and selecting “Hide from Client Tools”.

[Screenshot: the right-click menu with “Hide from Client Tools”]

Once this is done, we can return to Data View to create some calculated columns and measures. First, we select the Line Items table. We’re doing a profit analysis, and we have unit price, unit cost, and quantity. The profit is the difference between the extended price and the extended cost, so the first thing we want is the extended price. Click on the cell below the “Add Column” heading to the right of the last column of data. Type the “=” key, then click on the corresponding UNITPRCE cell with your mouse. Then type the * key, click on the QUANTITY field, and hit Enter, and we have our extended price. Right-click on the column header and select “Rename Column”. Rename the column to “Ext Price”.

Repeat this process for Extended Cost, Profit (Ext Price – Ext Cost) and finally Margin (Profit/Ext Cost), renaming each column accordingly.
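Typed into the formula bar rather than built by clicking, the four calculated columns come out as the following DAX (a sketch using the GP column names shown earlier; note that Margin follows this article’s Profit/Ext Cost definition):

Ext Price:  =[UNITPRCE] * [QUANTITY]
Ext Cost:   =[UNITCOST] * [QUANTITY]
Profit:     =[Ext Price] - [Ext Cost]
Margin:     =[Profit] / [Ext Cost]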

The first three columns are in dollars, so our model should reflect that. Select all three columns, and in the ribbon, select Currency as the format. With the Margin column selected, select %.

[Screenshot: currency and percentage formatting applied to the new columns]

Next, we will usually want to work with the total profit or the average profit, and we want to make it easy for end users to do that. We therefore create a calculated measure by selecting a cell immediately below the data in the Profit column. Then click on the AutoSum drop-down in the ribbon and select Sum. Click in the function bar to change the name from “Sum of Profit” to “Total Profit”.

Next, select the cell below that, and repeat the process, but select Average this time, and rename it to “Average Profit”. Finally, since we’ll only be working with the calculated measures, select the Ext Price, Ext Cost, and Profit columns, and hide them from client tools.
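For reference, the two measures that AutoSum generates can also be typed by hand into the calculation area; as DAX they are simply:

Total Profit:=SUM([Profit])
Average Profit:=AVERAGE([Profit])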

[Screenshot: the Total Profit and Average Profit measures in the calculation area]

Next, we want to modify the Customers table a bit, so first we select it, then rename the columns to be more business friendly. Click on the Country column and select the Advanced tab. The Data Category for Country is Country/Region. This is because the model recognized that this field represents country values and flagged it accordingly. It doesn’t always get this right, and a quick inspection should reveal that this has not happened for Address and State. Use the drop-down to flag them as Address and “State or Province” respectively.

[Screenshot: setting the Data Category for the Address and State columns]

We are now ready for our simple Power View report.

Analysis

Select your blank Excel workbook, click on the Insert tab and select Power View. After a moment, the Power View design surface will open. From the “Power View Fields” pane, open up Customers and select Country, then open up Line Items and select Total Profit. The globe icon beside Country indicates that it is geo-locatable, and the calculator icon beside Total Profit indicates that it is a calculated measure.

You will notice a small table has appeared on the design surface showing Country and Total Profit. Increase the size of the table by grabbing one of the corner handles and dragging. Make it the width of the design surface and a little more than half the height.

[Screenshot: the Country and Total Profit table on the design surface]

Finally, from the ribbon, click on Map. You will see the data represented geographically on a map. Now, with the map still selected, drag the State and City fields down under Country in the Locations box. Try double-clicking on one of the countries, and then on one of the states; the map will drill down to the next level. To drill back up, click the little “up” icon at the upper right of the map.

[Screenshot: the map visualization, drilled down by country and state]

Next, click in an unused area of the design surface. This time, select Name (from Customers) and Total Profit (from Line Items). Click twice on the Total Profit header to sort the data from largest to smallest, and resize the table so that it fills the width of the design surface. Finally, from the Design ribbon, select Column Chart – Stacked Column.

[Screenshot: the stacked column chart of Total Profit by customer]

Click on one of the data columns and observe the effect on the map. Then do the reverse: drill down to a state, click on it, and notice how the column chart updates. Everything in Power View is cross-filtered, allowing you to very quickly discover facts about your data.

Sharing

SharePoint supports embedded data models through Excel Services along with Power Pivot for SharePoint. This means that a power user can upload a workbook to a SharePoint library, and a user with a browser can work with it. Power Pivot for SharePoint (on premises) can also refresh the data automatically on a schedule. Office 365 also supports embedded data models, although for the moment it can’t refresh them automatically; Power BI will address this in the near future.

To share the model with other Office 365 users, simply upload it to an online document library. Once uploaded, open it with a browser, and you will be presented with your Power View report.

[Screenshot: the Power View report rendered in the browser]

Summary

Historically, implementing and using Business Intelligence products could be a cumbersome and daunting process. As we’ve seen here, with Power Pivot and Power View it’s possible to get answers from corporate data very quickly. Traditional data warehouses and cubes still have their place and can be put to good use by these new tools, but we no longer need to wait for their complete implementation to realize the benefits of these Business Intelligence tools.

About the author


John P White is the Chief Technical Officer at UnlimitedViz Inc. He holds a Master’s degree in Engineering from the University of Guelph and is a Microsoft SharePoint MVP. He has spent 22 years in the Information Technology space, and possesses a skill set that spans both architecture and development. He has been instrumental in delivering projects and applications that have been recognized with both local and global awards from Microsoft and IBM.

As a seasoned IT professional, John has accumulated a wealth of experience with legacy technologies like Novell, Lotus Notes and Java. This experience has proven invaluable when architecting systems alongside legacy applications. Over the past decade, he has focused on the Microsoft SharePoint and Business Intelligence platforms and has become an expert in the latest that these platforms have to offer.

He is a frequent speaker at user groups and conferences, and he blogs as actively as possible at http://whitepages.unlimitedviz.com. John lives in Guelph, Ontario, and is the father of three boys, an avid scuba diver, and a budding photographer, both under and above the water. Samples of his work can be found at http://www.flickr.com/wpages. Follow him on Twitter.


Imagine Cup: Three Gold Medals, What a Night!


Editor’s note: The following post was written by Windows Azure MVP Rainer Stropek

Oh yes, we won!

What a night! Of course I was hoping that the teams I had the privilege to accompany at Microsoft’s Imagine Cup here in St. Petersburg would win their categories - but I did not dare to dream of winning all of them. However, they really made it. We travel home with three gold medals :-) 

It was my first time at Imagine Cup, and I am deeply impressed. I was invited to come to St. Petersburg as a mentor for two teams of students, one from India (Y’nots) and one from Austria (Zeppelin Studio). Both teams did a great job presenting their projects in front of the judges yesterday. Congratulations!

I am especially proud of the Y’nots team from India, as I consulted for them for many months throughout their entire preparation for Imagine Cup. These guys did a great job. They convinced the jury and won the Connected Planet Award as well as the Windows Azure Challenge. I am an MVP for Windows Azure, so it is something special for me that they won this award.

In my opinion, Microsoft does an awesome job offering all these students a platform for presenting their ideas and inventions. It is not only the glamorous event of the worldwide finals here in St. Petersburg; I honestly have the impression that Microsoft’s activities for helping students learn about IT and implement their ideas are an ongoing commitment.

My involvement in Imagine Cup started months ago in the MVP to MSP (Microsoft Student Partners) mentoring program. I had offered to work as a mentor through a website Microsoft provides for us MVPs. Honestly, I had not expected that a team of students would approach me and invite me to be their mentor – but it happened. Only shortly after I had created my profile, a team from India contacted me and asked if I could help them. We were in regular contact over the last year via email and Skype. Technically, these guys are already brilliant; I couldn’t tell them much new about Azure or software development. However, they asked me for advice regarding their business plan, their ideas, team organization, etc. I was more than happy to help, as it was very interesting for me to work with highly motivated students from a different part of the world.

[Photo: Team Zeppelin Studio at Imagine Cup]

My friends from the Y’nots team and I were happy when we heard that they had made it into the Imagine Cup worldwide finals, which meant we would finally meet in person here in St. Petersburg. As I am the only MVP from Austria attending the finals, Microsoft asked me to coach a team from Austria too (by the way, you have to check out their awesome game Schein). I took this request as an honor and agreed. I tried my very best to support both teams by giving tips on their presentations and by connecting them to other interesting people I know.

You can find more pictures in my Flickr album.

Imagine Cup was an exciting experience for me. These students have amazing ideas, and their power and energy inspire everyone. In their projects you can feel the passion. They dare to dream and do not count every minute or dollar they invest; they just want to learn and build something they can be proud of. Tomorrow I travel home, but the memories of Imagine Cup in St. Petersburg will stay with me for a long time. Being a mentor definitely paid off. Thank you, Microsoft, for facilitating it with the MVP to MSP mentoring program.

Again, congratulations to Zeppelin Studio and the Y'nots.

About the author


Rainer Stropek is co-founder and CEO of the company software architects and has served in this role since 2008. At software architects, Rainer and his team develop the award-winning SaaS time-tracking solution “time cockpit”. Previously, Rainer founded and led two IT consulting firms that developed software solutions based on the Microsoft technology stack. Rainer is recognized as an expert in .NET development, software architecture and database management. He has written numerous books and articles on C#, database development, Windows Azure, WPF, and Silverlight. Additionally, he regularly speaks at conferences, workshops and trainings in Europe and the US. Rainer graduated with honors from the Higher Technical School Leonding (AT) for MIS and holds a BSc (Hons) in Computer Studies from the University of Derby (UK). You can follow Rainer on Twitter and Facebook or read his blogs at http://www.timecockpit.com and http://www.software-architects.com.
