
  • 06/16/14--10:42: Friday Five - June 13, 2014
          By Windows Azure MVP Dennis Burton
     

    4. End to End BizTalk Domain Setup in Windows Azure IaaS Scripts

            By Integration MVP Stephen Thomas

    5. New Features in Total Access Emailer for Microsoft Access 2013 and 2010

            By Access MVP Luke Chung

     

     

  • 06/23/14--09:54: Friday Five - June 20, 2014
  • 1. A Show-All-Or-Nothing Behavior for Windows Universal Apps

              By Client Development MVP Diederik Krols

    2. Hyper-V Amigos Podcast

              By MVPs Carsten Rachfahl, Didier Van Hoye, Hans Vredevoort and Aidan Finn

    3. Use Access Teams in Dynamics CRM 2013

              By Dynamics CRM MVP Adam Vero

    4. Creating Dialogue Windows in Dynamics CRM 2013

              By Dynamics CRM MVP Andrii Butenko

    5.  Windows Phone 8.1 Update Task

              By Client Development MVP Olivier Matis 

     

     



    The following article was written by Microsoft Student Partner Avirup Basu

    The Microsoft Most Valuable Professional (MVP) Award is Microsoft's way of saying thank you to exceptional, independent community leaders who share their passion, technical expertise, and real-world knowledge of Microsoft products with others.  

    The Microsoft MVP Mentor Program matches these MVPs with students from around the world who want to learn how to use Microsoft technologies through social mentoring, guided webinars, one to one mentoring, and much more.
    In this post, I will be discussing my experience as a Microsoft Student Partner (MSP) in the MVP student mentorship program.

    Firstly, I would like to thank Kari Finn, from the MVP Mentor Program, for allowing me this wonderful opportunity. Windows Phone Development MVP Mayur Tendulkar was assigned as my mentor, and I would say I was lucky to develop under him. My interest was Windows Phone, and as Mayur excels in this area, I was matched up with him. Who you are matched with as a mentor depends on the Microsoft technology you choose for the program. During the entire course of my mentorship, I developed two apps. I will say more about those apps later; you can also find more information in the apps section of my website.

    Click here to read the full story.



     

    Today, 1,062 exemplary community leaders around the world were notified that they have received the MVP Award! These individuals were chosen because they have demonstrated their deep commitment to helping others make the most of their technology, voluntarily sharing their passion and real-world knowledge of Microsoft products with the community.

     

    While there are more than 100 million social and technical community members, only a small portion are selected to be recognized as MVPs. Each year, around 4,000 MVPs are honored. They are nominated by Microsoft, other community individuals, or in some cases themselves. Candidates are rigorously evaluated for their technical expertise, community leadership, and voluntary community contributions for the previous year. They come from more than 90 countries, speak over 40 different languages, and are awarded in more than 90 Microsoft technologies. Together, they answer more than 10 million questions a year!

     

    MVPs are recognized each quarter for this annual award, which continues to grow and evolve to reflect the development of Microsoft technologies.

     

    Congratulations to the new MVPs, and welcome back to renewed MVPs. We are very excited to recognize your amazing accomplishments!

    For more information or to nominate someone, go to MVP.Microsoft.com


  • 07/08/14--10:52: Friday Five - July 4, 2014

    Editor’s note: The following post was written by SharePoint MVP Corey Roth

    The Microsoft Surface Pro 3 is an amazing device, and OneNote works quite well on it.  When it comes to using OneNote with the Surface Pro 3, you have the option of using OneNote 2013 (the desktop application) or the Windows 8.1 Store application.  By default, it will use the Windows Store app when you activate it with your stylus, though this can be changed (read on).  The feature sets between these two applications vary.  This article covers note taking on the Surface Pro 3 as well as what the handwriting experience is like in both applications, to help you decide which works best for your needs.

    OneNote Windows Store Application for Windows 8.1

    The Windows Store application is designed with touch in mind.  It features the new radial menu that allows you to quickly select features such as copy / paste, tagging, tables, drawing, and the camera.  It works well using your finger or the stylus.  The screenshot below shows a selection of text and how I can use the radial menu to change the look and feel as well as use the tagging features.

     The camera feature is exclusive to the Windows Store application allowing you to take pictures directly from the device’s camera and insert them into OneNote.  This is a great feature for taking pictures of whiteboards. 

    When you are using the Windows Store application with a keyboard and mouse, you might miss having context menus.  For example, if you are used to cutting and pasting text by right-clicking, that feature is not available.  Instead, you either need to use the keyboard shortcuts (i.e. Ctrl+C / Ctrl+V) or use the cut and paste commands from the radial menu.  The Windows Store application does support many common keyboard shortcuts though.  This includes my personal favorite, Ctrl+., which inserts a bulleted list.  That’s a shortcut I would love to see come to the rest of Office.

    OneNote 2013 (desktop) Application

    The OneNote desktop application has more features comparatively, but using it with touch is not as ideal.  Although the desktop application defaults to touch mode, you’ll find certain tasks, such as searching while you have the Surface Pen in your hand, a bit cumbersome.  For example, if you are holding the device vertically in one hand, you need to reach all the way to the taskbar to activate the virtual keyboard, which makes it a bit awkward.

    When holding the device vertically, OneNote naturally collapses the ribbon bar and navigation area to provide a larger work area.  Click the “…” bar at the top to expand the ribbon.  You’ll use this when you want to change the color of your ink.

     

    While OneNote 2013 only has minimal touch features, it has all of the other features you like about OneNote.  I personally use the Outlook integration feature a lot which lets me take notes about my meeting and include the details from the calendar invite. 

    Using the Surface Pen

    The stylus (or Surface Pen as it is called) that comes with the Surface Pro 3 is a nice upgrade from the previous generations of Surface Pro.  It features two buttons on the side and one on the top.  It was designed with OneNote in mind too.

     

    As you expect, you can use the Surface Pen for handwriting, but you can do a bit more by making use of its buttons.  Click the button at the top of the Surface Pen to open OneNote from wherever you are to jot down a quick note.  This even works when the device is locked and your screen is off.  Not to worry though, your device is still locked and you can’t view any of the existing notes that you have written until you unlock the device.  Write your note and it will automatically be saved.  Click the button at the top again if you want to write another separate page of notes.

     

    Using this functionality will always launch the Windows Store version of OneNote even if you set the default to OneNote 2013 (desktop).  Also, if you are using the desktop version as your default, and you click it when the device is unlocked, it will open OneNote, but it won’t take you to a new page in your Quick Notes automatically.

    If you double click the button at the top of the Surface Pen, it will allow you to take a screen clipping of whatever is currently on your screen.  Just drag the area that you want to clip and it will put it in a new page in OneNote.  I actually used this feature a number of times to take the screenshots for this article.

    Handwriting

    The ability to use your own handwriting with a stylus is nothing new to OneNote.  This could be done with Windows XP Tablet PC Edition and OneNote 2003, and it could be done in earlier versions of the Surface Pro.  However, the heavier weight of those devices coupled with the 16:9 aspect ratio meant they weren’t as comfortable to hold vertically.  Surface Pro 3 addresses both of these issues by being lightweight and by having the proper 3:2 aspect ratio, which makes holding the device feel more like a pad of paper.

    The Surface Pro 3 features palm block technology allowing you to rest your palm on the screen when you are writing.  This is great, but I am so used to not touching the screen when using a stylus I forget that I can rest it there.  As you use your device for handwritten notes more and more, you’ll get used to it though.

    The two buttons on the side of the Surface Pen can be quite useful.  The bottom button is your eraser.  Hold it down and move the Surface Pen over the area that you want to erase.  In OneNote 2013, it will actually change the pointer to an eraser to visually indicate you are erasing.  This feature isn’t present in the Windows Store application.

    Use the top button for selection.  It effectively works like a lasso where you draw around the text or objects you want to select.  You can also use it in other Office applications such as Word, and it works similarly to selecting text with a mouse.

    Nicole Steinbok, OneNote Program Manager, has a good video on the Office blog demonstrating all of the tricks you can do with the Surface Pen.  Be sure and check it out to see the pen in action.

    Using Handwriting with the Windows Store Application

    Both OneNote applications support handwriting.  Using the radial menu, you can select the type of ink, including thickness and color.  OneNote can detect how much pressure you use with the pen to make lines bolder, just like with a real pen or pencil.

    When you take notes in the Windows Store app, they aren’t indexed.  That means if you search for what is in your handwriting, you won’t get any results.  However, if you open the OneNote 2013 desktop application later or on another device, your handwritten notes will get indexed regardless of which application you wrote them with.  Then you can search for them in either application.

    Using Handwriting with OneNote 2013

    OneNote 2013 on the desktop supports handwriting as well.  You’ll need to make a few extra screen taps to find the drawing menu for your pen.  Since the ribbon is collapsed, you’ll need to tap the “…” at the top to expand the menu and pick the type and color of pen you want.  This is one spot where I think the radial menus from the Windows 8.1 application really excel.

    Searching handwritten notes

    When it comes to dealing with handwriting, you also have the option of converting it to text.  You might be thinking that you need to convert it to text for OneNote to be able to search it, but that’s actually not the case.  Once the content is indexed, you will be able to search for text inside the ink whether it’s converted or not.  OneNote does an exceptional job at recognizing handwriting and can even parse my sloppy handwriting with ease.  I like that you can convert handwriting to text, however, I find that it just introduces more work for me because I am anal and insist everything be formatted consistently in my notes. :)  Take a look at this handwritten note.

     

    We can convert this to text by clicking the Ink to Text button in the Draw menu of OneNote 2013.

     

    The handwriting you wrote will be converted to text.  However, you can see that the formatting is less than ideal.  This just means you are going to spend more time trying to correct it.

     

    The ability to search handwritten notes is quite impressive.   Whether it is text or handwriting, you can search it either way.  When it finds notes that match, they will be highlighted directly in your handwriting.

     

    You can search from the Windows Store application too by activating the search charm and then selecting OneNote.  I find this a bit cumbersome, because the search charm will search all applications by default (not just OneNote).  You will also find that searching this way will yield different results than the desktop application.  The results may be similar but it seems that the order in which things are presented is not the same (notice the results from above).  Just remember, that the handwritten notes won’t be indexed unless you open the notebook in OneNote 2013 at some point.

     

     

     

    Is handwriting right for me?

    If you have been taking notes on a laptop with OneNote over the years, you know without a doubt you can type faster than you can write.  So why would you go back to handwritten notes?  There are a few places where I think it’s really valuable.  In a meeting at a conference room table, I am going to go with my keyboard every time.  What about when you are doing a white boarding session though?  The device is great.  Just hook it up to a projector and get your Surface Pen out.  You can do this when you are sitting or when you are standing up.  This is great for drawing diagrams, process flows, and network diagrams. 

    Another time I find handwriting useful is when I am in a place where I can’t sit.  The whiteboard example is a great one.  However, I think this is also useful for when you are in the field or at an airport waiting to board an airplane.  Just pull your device out and write on it just like it was a legal pad sitting in your clipboard.

    Setting the default OneNote application

    By default, the Surface Pro 3 is going to use the Windows Store OneNote application.  However, if you decide the desktop application is for you, you can change your default by changing the settings in OneNote 2013.   Simply go to File -> Options -> Advanced and then check the box Make OneNote 2013 (desktop) the default OneNote application for OneNote links, notes, and clips.

     

    Which OneNote should you use?

    This is going to come down to personal preference, but let me summarize the key differences between the applications.

    OneNote for Windows 8.1

    • Optimized for touch
    • Radial menus make common tasks easier with touch or the Surface Pen
    • Works when the device is locked and OneNote is activated by the Surface Pen
    • Can take pictures directly from the device’s camera
    • No context menus
    • Inserting tables is cumbersome
    • Does not support password protected sections

    OneNote 2013 (desktop)

    • Touch support not as good
    • Supports all OneNote features including Ink to Text and Outlook integration
    • Supports indexing handwritten notes
    • More export and sharing options
    • Support for audio and video recording
    • Support for math symbols
    • Has review features such as Spelling, Research, and Thesaurus
    • Supports password protected sections
    • Can view previous versions of notes
    • Surface Pen shortcut doesn’t open a new page

     

    With my Surface Pro 3, I find myself jumping between both OneNote applications.  Since they sync automatically it doesn’t really matter which one I use.  When I am holding the device in my hand and using it with the Surface Pen, I use the Windows Store application.  When I am using the device as a laptop, I use the desktop version.  By leaving both applications open, it also allows for my handwritten notes to be indexed rather quickly.

    You may need to weigh the pros and cons to determine what will be your primary OneNote application.  Whether you choose OneNote for Windows 8.1, OneNote 2013, or both, you should have no problem with meeting your note-taking needs.

    If you are considering the Surface Pro 3 as the tablet that can replace your desktop, be sure and read my full review.

    About the author


    Corey Roth is a SharePoint consultant specializing in solutions for the Oil & Gas industry.  He is a four-time recipient of the Microsoft MVP award in SharePoint Server.  Corey has always focused on rapid adoption of new Microsoft technologies including SharePoint 2013, Office 365, and Visual Studio 2013.  When it comes to SharePoint, he specializes in Office 365 deployments, Enterprise Search, and Apps.  As an active member of the SharePoint community, he often speaks at conferences, events, and user groups.  Corey has a blog at www.dotnetmafia.com where he posts about the latest technology and SharePoint, and he develops Office Apps for his company SP2 (www.sp2apps.com).  Corey is a huge fan of the Surface line of products.  Follow him @coreyroth on Twitter.

    About MVP Monday

    The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.


  • 07/14/14--08:22: Friday Five - July 11, 2014

    Editor’s note: The following post was written by SharePoint MVP Mikael Svenson

    Best Bets, or Promoted Results, are a great way to give end users the answer to their search query without the additional click to open a page or document to locate the precise information they were looking for.  Bing and other search engines already offer many examples of this; for instance, look for weather information or currency conversion.

     

    Out of the box approach


    Out of the box in SharePoint Online/2013 you can set up this type of functionality using Query Rules. But, there are some hitches to the default experience provided by SharePoint.

    First of all, the UI is not very user friendly (unless you’re a search expert). Each best bet, or promoted result as it is called in SharePoint, needs a separate query rule with associated trigger terms. And for each query rule you have to specify the information about the promoted result(s) for that rule. All in all it takes a while to get used to, set up, and maintain.

    Secondly, you can only trigger on exact terms/phrases or the start/end of queries; you won’t get partial matches or terms spread apart. On-premises you also have the option to write regular expressions, but then you are moving away from your regular search keyword manager in a hurry.

    Thirdly, you won’t get any lemmatization or stemming on your trigger terms. As an example, with lemmatization turned on the trigger term red car would match the following variants:

    • red car
    • reds car
    • red cars
    • reds cars
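    To make the limitation concrete, here is a toy sketch in plain JavaScript (not SharePoint code; the crude suffix stemmer is invented purely for illustration) contrasting exact trigger matching with stemmed matching:

```javascript
// Toy illustration: exact trigger-term matching, as in out-of-the-box
// query rules, misses the lemmatized variants that stemmed matching catches.
function exactMatch(trigger, query) {
  return query === trigger;
}

// Crude singularizer standing in for real lemmatization.
function stem(word) {
  return word.toLowerCase().replace(/s$/, "");
}

function stemmedMatch(trigger, query) {
  var stems = function (s) { return s.split(/\s+/).map(stem).join(" "); };
  return stems(query) === stems(trigger);
}

console.log(exactMatch("red car", "red cars"));    // false
console.log(stemmedMatch("red car", "red cars"));  // true
console.log(stemmedMatch("red car", "reds cars")); // true
```

    Real lemmatization is much smarter than stripping a trailing "s", but the gap it closes is the same.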

    Using lists and one Query Rule to Rule Them All!


    I’m not going to take full credit for this idea as it was introduced to me by Petter Skodvin-Hvammen. But I have taken it a bit further to get the lemmatization working. The idea is to create a regular SharePoint list using Enterprise Keywords as the trigger term matcher. An easy and familiar way to add new entries, as well as easy to maintain. Then display the best bet hits from this list in a result block at the top of the result page.

    Note: In order to display result blocks, your users have to be assigned Enterprise licenses.

    To set this up you need only one query rule targeting the best bet list. Once it has been set up you never have to maintain it again, and goodbye to the quirky UI. Adding new best bets is as simple as adding a new row to the best bet list, something most SharePoint users should be familiar with.

    One issue with a list like this is that you have to query only the Enterprise Keywords column in order not to get hits from matches in the title, description, or URL fields of the list items. This is all good, but when executing a property search you get no lemmatization/stemming on your trigger terms.

    My post “What makes a SharePoint column searchable?” serves as background material to my refined approach to get lemmatization going without doing a property query or matching on title and description.

    By querying against the full-text index rather than one property, you get the added benefit of partial matches on the trigger terms. With red car as the trigger term, the best bet block rule would trigger for the queries below as well.

    • red
    • reds
    • car
    • cars
    • (network OR red) AND (test OR car)
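    As a toy sketch of why the full-text approach fires on those broader queries (again plain JavaScript with an invented stemmer, not SharePoint code): it only needs one query token to stem to one trigger-term token.

```javascript
// Toy sketch: full-text matching triggers when any query token stems to
// any trigger-term token, unlike the exact-phrase query rules.
function stem(word) {
  return word.toLowerCase().replace(/s$/, ""); // crude stand-in for lemmatization
}

function fullTextTriggers(trigger, query) {
  var triggerStems = trigger.split(/\s+/).map(stem);
  var queryStems = query.split(/[^a-zA-Z]+/).filter(Boolean).map(stem);
  return queryStems.some(function (q) { return triggerStems.indexOf(q) >= 0; });
}

console.log(fullTextTriggers("red car", "cars")); // true
console.log(fullTextTriggers("red car", "(network OR red) AND (test OR car)")); // true
console.log(fullTextTriggers("red car", "network test")); // false
```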

    If you have multi-word trigger terms, make sure all the words are fairly unique to avoid unwanted partial matches, or stick with the out-of-the-box query rule matching with a promoted result instead.

    The re-written query I will use at the end will limit results to items where the content type is BestBets, and execute a full-text query on those items.

    spcontenttype:bestbets {searchTerms}

    What’s needed to set this up is:

    1. A Best Bet list hosted on a SharePoint site. Don’t use the Search Center as it’s marked as not to be indexed. A separate site collection or sub-site anywhere outside the Search Center is fine, or even as a sub-site to a Search Center as long as you make sure it’s being indexed in some way.
    2. Editing or mapping of your crawled properties to reduce recall on title, description and URL columns.
    3. Custom Display Template for the Best Bets
    4. Custom Result Type and Result Source to target the Best Bet list
    5. Query Rule with result block targeting the Best Bet list using the custom result source

    Step 1 – Creating a site, columns, content type and the list

    Start by creating a new site or sub-site based on the Team Site template as there is no blank site template in the SharePoint UI.

    On your newly created site, add site columns as defined below and add them to a new content type named BestBets. The names in parentheses are the internal column names.

    • Title (bbTitle – Single line of text *Required)
    • Description (bbDescription – Multiple lines of text)
    • URL (bbLink – Single line of text)
    • Start Date (bbStartDate – Date & Time)
    • End Date (bbEndDate – Date & Time)
    • Enterprise Keywords (enable on the list)

    You might be curious as to why I’m not using a Hyperlink column for the URL. The reason is that a URL column will always be included in the recall, even when mapped to an unsearchable managed property (bug, anyone?).

    You can also add more columns to suit your information management and display needs.

     

    Next, hide the default Title column so you don’t have to deal with triggering on that column, as you have no control over where it’s mapped with regard to content recall.

    The reason for using site columns is to get automatic managed properties for retrieval (as mentioned in “What makes a SharePoint column searchable?”). The automatically generated managed properties will be used in the Display Template.

    Once set up, your content type should look similar to the below image.

     

    Create a new custom list on your site, turn on management of content types (Settings->Advanced Settings->Allow management of content types), add the BestBets content type to your list, and make it the default content type for the list. Make sure everyone has read access to this list; if not, they won’t get any best bets. You may also use the security of list items to limit who gets which best bet, an added bonus.

     

    Step 2 – Add content and tune the crawled property settings

    Add one best bet to your best bets list and kick off or wait for a crawl to pick up your data and create the crawled and managed properties needed.

     

    If you issue the query spcontenttype:bestbets rose after the initial indexing, you will get a hit on the word roses from the description column, which is undesirable, as you only want matches in the Enterprise Keywords column.

    This is where it gets tricky. If you are on-premises you can edit the crawled properties in the search schema on the SSA and turn off the option to include them in the full-text index. For SharePoint Online you have to take a different approach as you cannot edit crawled properties.

    To ensure you don’t get recall from unwanted text, create a new managed property which is marked as not searchable, and map all the textual ows_bb<InternalName> crawled properties to it.

    As I’m using SPO for my prototyping, I have created a new managed property at the site collection root of my search center called BestBetsNoRecall, and mapped the following crawled properties to it:

    • ows_bbDescription
    • ows_bbTitle
    • ows_bbUrl

     

    The beauty, and perhaps a side effect, of this Prevent Recall type of managed property is that even though it’s created locally at the site collection/site, you won’t get recall for those columns when searching from another site/site collection either. You have effectively made those columns non-searchable.

    Troubleshooting: If you cannot get the NoRecall property to work, try to create the property and mappings at the SSA/tenant level instead.

    If you want to add support for start date and end date of the best bets, map the crawled property ows_bbStartDate to the managed property RefinableDate00 and ows_bbEndDate to RefinableDate01.

    Once you have completed the property mappings, go to advanced settings for your list and click the Reindex List button to make sure the best bets are re-processed on the next crawl.

    Step 3- Create a Display Template

    Display templates are stored in the master page gallery, and the result type and result source have to be created at either the Search Center level or globally on the SSA/tenant in order to make them available. Basically, you have to create search settings at the same level as, or at a parent level above, where you are using them.

    For the display template create a copy of the file _catalogs/masterpage/Display Templates/Search/Item_BestBet.html and name it Item_BetterBestBet.html.

    Open the copied file in a text editor and edit the <title> tag to read Better Best Bet Item. Next add the following to the header of the display template:

    <mso:ManagedPropertyMapping msdt:dt="string">'bbTitleOWSTEXT':'bbTitleOWSTEXT','bbDescriptionOWSMTXT':'bbDescriptionOWSMTXT','bbUrlOWSTEXT':'bbUrlOWSTEXT'</mso:ManagedPropertyMapping>

    Below the line $setResultItem(itemId, ctx.CurrentItem); insert the following lines to quickly map the properties to the default ones and get the template working.

    ctx.CurrentItem.Title = $getItemValue(ctx, "bbTitleOWSTEXT").value;

    ctx.CurrentItem.Description = $getItemValue(ctx, "bbDescriptionOWSMTXT").value;

    ctx.CurrentItem.Url = $getItemValue(ctx, "bbUrlOWSTEXT").value;
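    If you want those mappings to be a bit more defensive, you can guard against unmapped or empty properties so nothing renders as "undefined" in the result block. A minimal sketch; $getItemValue is stubbed here so the logic runs standalone, whereas in a real display template SharePoint’s own helper is in scope:

```javascript
// Sketch: fall back to an empty string when a managed property is missing,
// instead of letting "undefined" leak into the rendered best bet.
// $getItemValue is stubbed for illustration only; in a display template
// the real SharePoint helper is used instead.
function $getItemValue(ctx, propName) {
  return ctx.props[propName]; // undefined when the property wasn't mapped
}

function safeValue(ctx, propName) {
  var item = $getItemValue(ctx, propName);
  return (item && item.value) ? item.value : "";
}

var ctx = { props: { bbTitleOWSTEXT: { value: "Company travel policy" } } };
console.log(safeValue(ctx, "bbTitleOWSTEXT"));       // "Company travel policy"
console.log(safeValue(ctx, "bbDescriptionOWSMTXT")); // ""
```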

    There are many ways to modify a display template to get this working, but this is one quick way to get it up and running. See SharePoint 2013 Design Manager display templates for more information on display templates. You might also want to create a custom control template to customize or remove the border around the best bet items.

    Note: Make sure you publish your display template once complete to make it accessible to all users.

    Step 4 – Create Result Type and Result Source to target the Best Bet list

    Navigate to Manage Result Types on your Search Center (Site Settings->Search Result Types at the site collection level) and add a new result type with settings as depicted below. For ContentType you must use Contains any of… and not Equals, as this property is a bit quirky and SPContentType is not accessible in the dropdown. A better option might be to use the ContentTypeId.

     

    Next create a new Result Source (Site Settings->Search Result Sources) with the following properties:

    • Name: Better Best Bets
    • Protocol: Local SharePoint
    • Type: SharePoint Search Results
    • Query Transformation: {?{searchTerms} SPContentType:BestBets ((RefinableDate00<=today AND RefinableDate01>=today) OR (RefinableDate00<>"this year" AND RefinableDate01<>"this year")) }

    I want to point out the RefinableDate<>”this year” parts of the query, which will include any best bet that is NOT tagged with a start and end date. It’s a workaround to include results which don’t have a value.

    Step 5 – Create a Query Rule to serve up Best Bets

    Now for the final piece of the puzzle. Add a new Query Rule (Site Settings->Search Query Rules->Context=Local SharePoint )

    • Rule name: Better Best Bets
    • Query Conditions: Remove any conditions
    • Add a Result Block
      • Block title: Best Bets for “{subjectTerms}”
      • Query->Select this Source-> Better Best Bets
      • Items: 5
      • Settings->This block is always shown above core results
    • Change ranked results by changing the query: {searchTerms} -spcontenttype:bestbets

    What the rule does is include up to five best bet results from the Best Bet list at the top of the results, and exclude those items from the regular results.

    The End Result and next steps

    Executing a search for red cars, you now get a best bet at the top, even lemmatized.

     

    Next steps would be to improve on the Control and Display Templates to make them more visually appealing; you could, for example, incorporate an image link or other actions.

    I’m not saying it’s easy to set up if this is the first time you’ve worked with SharePoint 2013 search settings, but once it is in place, your keyword manager for Best Bets will be eternally happy that you provided him or her with a UI they actually can manage.

     

    About the author

    Mikael Svenson is a Principal Consultant at Puzzlepart, where he develops SharePoint business apps and consults on SharePoint in general. Mikael is a search enthusiast at heart, having authored "Working with Microsoft FAST Search Server 2010 for SharePoint". A four-time SharePoint Server MVP, Mikael puts his local community efforts into being a board member of both the Norwegian SharePoint community and SharePoint Saturday Oslo. In addition to organizing, he also speaks at conferences, events, and user groups. Mikael has a blog at techmikael.blogspot.com where he mostly blogs about SharePoint and search, but you can find other nuggets there as well. You can follow @mikaelsvenson on Twitter or check out some forum goodness over at TechNet (http://social.technet.microsoft.com/profile/mikael%20svenson/).



    07/22/14--10:43: Friday Five - July 18, 2014

    07/28/14--08:32: Why SharePoint on Azure?

    Editor’s note: In partnership with Microsoft Press, now celebrating their 30th year, MVPs have been contributing to an ongoing guest series on their official team blog. Today’s article is from SharePoint MVP Tom Resing, which is the 43rd in the series.

    Why SharePoint on Azure?

    Last week, at the Worldwide Partner Conference, a new SharePoint Farm option for Azure was announced. The new SharePoint Farm Gallery option from the Azure Portal is now, by far, the easiest way to get started with SharePoint on Azure. Type in a name for the Farm’s Azure Resource Group and credentials for the Administrator account, and the rest of the farm creation is automated.


    SharePoint in the Cloud

    As you may know, Office 365 includes SharePoint Online as a component. If you want to do SharePoint in the Cloud, it doesn’t get any easier to set up or less expensive than Office 365.

    If it’s not easier or less expensive, why choose Azure over Office 365?

    The TechNet article Microsoft Azure Architectures for SharePoint 2013 gives 4 reasons:

    1)      Development and Test

    2)      Disaster Recovery

    3)      Internet Sites

    4)      App Farms

    Development and Test

    From the beginning of my time working with SharePoint, creating and maintaining development and test environments has consumed a very large amount of my time. I started in the early days in 2006 when VMWare was my only option. Then my Microsoft Certified Master training and tests 4 years later required me to learn Hyper-V. Along the way, I made good use of the excellent online environments at Cloudshare.com. I’ve tried it all. To this day, I still maintain a SharePoint 2013 environment locally on my laptop for demonstrations. When I trust I can get reliable internet connections presenting at all conferences, I’ll be ready to trade that heavy laptop in for a more nimble machine and an Azure Farm.

    Today, development environments in Azure are easy and can be done pretty much for free, with an MSDN license. Here’s a great chart authored by Simon J.K. Pedersen on the cost:

    • Domain Controller – Basic A1, 1.75 GB RAM, 1 core – $0.075/hour (Pay-As-You-Go), $0.047/hour (MSDN)

    • SQL Server – Standard A5, 14 GB RAM, 2 cores – $2.10/hour (Pay-As-You-Go), $0.248/hour (MSDN)

    • SharePoint Server – Standard A5, 14 GB RAM, 2 cores – $0.30/hour (Pay-As-You-Go), $0.248/hour (MSDN)

    With these prices you will be able to run your environment for around 276 hours, or more than 30 days if it is only online 8 hours a day, on a Visual Studio Ultimate MSDN subscription with a $150 Azure credit (MSDN subscribers don’t pay extra for SQL Server and pay only Linux prices for the VMs). If you are not using an MSDN subscription, your price will obviously be much higher.
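    As a rough check on that arithmetic, here is a Python sketch using the MSDN hourly rates from the chart above (the rates and credit amount come from this post, not from current Azure pricing):

```python
# MSDN hourly rates for the three VMs, taken from the chart above.
msdn_rates = {
    "Domain Controller": 0.047,
    "SQL Server": 0.248,
    "SharePoint Server": 0.248,
}

hourly_total = sum(msdn_rates.values())   # $0.543/hour for the whole environment
monthly_credit = 150.0                    # MSDN Ultimate monthly Azure credit

hours_of_runtime = monthly_credit / hourly_total  # how long the credit lasts
days_at_8_hours = hours_of_runtime / 8            # running only 8 hours a day

print(round(hours_of_runtime))   # about 276 hours
print(round(days_at_8_hours))    # about 35 days at 8 hours/day
```

    This matches the roughly 276 hours (more than 30 working days) quoted above.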

    Disaster Recovery

    Azure offers a great option for a secondary location for your on-premises SharePoint farm. You can really take advantage of the on-demand nature of cloud infrastructure as a service in many disaster recovery scenarios. Continue reading the full article here.

    About the author

    Tom is an advocate for, architect of and developer of web sites and has been for almost 20 years. He's been very lucky to work with some of the best tools available in the web development world like Visual Studio, IIS, SharePoint and Azure. Tom has received 2 MVP Awards in SharePoint for his community contributions including writing, speaking at and organizing technical events.  Follow him on Twitter.


    Nearly 300 MVPs, technology enthusiasts and SQL fans joined forces in Germany to participate in a hackathon and SQL Saturday event featuring MVP presenters.  The two-day event was organized by SQL Server MVPs Constantin Klein, Tillmann Eitelberg and Oliver Engels.  We had the chance to catch up with Constantin “Kostja” Klein to get an inside look into the event.

    What was the inspiration for creating such a unique community event?

    "Since BIG DATA is a hype-topic, we decided to favor a Hackathon over a regular Pre-Con in order to allow attendees to really get a first, hands-on experience with reference to the existing technologies on the Microsoft platform, like HDInsight and PowerBI. With Azure MVP Sascha Dittmann, Scott Klein and Emil Siemens we also found the right people to introduce the tools to the attendees and help them on occurring problems."

    What was the highlight of the event?

    "The highlight was the presentation of the results at the end of the day. This was when all other teams had the chance to find out about different approaches to the same problem, different ways of visualization, etc.
    By the way, the challenge we prepared for the day was to get some interesting insight and visualization out of more than 250,000 tweets collected during [the] football World Cup with [the] hashtag #WorldCup. And actually we had some interesting findings, like it seems that in the UK, there is a disproportionately high number of people who like football and use Twitter.   

    We wanted attendees to use the Microsoft cloud technology stack. Therefore we had Azure accounts prepared and helped people to get the environment (Cloud Storage, Azure SQL Database and HDInsight cluster) up and running. People helped each other and we started the day with building teams of up to five people who then worked together. In fact, most of the teams were not colleagues and had never worked together before."

    What is the benefit of attending such a great community event?

    "We believe that you get a much better kick-start for dealing with a brand new topic – which it was for almost all attendees – when you really have time to start a project and play with the technology. So during the wrap-up many attendees confirmed, that they now have a first real experience, they can take home and elaborate on that. This is totally different if you just listened to a whole day lecture. Obviously this lecture could cover much more details, but an attendee would not be able to immediately reproduce what he heard or had seen. Working in mixed groups is another interesting aspect, which helps people to deal with and adjust to new situations."

    Congratulations to the MVP organizers, presenters and all the participants!



    Editor’s note: The following post was written by SQL Server MVP Mark Tabladillo

    Power BI is a new and emerging self-service business intelligence and business analytics framework that brings together and enhances key Microsoft technologies:

    • Office
    • SQL Server
    • Azure
    • SharePoint

    Fundamentally, Power BI is considered a premium Office option, because Microsoft licenses it that way.  Yet, the technology details also comprise new collaboration technologies for SQL Server, Azure and SharePoint.  A successful technology collaboration will have boundaries which could arguably belong to one or more of the contributing technology groups.

    This document provides links and introductory information for Power BI.  My analysis is most useful for the enterprise planner (CIO, CTO, Information Technology Architect), but it is also useful for individual consumers.  Power BI is a technology which extends from individual use on any device (laptop, tablet or smartphone) all the way up to high-scale cloud or hybrid (cloud plus on-premises) production architecture.

    Books have already been written on aspects of the component Power BI technologies, and some will be recommended for further study.  The purpose of this document is to provide an overview of the key points in knowing what this technology is and how it might be useful in your organization.  Along the way, I provide web links (URLs) to pages and videos that document and demonstrate key features of Power BI technology.  In the larger view, Power BI is at the heart of how Microsoft is now developing integrated aspects of these established technologies (Azure, SQL Server, Office, and SharePoint), and it represents a direction for all of them for the foreseeable future.

    The major sections of this report include:

    • Definition – what is Power BI?
    • Licensing Power BI – how can I or we get Power BI?
    • Excel 2013 Features – what are the major features in on-premises (legacy) Excel 2013?
    • Power BI for Office 365 – what are the major features of the online Power BI for Office 365?
    • Power BI with Excel 2010 – what can Excel 2010 users do with Power BI?
    • Recommended Resources – where can I find free online resources and recommended paid books?

    Definition

    Formally, Microsoft claims that this technology is comprised of the following features and services:

    Excel Features

    • Power Query – easily discover and connect to data from public and corporate data sources
    • Power Pivot – create a sophisticated Data Model directly in Excel
    • Power View – create reports and analytical views with interactive data visualizations
    • Power Map – explore and navigate geospatial data on a 3D map experience in Excel

    Power BI for Office 365

    IT (Information Technology) Infrastructure Services for Power BI Office 365


    Many presentations I have seen on Power BI start with flashy demos and features.  I have both seen and given such demos several times for user groups and for a national conference called the PASS Business Analytics Conference.  I will recommend some video demos throughout this document, and provide some recommended links for further reading.

    First, though, licensing needs to be examined, because Microsoft is offering more than just the legacy pricing options for Office.  In this report, I emphasize the Office 365 integration over the SharePoint integration.  In my experience with consulting clients, a key question is how much the technology costs and how to obtain it.

    Licensing Power BI

    Licensing has been one of the most actively discussed aspects among current Power BI users.  It is wise to spend some time on the topic up front.  Though a combination of technologies, Power BI is obtained through an Office license.

    While some pieces of Power BI have been and are available for Office 2010, I support Microsoft in recommending that people obtain Office 2013 for a more stable technology and complete Power BI experience (especially if organizations are coming from Office 2007 or Office 2003 or earlier).   The technology reason is that Excel 2013 Power Pivot is superior to Excel 2010 Power Pivot, and in enough ways to recommend the higher level.

    Power BI is not required for Office 2013:  the reason is that Power BI is considered a premium set of features and services, available only at higher licensing levels.  For people wanting Office as a one-time purchase, they can continue to obtain a single PC license in the United States:

    • Office Home & Student 2013 – Word, Excel, PowerPoint, OneNote
    • Office Home & Business 2013 – Word, Excel, PowerPoint, OneNote, Outlook
    • Office Professional 2013 – Word, Excel, PowerPoint, OneNote, Outlook, Publisher, Access

    However, none of these three versions includes the Power BI Excel Features, which instead require Office Professional Plus 2013 (only available through volume subscriptions, or through MSDN Premium or Ultimate subscriptions).  I have an MSDN Ultimate subscription, and therefore I have the Power BI Excel Features.

    If you only have the Power BI Excel Features, you can still start using many new self-service business intelligence features, and that alone is worth having.  However, these Excel Features are only part of the Power BI technology.  In several online forums, many others who have the same MSDN Ultimate subscription level have been asking Microsoft to extend the benefits to include all of Power BI, not just the Excel Features.

    For now, experiencing Power BI for Office 365 and the IT Infrastructure Services for Power BI currently requires one of several higher Office 365 subscriptions.  All Office 365 subscriptions require an annual commitment, and would have a penalty for early cancellation.  As with the one-time purchase options, only certain higher Office 365 subscriptions include Power BI.  Again, the reason is that Power BI is a premium set of features and services.

    In general, Office 365 subscriptions have been organized into several clusters, one for personal users, and the others toward different types of businesses.  You could be a single person or private group and purchase a Business Use option.  I listed these options in the following table, so that you can see it all in one place.

    Personal Use (single account, priced per month, could be paid monthly)

    • Office 365 Personal – Single Computer and Tablet: Word, Excel, PowerPoint, OneNote, Outlook, Publisher, Access
    • Office 365 Home – Five Computers and Five Tablets: Word, Excel, PowerPoint, OneNote, Outlook, Publisher, Access
    • Office 365 Pro Plus – Five Computers and Five Tablets: Word, Excel, PowerPoint, OneNote, Outlook, Publisher, Access – Includes Power BI for Office 365

    Business Use (priced per user, per month) – details on the Microsoft Website

    • Small Business
      • Office 365 Small Business
      • Office 365 Small Business Premium
    • Midsize Business
      • Office 365 Midsize Business – Includes Power BI for Office 365
    • Enterprise
      • Hosted email (Exchange Online Plan 1)
      • Office 365 Enterprise E1
      • Office 365 Enterprise E3 – Includes Power BI for Office 365
      • Office 365 Enterprise E4 – Includes Power BI for Office 365
    • Education or Academic – for qualifying organizations, reduced cost compared to Enterprise
      • Office 365 Education A2
      • Office 365 Education A3
      • Office 365 Education A4
    • Government – for qualifying organizations, reduced cost compared to Enterprise
      • Exchange Online (Plan 1)
      • Exchange Online (Plan 2)
      • Office 365 (Plan E1) for Government
      • Office 365 (Plan E3) for Government
    • Non-Profits – for qualifying organizations, reduced cost compared to Enterprise
      • Office 365 Small Business for Nonprofits
      • Office 365 Small Business Premium for Nonprofits
      • Office 365 Enterprise E1 for Nonprofits
      • Office 365 Enterprise E3 for Nonprofits – Includes Power BI for Office 365


    If you were counting, of the above options, the only five Office 365 subscriptions which include complete Power BI are:

    • Office 365 Pro Plus
    • Office 365 Midsize Business
    • Office 365 Enterprise E3
    • Office 365 Enterprise E4
    • Office 365 Enterprise E3 for Nonprofits

    In summary, when considering Office 2013, there is one partial licensing path to Power BI (Office 2013 Professional Plus) or one of the five listed Office 365 subscriptions. 

    Recommendations

    • If you are an individual, I first recommend the Office 365 Pro Plus subscription. 
    • If you are looking for a business, I would first look at the Office 365 Midsize Business subscription. 

    In a later section, I will describe which features of Power BI you could use with Office 2010 (and particularly Excel 2010).  Even with Office 2010, you can at least get started.  Again, I recommend Office 2013 over Office 2010, though pragmatically many organizations had already committed to Office 2010 before or while Power BI was being created.

    Excel 2013 Features

    The Excel Features work on-premises with either Excel 2010 or Excel 2013 (requiring an Office Professional Plus subscription, which comes with an MSDN Premium or Ultimate subscription).  Again, Excel 2013 is my recommended platform for using this technology and the focus of this section (a later section discusses the comparatively limited Excel 2010).  The Excel Features comprise four elements, which I will summarize along with key technical descriptions.

    • Power Query – easily discover and connect to data from public and corporate data sources
    • Power Pivot – create a sophisticated Data Model directly in Excel
    • Power View – create reports and analytical views with interactive data visualizations
    • Power Map – explore and navigate geospatial data on a 3D map experience in Excel

    Some of the Power BI elements (like Power Pivot and Power View) are now native to Excel 2013.  Other elements are add-ins, and sometimes they may “disappear” from the ribbon:  first try to re-enable them from the COM add-ins window, or, failing that, uninstall and reinstall them.  Naturally, the best situation is when these emerging features are native to the Office version.  Not all of these features directly impact SQL Server technology, though the one which most clearly extends SQL Server (Analysis Services) is Power Pivot.

    Power Query

    The development version of Power Query was termed “Data Explorer”.  I first saw this technology under development, and while it was interesting as a web application, it did not immediately excite me about its possibilities.  Since then, the product has continued to mature, and my interest in what this technology can do has grown.  For power Excel users, I would hope that Power Query becomes the default way to import data into Excel.

    Power Query allows for some amount of data preprocessing during the import phase.  Many of the steps which Excel users have come to do manually, such as splitting columns, removing columns or renaming columns, can be scripted within the Power Query interface.  The software is wizard-driven, and I did say scripting:  underneath the technology is the Power Query Formula Language (informally known as “M”), allowing for future maturity into a reusable import technology.  How the technology grows can depend on what Microsoft hears from the user community.

    The next table summarizes many of the features.  Having Office 365 increases the features by allowing shared queries.  This type of structure is to be expected:  the standalone features come with on-premises Excel, while the Office 365 cloud options open up collaboration features.

    Power Query Summary of Features

    • Standard Power Query features: Easily discover, combine, and refine data for better analysis in Excel.

    • Value-added features with an Office 365 Power BI subscription: In addition to the features in the standalone edition, securely share and manage your data queries within the enterprise in Excel.

    Inside Excel, Power Query adds its own ribbon tab in Excel 2013 x64.
    The online search is considered a feature for searching your Office 365 datasets.  It becomes active once you sign in.

    Beyond that first icon, the common “Get External Data” icons allow for reading any number of sources.  The “From Web” link is often used in demonstrations, and permits searching website URLs or feeds.  Other external data options include ODBC, SQL Server, Windows Azure, Microsoft Access, SharePoint Lists, OData, Windows Azure Marketplace, Hadoop File System, and even Facebook.   

    In process, the wizard provides a preview of the data (once a connection is established, meaning authentication and authorization).  The preview window then allows for choosing the preprocessing steps for the data, though nothing happens until you submit the entire list.  Along the way, the Power Query Formula Language builds the specific steps together into a query script.  Once the preprocessing selections are done, you submit the list and the results come back to Excel.  I recommend trying the technology yourself.
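    To make the deferred-execution idea concrete, here is a small Python sketch (an illustration of the concept only, not the M language or any Power Query API; the column names are made up): transformation steps such as splitting, removing, and renaming columns are recorded in a list, and nothing runs until the whole list is submitted:

```python
# Sketch of Power Query's deferred steps: record transformations, apply them at the end.
rows = [
    {"Name": "Smith, Ann", "Region": "East", "Temp": "ignore"},
    {"Name": "Jones, Bob", "Region": "West", "Temp": "ignore"},
]

steps = []  # like the query's recorded step list; nothing executes yet

def split_column(col, into):
    def step(data):
        for r in data:
            parts = [p.strip() for p in r.pop(col).split(",")]
            r.update(dict(zip(into, parts)))
        return data
    return step

def remove_column(col):
    def step(data):
        for r in data:
            r.pop(col, None)
        return data
    return step

def rename_column(old, new):
    def step(data):
        for r in data:
            r[new] = r.pop(old)
        return data
    return step

steps.append(split_column("Name", into=["Last", "First"]))
steps.append(remove_column("Temp"))
steps.append(rename_column("Region", "Sales Region"))

# "Submitting the list": only now do the recorded steps run, in order.
for step in steps:
    rows = step(rows)

print(rows[0])  # {'Last': 'Smith', 'First': 'Ann', 'Sales Region': 'East'}
```

    The real Power Query wizard records the analogous steps for you and expresses them as an M script.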


    Limitations are important for power users, so I am including them in this report.

     

    Power Query Specifications and Limits

    • Query name length: 80 characters
    • Invalid characters in a query name: double quotes (“), periods (.), leading or trailing whitespace
    • Number of cells in a Query Editor data preview: 3,000 cells
    • Navigation pane items displayed per level (databases per server and tables per database): first 1,000 items in alphabetical order; you can manually add a non-visible item by modifying the formula for this step
    • Size of data processed by the Engine: limited to available virtual memory (for the 64-bit version) or about 1 GB for the 32-bit version, if data cannot be fully streamed, such as when sorting the data set locally before filling it
    • Number of columns per table: 16,384
    • Maximum size of text in a preview cell: 1M characters
    • Maximum size of text filled to Excel or the data model: not limited by Power Query
    • Maximum dataset size when evaluating a query: 256 MB
    • Maximum number of rows filled to a worksheet: 1,048,576
    • Soft limit of persistent cache (a soft limit can be exceeded for periods of time): 4 GB
    • Individual entries in the cache: 1 GB
    • Compressed query and dependencies as stored in the connection string: 64K characters (for more information about how to display connection information, see Display connection information)

    Action Steps:

    Power Pivot

    I will count Power Pivot as the first of the Excel features, or at least the first one I saw.  Of the four Excel features, this is the one where I have spent most of my time with clients and in my own presentations (I have done some work combining Power Pivot and data mining, which you can find on the web).  This technology sits on top of what we now call xVelocity, a rapid summation and compression engine.  The underlying technology now scales to production in what is called Tabular mode for Analysis Services.

    Collectively, Power Pivot for Excel, Power Pivot for SharePoint and Tabular mode in Analysis Services comprise key elements of what Microsoft has named the BI (Business Intelligence) Semantic Model:

    BI Semantic Model

     

    As with the rest of Power BI, there is an Excel feature which works well on its own.  However, there is also a technology path for sharing Power Pivot data models using either SharePoint or Tabular mode in Analysis Services.  Power Pivot opens in its own Excel window, with a familiar spreadsheet-type interface which can declare data types and relationships among data models (tables).

    Power Pivot Features

     

    The DAX language permits programming custom measures.  This language works with Power Pivot in Excel, but more generally also allows for querying Microsoft’s Multidimensional and Data Mining mode (OLAP cube) databases.  Learning and using DAX is considered an intermediate to advanced Power Pivot skill.
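    As a conceptual illustration of what a custom measure does (a Python sketch of the idea only; actual measures are written in DAX, and the table and column names here are made up): a measure is an aggregation that is re-evaluated for whatever filter context is active, and measures can be composed from other measures:

```python
# Hypothetical sales rows; a measure re-aggregates them per filter context.
sales = [
    {"region": "East", "year": 2013, "amount": 100.0},
    {"region": "East", "year": 2014, "amount": 150.0},
    {"region": "West", "year": 2014, "amount": 250.0},
]

def total_sales(rows, **context):
    """Base measure: sum of amount within the current filter context."""
    return sum(r["amount"] for r in rows
               if all(r[k] == v for k, v in context.items()))

def pct_of_grand_total(rows, **context):
    """Derived measure built on top of another measure, as DAX measures can be."""
    return total_sales(rows, **context) / total_sales(rows)

print(total_sales(sales, region="East"))      # 250.0
print(pct_of_grand_total(sales, year=2014))   # 0.8
```

    In a pivot table, the filter context comes from the rows, columns, and slicers the user has selected, and the engine re-evaluates each measure for every cell.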

    Again, here are some summarized capacity specifications for Power Pivot.  Please note that a 32-bit system is an additional constraint:  only about 2.1 GB of memory is available for all Excel activity, including Power BI.  I recommend that Power BI users upgrade from 32-bit (x86) to 64-bit (x64).  Using x64, I have shown demos where Power Pivot imports over 2M records from a SQL Server data warehouse, and of course that was not even pushing the limit.

    Some users report that because of their corporate environment, they were able to acquire a substitute:  access to a virtual machine (perhaps shared) running x64 Office and accessed through Remote Desktop Connection Manager.

    Data Model Specification and Limits

    • Excel 2013: A 32-bit environment is subject to 2 gigabytes (GB) of virtual address space, shared by Excel, the workbook, and add-ins that run in the same process. A data model’s share of the address space might run up to 500 to 700 megabytes (MB), but could be less if other data models and add-ins are loaded. A 64-bit environment imposes no hard limits on file size; workbook size is limited only by available memory and system resources.

    • SharePoint Server 2013 (1): Maximum file size for uploading to a document library: 50 megabytes (MB) default, 2 gigabytes (GB) maximum (2). Maximum file size for rendering a workbook in Excel Services: 10 megabytes (MB) default, 2 gigabytes (GB) maximum (2).

    • Excel Online in Office 365 (3): 250 megabytes (MB) total file size limit. Core worksheet contents (everything not in the Data Model) follow the file size limits for workbooks in SharePoint Online.

    Footnotes

    1 On SharePoint Server, notice that the defaults are much lower than the maximum allowed. Ask your SharePoint administrator about raising file size limits if your file is too big to upload or render. See Software boundaries and limits for SharePoint Server 2013 for more information.

    2 Maximum Upload Size must be configured for each web application by a SharePoint administrator. Maximum Workbook Size must be configured in Excel Services by a service administrator. More information for administrators can be found in Configure Maximum File Upload Size on TechNet.

    3 Limits in Office 365 are not configurable, but can change over time. Check the Office 365 for Enterprise Service Descriptions for the latest information. You can also see SharePoint Online: software boundaries and limits.

     

    Power Pivot Capacity Specifications

    • Object name length: 100 characters
    • Invalid characters in a name: . , ; ' ` : / \ * | ? " & % $ ! + = () [] {} < >
    • Number of tables per PowerPivot database: (2^31) - 1 = 2,147,483,647
    • Number of columns and calculated columns per table: (2^31) - 1 = 2,147,483,647
    • Number of calculated measures in a table: (2^31) - 1 = 2,147,483,647
    • PowerPivot memory size for saving a workbook: 4 GB = 4,294,967,296 bytes
    • Concurrent requests per workbook: 6
    • Local cube connections: 5
    • Number of distinct values in a column: 1,999,999,997
    • Number of rows in a table: 1,999,999,997
    • String length: 536,870,912 bytes (512 MB), equivalent to 268,435,456 Unicode characters (256 mega characters)

     

    Power Pivot is a feature of Excel 2013 (Office Professional Plus). 

    Action Steps:

    Power View

    More than just a graphing interface, Power View surfaces an interactive work surface for exploring visual data.  The technology targets both Excel and SharePoint.  The version for Excel is currently based on Silverlight.

    Microsoft provides a few guidance documents for Power View specifications.  Key to making Power View for Excel work is making sure the prerequisite Silverlight is available.  Viewing the results in browsers requires considering which specific browser version is used.

    Power View is a feature of Excel 2013 (Office Professional Plus). 

    Action Steps:

    Power Map

    The fourth and final Excel feature for Power BI pairs Bing Maps with Excel.  The technology goes beyond the standard 2D maps available in Excel or Power View, extending mapping into three-dimensional views which can be turned into movies.


    Within Excel, Power Map shares the same earlier-mentioned size limitations of Excel 32-bit (x86) or 64-bit (x64).

    Action Steps:

    Power BI for Office 365

    As I mentioned earlier, Power BI for Office 365 extends what Power BI can achieve with Excel alone.  The two main features of Office 365 are collaboration (working with teams) and portability (working across locations and devices).  The technology is viewed through web browsers, and therefore lets teams use whatever laptop, tablet, or smartphone devices they have already invested in.  As mentioned in the licensing section, obtaining this software also includes the legacy on-premises versions of Microsoft Office, demonstrating the desire to keep this new web world connected with the already familiar Office experience.  Because of the streamlined licensing structure, I also emphasize and promote Office 365 as the primary way to experience Power BI collaboration.  It is true that collaboration can happen with SharePoint, and that software is licensed separately from the lengthy options already discussed.

    The components of Office 365 are:

    The first action step is to view the two minute preview video on Office 365.

    Power BI Sites

    This technology is comprised of websites you make with your Power BI content.  Organizing your information on the web allows collaboration (teams) and portability (location and devices).  This feature is considered an aspect of SharePoint online.  Click this link to see the preview video.

     

    Power BI Q&A (Question and Answer)

    This technology opens up a way to query inside data stored on Power BI sites.  Based on Bing search technology, the interface interprets natural language entries and provides data results.  Click this link to see the preview video.

     

    Query and Data Management

    These features are an inherent part of the Power BI sites.  Collaboration requires knowing what happens with that information.

     

    Click this link to study the Data Management Experience in Power BI Office 365.

    Power BI Windows Store App

    Office 365 already allows for HTML5 rendering for mobile devices (including laptops, tablets and smartphones).  Though, the native Power BI Windows Store App extends that functionality for Windows devices.  The consumption experience matches Power BI sites.  Click here for the preview video.

    Managing Power BI for Office 365

    As mentioned in the introduction, there are other features which involve managing Office 365.  We can expect these features to grow and improve, as Microsoft often asks the community for input on what features people would find useful.

    IT (Information Technology) Infrastructure Services for Power BI Office 365

    Power BI with Excel 2010

    The following chart summarizes the key elements of Power BI, and what would be available for users of Excel 2010.

    • Power Pivot: A version one of Power Pivot is available for Excel 2010.  Microsoft’s website provides video, demos, and hands-on labs to try out the software.  Opening an Excel 2010 Power Pivot workbook in Excel 2013 requires an irreversible upgrade (meaning that collaboration between Excel 2010 and Excel 2013 users is not possible; pick one version or the other).  There are some technical details involved in upgrading, which you can study by clicking this link.

    • Power Map: Power Map is not available for Excel 2010.  Versions of Excel previous to 2002 had a native map feature.  Though, advanced Excel users will know about Microsoft MapPoint, which has gone through nineteen versions since its debut.  Sadly, MapPoint is being discontinued as of December 31, 2014.  You might be able to obtain a copy now, or through some MSDN subscriptions.

    • Power Query: Power Query is available for Microsoft Office 2010 Professional Plus with Software Assurance.  There are two versions, one for 32-bit (x86) and one for 64-bit (x64).

    • Power View: Power View is not available for Excel 2010.

    Recommended Resources

    First, the free Microsoft digital books – these books include some Power BI topics like DAX, but generally all types of Microsoft topics:

    Next, I generally recommend reading Microsoft’s website and documentation http://msdn.microsoft.com.  Often, the documentation integrates video and demos: Microsoft has become better at doing that.  A general site for Power BI video is Channel 9, which has many videos from Microsoft technical conferences.

    Finally, there are books you buy.  Many of my MVP friends and other equally-skilled professionals have authored books on Power BI topics.  More continue to be published, some of them came out this month (July 2014), and most of them are generally available in both hardcopy and digital formats.  All these listed books are on some aspect of Power BI.

    Collie, R. (2012). DAX Formulas for PowerPivot: A Simple Guide to the Excel Revolution: Holy Macro! Books.

    de Jonge, K. (2014). Dashboarding and Reporting with Power Pivot and Excel: How to Design and Create a Financial Dashboard with PowerPivot – End to End: Holy Macro! Books.

    Ferrari, A., & Russo, M. (2013). Microsoft Excel 2013 Building Data Models with PowerPivot: Microsoft Press.

    Jelen, B., & Collie, R. (2014). PowerPivot Alchemy: Patterns and Techniques for Excel: Holy Macro! Books.

    Larson, B., Davis, M., English, D., & Purington, P. (2012). Visualizing Data with Microsoft Power View: McGraw-Hill/Osborne Media.

    Webb, C. (2014). Power Query for Power BI and Excel: Apress.  

    Summary

    Hopefully, this report provides good background information on Power BI.  Feel free to contact me with feedback, either through my website http://marktab.net or on Twitter @marktabnet.

     

    About the author


    Mark provides enterprise data science analytics advice and solutions. He uses Microsoft Azure Machine Learning, Microsoft SQL Server Data Mining, SAS, SPSS, R, and Hadoop (among other tools). He works with Microsoft BI (SSAS, SSIS, SSRS, SharePoint).

    Mark has been a public voice for analytics since 1998: Microsoft TechEd, PASS Business Analytics Conference, Predictive Analytics World, SAS Global Forum, PASS Summit.  He is a SQL Server MVP, a trainer and consultant with SolidQ, and teaches part-time at the University of Phoenix.  His blog is at http://marktab.net

     About MVP Mondays

    The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.




    Editor’s note: The following post was written by Visual C# MVP Ming Man Chan 

    In order to connect to and manipulate the data in Microsoft CRM 2013, developers can download the MS CRM 2013 SDK (http://www.microsoft.com/en-my/download/details.aspx?id=40321). The CRM SDK offers different patterns for the CRUD (Create, Read, Update, and Delete) operations. The simplest pattern for me is using a LINQ expression.  In my experience, not all update operations can use LINQ expressions, but you can definitely do Read operations easily with LINQ.

    This article is divided into three subsections that show how to retrieve and update contact details in the MS CRM 2013 Contact entity.

    • Connect to CRM 2013 using C#.
    • Retrieve all the contacts using a LINQ statement.
    • Update the contact information using a LINQ statement.

    Connect to CRM 2013 using C#

    Once you have downloaded the SDK, you will have a file named MicrosoftDynamicsCRM2013SDK.exe. When you run the executable, the installer will ask you for a location to unzip the files. Choose a location, for example C:\MSCRMSDK.

    Open the folder where you unzipped the SDK and you will see a folder named SDK and a file named mscrmeula.txt. The code that we want to look at is located in SDK\Walkthroughs\Portal\ConsoleAppWalkthrough.

    Double-click the ConsoleAppWalkthrough.csproj file, provided you have either Visual Studio 2012 or Visual Studio 2013 installed. The default sample does not do the contact update.

    You will now see the following project structure in Visual Studio under the Solution Explorer.

     

    Double click to open the App.config then you will see the file in the following XML format.

    <?xml version="1.0"?>
    <configuration>
      <configSections>
        <section name="microsoft.xrm.client" type="Microsoft.Xrm.Client.Configuration.CrmSection, Microsoft.Xrm.Client"/>
      </configSections>
      <connectionStrings>
        <!--<add name="Xrm" connectionString="Server=http://crm/contoso; Domain=CONTOSO; Username=Administrator; Password=pass@word1"/>-->
        <add name="Xrm" connectionString="Url=http://WIN-70BO4R8PVDT:5555/CRM2013MM; Domain=LOCALCRM; Username=mmchan; Password=Password;"/>
      </connectionStrings>
      <microsoft.xrm.client>
        <contexts default="Xrm">
          <add name="Xrm" type="Xrm.XrmServiceContext, Xrm" connectionStringName="Xrm"/>
        </contexts>
      </microsoft.xrm.client>
      <startup>
        <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0"/>
      </startup>
    </configuration>

    Pay attention to the entry:

    <add name="Xrm" connectionString="Url=http://WIN-70BO4R8PVDT:5555/CRM2013MM; Domain=LOCALCRM; Username=mmchan; Password=Password;"/>

    This is the key connection string that lets you connect to the MS CRM 2013 server.
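    For completeness, here is how a console program can construct the context from that named connection string. This is a minimal sketch, assuming the SDK-generated Xrm.XrmServiceContext class from the ConsoleAppWalkthrough project; check the walkthrough's Program.cs for the exact pattern used in your SDK version.

```csharp
using System;
using Xrm; // namespace of the SDK-generated XrmServiceContext

class Program
{
    static void Main()
    {
        // "Xrm" is the name attribute of the <add> entry in App.config;
        // the generated context resolves the connection string by that name.
        using (var xrm = new XrmServiceContext("Xrm"))
        {
            Console.WriteLine("Connected to the CRM organization.");
            // The retrieve and update methods in this article take this
            // context as a parameter, e.g. GetContact(xrm).
        }
    }
}
```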

     

    Retrieve all the contacts using a LINQ statement

    At first, I could not use the code below, which comes with the SDK.

    private static void WriteExampleContacts(XrmServiceContext xrm)

    {

           var exampleContacts = xrm.ContactSet

                  .Where(c => c.EMailAddress1.EndsWith("@example.com"));

     

           //write the example Contacts

           foreach (var contact in exampleContacts)           

           {

                  Console.WriteLine(contact.FullName);

           }

    }

    It gave me an "Access is denied" error. For that reason, I created a similar method of my own.

    private static void GetContact(XrmServiceContext xrm)

    {

        var result = from r in xrm.ContactSet where r.EMailAddress1.EndsWith("@example.com") select r;

     

        foreach (Contact contact in result)

        {

            Console.WriteLine(contact.EMailAddress1);

        }

    }

    The above method extracts the contacts in MS CRM 2013. The line:

    var result = from r in xrm.ContactSet where r.EMailAddress1.EndsWith("@example.com") select r;

    filters all the contacts whose email address ends with “@example.com”. You will see the listing as below.

     

    In order to see any results you must either type in some contacts whose email addresses end with @example.com or install the sample data for Dynamics CRM 2013. To install sample data for Dynamics CRM 2013, refer to https://community.dynamics.com/crm/b/crmpowerobjects/archive/2013/11/01/adding-and-removing-sample-data-in-dynamics-crm-2013.aspx.

    Update the contact information using a LINQ statement

    The sample code has a section that inserts a contact record as follows.

    var allisonBrown = new Xrm.Contact

    {

           FirstName = "Allison",

           LastName = "Brown",

           Address1_Line1 = "23 Market St.",

           Address1_City = "Sammamish",

           Address1_StateOrProvince = "MT",

           Address1_PostalCode = "99999",

           Telephone1 = "12345678",

           EMailAddress1 = "allison.brown@example.com"

    };

    This article adds a method to update the record that the above code inserted.

    private static void UpdateContact(XrmServiceContext xrm)

    {

        var result = from r in xrm.ContactSet where r.EMailAddress1 == "allison.brown@example.com" select r;

     

        result.FirstOrDefault().EMailAddress1 = "allisonb@example.com";

        xrm.UpdateObject(result.FirstOrDefault());   

        xrm.SaveChanges();

    }

    The line below retrieves the added record by its email address (let us assume in this article that the email address is unique).

    var result = from r in xrm.ContactSet where r.EMailAddress1 == "allison.brown@example.com" select r;

    We then change the email address.

    result.FirstOrDefault().EMailAddress1 = "allisonb@example.com";

     

    Finally, we save all the changes by using the SaveChanges() method.

    xrm.UpdateObject(result.FirstOrDefault());

    xrm.SaveChanges();

    If you call the GetContact method now, you should see the listing as follows.

     

    You can use the same approach to update all the MS Dynamics CRM 2013 entities, except for those entities that have dependencies on other entities.

     

    About the author

     

    Ming Man has been a Microsoft MVP since 2006. He is a software development manager for a multinational company. With 25 years of experience in the IT field, he has developed systems using Clipper, COBOL, VB5, VB6, VB.NET, Java and C#. He has been using Visual Studio (.NET) since the beta back in 2000.  He and his team have developed many projects using the .NET platform, such as SCM and HR based applications. He is familiar with the N-Tier design of business applications and is also an expert with database experience in MS SQL, Oracle and AS 400.  Additionally, you can read Ming’s blog, Channingham’s blog.

     




    Editor’s note: In partnership with Microsoft Press, now celebrating their 30th year, MVPs have been contributing to an ongoing guest series on their official team blog. Today’s article is from Cluster MVP David Bermingham which is the 44th in the series.

    SQL Server High Availability in Windows Azure IaaS

    When deploying SQL Server in Windows Azure you must consider how to minimize both planned and unplanned downtime. Because you have given up control of the physical infrastructure, you cannot always determine when maintenance periods will occur. Also, just because you have given control of your infrastructure to Microsoft, it does not guarantee that you are not susceptible to some of the same types of outages that you might expect in your own data center. To minimize the impact of both planned and unplanned downtime Microsoft provides what are called Fault Domains and Upgrade Domains. By leveraging Upgrade Domains and Fault Domains and deploying either SQL Server AlwaysOn Availability Groups (AOAG) or AlwaysOn Failover Cluster Instances (AOFCI) you can help minimize both planned and unplanned downtime in your SQL Server Windows Azure deployment. Throughout this document when I refer to a SQL Server Cluster, I am referring to both AOAG and AOFCI. When needed, I will refer to AOAG and AOFCI specifically.

    Fault Domains are essentially “a rack of servers”, with no common single point of failure between different Fault Domains, including different power supplies and network switches. An Update Domain ensures that when Microsoft is doing planned maintenance, only one Update Domain is worked on at a given time. This eliminates the possibility that Microsoft would accidentally reboot all of your servers at the same time, assuming that each server is in a different Update Domain.

    When you provision your Azure VM instances in the same Availability Set, you are ensuring that each VM instance is in a different Update Domain and Fault Domain…to an extent. You probably want to read Manage The Availability of Virtual Machines to completely understand how VMs get provisioned in different Fault Domains and Update Domains. The important part of the availability equation is ensuring that the VMs participating in your SQL Server cluster are isolated from each other, so that the failure of a single Fault Domain or maintenance in an Update Domain does not impact all of your Azure instances at the same time.

    So that is all you need to know….right? Well, not exactly. Azure IaaS does not behave exactly like your traditional infrastructure when it comes to clustering. In fact, before July of 2013, you could not even create a workable cluster in Azure IaaS. It wasn’t until then that they released hotfix KB2854082 that made it possible. Even with that hotfix there are still a few considerations and limitations when it comes to highly available SQL Server in Windows Azure.

    Before we dive into the considerations and limitations, you need to understand a few basic Azure terms. These are not ALL the possible terms you need to know to be an Azure administrator, but these are the terms we will be discussing that are specific to configuring highly available SQL Server in Azure IaaS.

    Continue reading full article here


    About the author


    David Bermingham is recognized within the technology community as a high availability expert and has been honored by his peers by being elected a Microsoft MVP in Clustering since 2010. David’s work as director of Technical Evangelism at SIOS has him focused on evangelizing Microsoft high availability and disaster recovery solutions as well as providing hands-on support, training and professional services for cluster implementations. David holds numerous technical certifications and draws from over twenty years of experience in IT, including work in the finance, healthcare and education fields, to help organizations design solutions to meet their high availability and disaster recovery needs. David has recently begun speaking on deploying highly available SQL Servers in the Azure Cloud and deploying Azure Hybrid Cloud for disaster recovery.


     

     



    MVP Mohit Singh Baweja with Imagine Cup Team Estimeet

    34 teams from all over the world met in Seattle for the 2014 Imagine Cup World Finals.  These were the best of the best, selected from tens of thousands of students, and each of their projects was unique and impressive.  When it came to innovation, one team stood above the rest.

    "We're feeling pretty unreal, it feels like a dream," says Imagine Cup Innovation Winners, Team Estimeet.  The New Zealand team took the top prize in the innovation category with their Windows Phone app that provides a user-friendly solution to a common problem among friends, being late.  Everyone has one or two friends who always seem to be running late. They don't respond to text messages or calls and leave you wondering when they'll arrive. With Estimeet, you will no longer have to make any more stressful texts and calls to your friends. Estimeet shows you all your friends’ distance and estimated time of arrival from the meeting location as well as whether they are on their way. We want to spend less time worrying about when our friends will show up, and more time having fun.

    Windows Expert-Consumer MVP Mohit Singh Baweja volunteered his time and expertise to mentor Team Estimeet during the competition.  "Mohit has been a very good mentor to us," said Team Leader Hayden Do.  "He allowed us to dive further into our project and helped us with our execution." 

    MVP Mohit is no stranger to the Imagine Cup, having competed himself 3 years in a row.  "Over the course of the competition, I saw a huge boost in everyone's self confidence," said Mohit, "which reflected not only through their presentations, but also in seeing their personalities evolve to be the entrepreneurs of tomorrow.  Their experience of getting media coverage, pitching constantly in front of industry leaders and representing New Zealand on a world stage, played a huge part!"

    Check out this great video demo of Estimeet


     



     

     

     



    Editor’s note: The following post was written by Visual Studio ALM MVP Amir Barylko 

    Acceptance Testing Made Easy with F# and Canopy

    Why acceptance testing?

    Writing tests is part of our development cycle. Most teams I work with have some kind of testing in place.

    In most cases I found that the tests are a mix of unit tests and integration tests.

    A unit test is a test for one class or, more specifically, one method. All the collaborators should be faked in order to avoid coupling and to make sure the test is for the class and not for the dependencies.

    An integration test is when two or more classes are involved. Integration is usually used when there are doubts about how the classes will work together (though a unit test should take care of that) and because writing an acceptance test is very difficult.
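    To make the distinction concrete, here is a minimal sketch of a unit test in C# (all type names here are hypothetical, NUnit is used for the assertion, and the collaborator is replaced by a hand-rolled fake so only one class is exercised):

```csharp
using NUnit.Framework;

// Hypothetical types for illustration only.
public interface IMovieRepository { int Count(); }

// The class under test; its collaborator is injected.
public class MovieCatalog
{
    private readonly IMovieRepository _repo;
    public MovieCatalog(IMovieRepository repo) { _repo = repo; }
    public int TotalMovies() { return _repo.Count(); }
}

// The fake stands in for the real repository, so the test never
// touches a database and stays a true unit test.
class FakeRepository : IMovieRepository
{
    public int Count() { return 3; }
}

[TestFixture]
public class MovieCatalogTests
{
    [Test]
    public void Catalog_reports_the_repository_count()
    {
        var catalog = new MovieCatalog(new FakeRepository());
        Assert.AreEqual(3, catalog.TotalMovies());
    }
}
```

An integration test would instead pass the real repository (and hit the database), which is exactly the coupling the fake avoids.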

    So far, so good—however, it is not enough.

    We have tested the classes and we have figured out that some of them work fine together. Does that imply that each feature is working as expected? Unfortunately it doesn’t, so we need to go a bit further.

    The missing test

    To ensure that the functionality the user sees is the expected one, we need a test that works the same way the user does: end to end, with no clue about classes, methods, etc.

    This is known as acceptance testing, and because we do not know what is involved it is also called black box testing.

    Acceptance can be done automatically, manually, or through a combination of both. Clearly, the automated version, if available, will bring lots of benefits and save us lots of time.

    Furthermore, we can use acceptance testing to drive the feature, in that we can build code until the acceptance tests pass.

    As tempting as it sounds, the question is then why do most projects not usually write acceptance tests? The answer is simple: because it is hard! How can we make it easier?

    Enter Canopy

    Canopy is an F# library that provides a DSL on top of Selenium to make writing acceptance tests for web applications quite easy. Add to the mix F#’s descriptive power and we get a winning team that can save you time and effort.

    To demonstrate how to use Canopy I am going to use a project that I wrote some time ago. This project uses Knockout.js and Coffeescript to build a rich website that shows a collection of movies and lets us add movies to the collection.

    You can download the code and try it for yourself from Github.

    Baby steps

    The MVC application shows a welcome page first.

     

    On the left you can see the menu with two options. The link that says “Demo binding” leads us to the actual demo, and the list of movies loaded from the database is displayed. I am using MongoDB in this case but you can run the example with an in-memory collection if you wish.

     

    I am going to write a series of acceptance tests that will test:

    • Listing movies from the database
    • Listing the default movies when the database is empty
    • Adding movies to the list

    The acceptance project

    In order to create an acceptance test project we need to create an F# console application.

    We need to install the Canopy NuGet package into the project.

    Then on Program.fs we can enter the basic skeleton (that you can read about on the Canopy website).

    The basics

    If we were to test the site manually, the steps to do so would be something like:

    • Launch a browser
    • Open the website
    • Click the link to get to the demo
    • Do the testing
    • Check the results
    • Close the browser

    And we are going to do exactly that, but by automating every step.

    The first part is the same for all the tests. Here we can see the basic structure.

    start firefox

    url "http://localhost:1419/"

    click "Demo binding"

     

    // Your test goes here

     

    //run all tests

    run()

     

    printfn "press [enter] to exit"

    System.Console.ReadLine() |> ignore

     

    quit()

    It is almost the same as writing in English! Isn’t that nice?

    Listing the default movies

    Now that we have the basic structure I want to test that when my database is empty the default movies are loaded and shown on the page.

    I am going to write what I want first and then we will add the support methods to make it happen.

    I see the scenario using the following steps:

    • Clear the database (to make sure it is empty)
    • Go to the movie list page
    • Check that the movies listed are the ones defined by default

    Below you can see the steps written in F#.

    context "When the database is empty"

     

    // Given I have no movies

    // When I open the list of movies

    before(clearMovies >> openMainPage)

     

    "the default movies are listed" &&& fun _ ->

        actualMovies() |> should equal defaultMovies

    The keyword context creates a new scenario that describes our test. The text for the context will be printed to the console.

    The keyword before runs a function passed as a parameter before each test.

    My test is about ensuring the listed movies (actual movies) are the same as the expected ones (default movies). I am using FsUnit for the assertions.

    Let’s look at the missing bits we need to complete the test.

    let repo = MovieRepository()

     

    let clearMovies = repo.Clear

    clearMovies is a function that uses the MovieRepository class instance and calls the method Clear.

    let openMainPage = fun _ ->

        url "http://localhost:1419/"

        click "Demo binding"

    openMainPage is a function that navigates to the URL running my website and then follows the link to enter the demo.

    let actualMovies = fun _ -> (elements ".movie .name") |> Seq.map(elementText)

     

    let defaultMovies = MovieRepository.DefaultMovies |> Seq.map(fun m -> m.Title)

    actualMovies is a function that uses the elements function, passing a CSS selector to obtain all the HTML elements that have the class "name" inside a parent element with the class "movie", and takes the text of each element. That should return the movie titles.

    defaultMovies is a function that invokes the DefaultMovies method on the MovieRepository class and selects the title from each movie.

    When tests are run the console output shows the results.

     

    Adding new movies

    To test adding a new movie the scenario would look something like:

    • Click the add movie button
    • Enter the title and release date
    • Press the save button
    • Check that the movie is listed

    And that’s exactly what I am going to write in my F# Canopy test:

    context "When adding a movie to the list and is saved"

     

    before(clearMovies >> openMainPage >> addNewMovie >> (fun _ -> click ".icon-ok-sign"))

    "the new movie is listed" &&& fun _ ->

        let expected = Seq.append defaultMovies ["High Anxiety"]

        actualMovies() |> should equal expected

    The before function, similar to the previous test, first does the cleaning and opens the main page with the movie list. The difference is that after showing the list a new movie is added by entering the information and saving the new entry to the list.

    let addNewMovie = fun _ ->

        click ".your-movies button"

        ".new-movie input" << "High Anxiety"

        (last ".new-movie input") << "Sep 1, 1978"

     

    addNewMovie is a function that clicks the button to add a new movie and then enters the values for the title and release date fields. To do so the function uses the << operator, which takes a CSS selector and a value.

    The cherry on top

    Please go ahead and check the code to see all the tests in action and play with them. There is also an example using NUnit with Canopy.

    In the code on GitHub, the startup process uses PhantomJS as the browser.

    That way I can have a headless browser without having to launch a window and run my tests on a CI server.

    Using Canopy plus F# is a killer combination for writing acceptance tests that are easy to understand and maintain and that are quick to produce. All the tools come out of the box—we just need to plug in the code and enjoy.

     

    About the author

    Amir Barylko started his career in 1994 working for IBM as a senior developer while he was finishing his Master’s degree in computer science. Since then he has worked as a team leader and architect for the past 15 years. Having started with languages like C++ and Java, he spent many years coding in C# and training other developers in topics such as domain modelling, abstractions, patterns, automation, dependency injection, testing, etc. Being an incurable geek, always thirsty for knowledge, his passion for technology moved him towards Ruby on Rails (RoR) a few years ago, becoming an advocate of RoR web development.

    Looking for new ways of sharing his knowledge and helping others to achieve their goals motivated Amir to become an owner and build a lean management tool called SmartView (http://smartviewapp.com) that was released in 2014. Amir is a rare combination of high technical skills, experience in a wide range of platforms, exceptional presentation skills and great sense of humour. His presentations are always rich in content and fun to attend.  Follow Amir on Twitter @abarylko

     




    1. PoC: Tatoo the background of your virtual machines

    By Emin Atac – PowerShell MVP

    2. Completing Real Estate Transactions with OneDrive and OneNote

    By Corey Roth – SharePoint MVP

    3. PowerShell Prank #1: R2D2

    By Joseph Moody – Software Packaging and Deployment MVP

    4. Video – Introducing Windows Server Software-Defined Storage

    By Aidan Finn – Hyper-V MVP

    5. Bait and Switch PCL explained

    By Olivier Matis – Windows Platform Development MVP




    09/08/14--11:07: Building the Eject-A-Bed

    Editor’s note: Instead of the usual content, we wanted to post something on the creative ways our MVPs are using our technology.  Enjoy!

    The following post was written by Visual F# MVP Jamie Dixon 

     

    Building the Eject-A-Bed

    My youngest child has a real hard time getting out of bed in the morning.  Instead of getting mad, I put my hacker hat on and got building.  I found a hospital inversion table on Craigslist and put it in the garage.

     

    My daughter, Sonoma, and I first wanted to understand how the bed controller works.  We initially thought that the controller used PWM, so we hooked up our oscilloscope to the two metal wires that travel from the controller to the bed’s motor.  We moved the controller up and down but no signal was being recorded.  Sonoma then noticed that the “wire” was a hollow tube.  Taking a wild guess, we blew down the pipe.  Sure enough, that made the bed move.

     

    So now we had to figure out how the controller moved air up and down the pipe.  We opened the controller and found a small bellows that attaches to the pipe.  Press the controller on one side and air is forced down the pipe; press the controller on the other side and air is sucked up the pipe.

     

    So we batted around ideas about how to push air up and down the pipe – ideas included using a small electric air compressor, some kind of mechanical pressure plate, etc.  We then decided to try gluing a servo to the switch and controlling the movement that way.  However, we couldn’t figure out how to attach the servo to the existing plastic switch.  So we decided to build our own switch – we used my son’s erector set to create the harness for the bellows.

     

    With the mechanics set up, it was time to code the controller.  I created a new Netduino project and added the following class-level variables and some methods to control the servo:

        private static OutputPort _led = null;
        private static InterruptPort _button = null;
        private static Socket _serverSocket = null;
        private static PWM _servo = null;

        private const uint SERVO_UP = 1250;
        private const uint SERVO_DOWN = 1750;
        private const uint SERVO_NEUTRAL = 1500;

        private static bool _servoReady = true;

        private static void SetUpServo()
        {
            uint period = 20000;
            uint duration = SERVO_NEUTRAL;

            _servo = new PWM(PWMChannels.PWM_PIN_D5, period, duration, PWM.ScaleFactor.Microseconds, false);
            _servo.Start();
            _servoReady = true;
            Thread.Sleep(2000);
        }

        private static void ActivateServoForBellows(String direction, Int32 duration)
        {
            if (direction == "UP")
            {
                _servo.Duration = SERVO_UP;
            }
            else if (direction == "DOWN")
            {
                _servo.Duration = SERVO_DOWN;
            }

            Thread.Sleep(duration);
            _servo.Duration = SERVO_NEUTRAL;
        }

     

    If you are not familiar, servos are controlled by pulses of low-voltage electricity.  The width of the pulse is interpreted by the servo, which moves correspondingly.  In this case, a pulse width of 1.5 milliseconds sends the servo to neutral, so the bellows is neither up nor down.  A pulse width of 1.25 milliseconds moves the servo to the position where the bellows is compressed, and a pulse width of 1.75 milliseconds moves the servo to the position where the bellows is expanded.  So the Netduino continuously sends a PWM signal of 1.5 milliseconds.  Once the width is changed to either 1.25 or 1.75, the servo moves and stays in that position (simulating a person holding the button down for a period of time).  Note that "Duration" (the PWM property) is the position of the servo, while "duration" (the method parameter) is how long the servo stays in that position.  This is what a PWM signal looks like on an oscilloscope:
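    To make the pulse-width numbers concrete, here is a quick back-of-the-envelope check (plain Python, not Netduino code) of the duty cycles implied by the constants above: a 20,000 µs period with pulse widths of 1250/1500/1750 µs.

```python
# Duty cycles implied by the PWM constants in the article
# (period 20,000 us, i.e. a standard 50 Hz servo frame).
period_us = 20000
pulses_us = {"up": 1250, "neutral": 1500, "down": 1750}

duty_cycles = {name: width / period_us for name, width in pulses_us.items()}
# neutral -> 7.5% duty, up -> 6.25%, down -> 8.75%
```

The takeaway is that the servo only cares about the pulse width within each 20 ms frame, not the overall duty cycle, which is why the constants are expressed in microseconds.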

     

     

    I then wired the Netduino to the servo.

     

     

    The next step was to have the Netduino receive commands to move the servo.  I purchased an inexpensive portable router and connected the Netduino to it.  I then coded up a series of methods that set up a listening socket and wait for requests.  If a valid request comes in, the servo moves:

        private static void SetUpWebServer()
        {
            _serverSocket = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
            IPEndPoint ipEndpoint = new IPEndPoint(IPAddress.Any, 80);
            _serverSocket.Bind(ipEndpoint);
            _serverSocket.Listen(10);
            ListenForWebRequest();
        }

        public static void ListenForWebRequest()
        {
            while (true)
            {
                using (Socket clientSocket = _serverSocket.Accept())
                {
                    if (_servoReady)
                    {
                        _servoReady = false;
                        String request = GetRequestFromSocket(clientSocket);
                        Thread thread = new Thread(() => HandleWebRequest(request));
                        thread.Start();
                        SendActivateResponse(clientSocket);
                    }
                    else
                    {
                        SendBusyResponse(clientSocket);
                    }
                }
            }
        }

     

        private static string GetRequestFromSocket(Socket clientSocket)
        {
            int bytesReceived = clientSocket.Available;
            if (bytesReceived > 0)
            {
                byte[] buffer = new byte[bytesReceived];
                int byteCount = clientSocket.Receive(buffer, bytesReceived, SocketFlags.None);
                return new String(Encoding.UTF8.GetChars(buffer));
            }
            return String.Empty;
        }

     

        private static void HandleWebRequest(String request)
        {
            RequestValues requestValues = GetRequestValuesFromWebRequest(request);
            Thread.Sleep(requestValues.Duration);
            if (requestValues.Duration > 0)
            {
                ActivateServoForBellows(requestValues.Direction, requestValues.Duration);
            }
            _servoReady = true;
        }

     

        public static RequestValues GetRequestValuesFromWebRequest(String request)
        {
            RequestValues requestValues = new RequestValues();

            if (request.Length > 0)
            {
                String[] chunkedRequest = request.Split('/');
                for (int i = 0; i < chunkedRequest.Length; i++)
                {
                    chunkedRequest[i] = chunkedRequest[i].Trim();
                    chunkedRequest[i] = chunkedRequest[i].ToUpper();
                }
                requestValues.Verb = chunkedRequest[0];
                requestValues.Direction = chunkedRequest[1];

                Int32 duration = 0;
                if (chunkedRequest[2] == "HTTP")
                {
                    duration = 15000;
                }
                else
                {
                    try
                    {
                        duration = Int32.Parse(chunkedRequest[2]);
                    }
                    catch (Exception)
                    {
                        try
                        {
                            String[] chunkedDuration = chunkedRequest[2].Split(' ');
                            duration = Int32.Parse(chunkedDuration[0]);
                        }
                        catch (Exception)
                        {
                        }
                    }
                }
                requestValues.Duration = duration;
            }

            return requestValues;
        }
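    To see how a URL maps onto these fields, here is a small Python sketch (an approximation of the C# parsing above, not the Netduino code itself) that applies the same split, trim, and upper-case logic to a raw request line:

```python
def parse_request(request):
    """Mimic GetRequestValuesFromWebRequest: split the raw HTTP request
    on '/', trim and upper-case the chunks, then pull out the verb,
    direction, and optional duration (defaulting to 15000 ms)."""
    verb, direction, duration = None, None, 0
    if request:
        chunks = [c.strip().upper() for c in request.split("/")]
        verb, direction = chunks[0], chunks[1]
        if chunks[2] == "HTTP":
            duration = 15000          # no duration supplied in the URL
        else:
            try:
                duration = int(chunks[2])
            except ValueError:
                try:
                    # the duration chunk also carries " HTTP", e.g. "5000 HTTP"
                    duration = int(chunks[2].split(" ")[0])
                except ValueError:
                    pass
    return verb, direction, duration

# "GET /down/5000 HTTP/1.1" -> ("GET", "DOWN", 5000)
# "GET /up/ HTTP/1.1"       -> ("GET", "UP", 15000)
```

So requesting a path like /up/5000 from any browser on the network runs the bellows "up" for five seconds, and omitting the duration falls back to the 15-second default.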

     

        private static void SendActivateResponse(Socket clientSocket)
        {
            String response = "The Eject-A-Bed activated at " + DateTime.Now.ToString();
            String header = "HTTP/1.0 200 OK\r\nContent-Type: text;charset=utf-8\r\nContent-Length: " +
                response.Length.ToString() + "\r\nConnection: close\r\n\r\n";

            clientSocket.Send(Encoding.UTF8.GetBytes(header), header.Length, SocketFlags.None);
            clientSocket.Send(Encoding.UTF8.GetBytes(response), response.Length, SocketFlags.None);
        }

        private static void SendBusyResponse(Socket clientSocket)
        {
            String response = "The Eject-A-Bed is busy at " + DateTime.Now.ToString();
            String header = "HTTP/1.0 503 Service Unavailable\r\nContent-Type: text;charset=utf-8\r\nContent-Length: " +
                response.Length.ToString() + "\r\nConnection: close\r\n\r\n";

            clientSocket.Send(Encoding.UTF8.GetBytes(header), header.Length, SocketFlags.None);
            clientSocket.Send(Encoding.UTF8.GetBytes(response), response.Length, SocketFlags.None);
        }
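    These responses are minimal hand-built HTTP/1.0 messages: a status line, two headers, a blank line, then the body.  One easy mistake with hand-built headers (in C#, for instance, a verbatim @-string turns "\r\n" into literal backslash characters) is failing to emit real CRLF bytes, so as a sanity check on the framing, here is a plain Python sketch with a hypothetical body string:

```python
def build_response(status, body):
    # Same shape as the Netduino responses: status line, Content-Type,
    # Content-Length (the body's character count), Connection: close,
    # then a blank line separating headers from the body.
    header = (
        "HTTP/1.0 " + status + "\r\n"
        "Content-Type: text;charset=utf-8\r\n"
        "Content-Length: " + str(len(body)) + "\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    return (header + body).encode("utf-8")

raw = build_response("200 OK", "The Eject-A-Bed activated")
```

The critical detail is the empty line (the double CRLF) between the headers and the body; without it, browsers treat the body as more headers.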

     

    And there is one data structure used to communicate the request:

        public class RequestValues
        {
            public String Verb { get; set; }
            public String Direction { get; set; }
            public Int32 Duration { get; set; }
        }

    So now, with a browser request on the local network, we have a way of actuating the hospital bed's screw drive via a wireless command:

     

     https://www.youtube.com/watch?v=bHZvxBTxPds&feature=player_embedded

     

    The next step involved no code but was still lots of fun (anytime you get to use air tools, it is fun).  We removed the screw drive from the hospital bed and attached it to my son's real bed:

     

    And then we can throw him out of bed via a simple HTTP request from anywhere in the house:

    https://www.youtube.com/watch?v=QJG2seTFcxQ

     

    About the author

    Jamie Dixon has been writing code for as long as he can remember and has been getting paid to do it since 1995.  He was using C# and JavaScript almost exclusively until discovering F#, and now combines all three languages for the problem at hand.  He has a passion for discovering overlooked gems in data sets and merging software engineering techniques with scientific computing.  When he codes for fun, he spends his time using the .NET Micro Framework with Phidgets, Netduinos, and the Kinect.  Follow him on Twitter @jamie_dixon

    About MVP Mondays

     

     

    The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager, formerly known as MVP Lead, for Messaging and Collaboration (Exchange, Lync, Office 365 and SharePoint) and Microsoft Dynamics in the US. She began her career at Microsoft as an Exchange Support Engineer and has been working with the technical community in some capacity for almost a decade. In her spare time she enjoys going to the gym, shopping for handbags, watching period and fantasy dramas, and spending time with her children and miniature Dachshund. Melissa lives in North Carolina and works out of the Microsoft Charlotte office.


     

     

