
Independent Experts. Real World Answers.


    Just in case you missed one of our outstanding MVP Monday posts, or want to revisit one you liked, we're celebrating the whole 2011 series today. The MVP guest-authoring series started in 2010, alongside the launch of Office 2010, with a special 10-day series called MVPs for Office 2010. That run was so popular we made it a regular series: MVP Mondays. 2011 marks the first full year of MVP contributions. Here are some highlights:

    • 45 original articles
    • 38 MVPs contributing from 9 countries and 17 areas of technical expertise
    • Special themes around SharePoint 2010, Windows Phone 7, Office 365 and Developer
    • New this year are multi-language articles (five in 2011)

    We’re looking forward to bringing you exciting new content in the coming year, including more multi-language articles as well as product-specific themes. Thanks to our readers for making this series so successful, and special thanks to our 2011 MVP contributors for taking the time to share their expertise. Quite simply, they rock!

    To view the article written by each MVP, please click on the article name.


    MVPs for Office 365: Getting Ready for Office 365

    By Office 365 MVP Chad Mosman 


    MVPs for Office 365: How to create a secured kiosk access with Office 365?

    By Office 365 MVP Arnaud Alcabez


    MVPs for Office 365: MVP Award Program Interviews our Office 365 Guest Bloggers!

    Office 365 MVPs Chad Mosman, Myles Jeffery, Arnaud Alcabez, Brett Hill, Daniel Trautman, Zoltan Zombory, and Martina Grom


    MVPs for Office 365: Microsoft Demonstrates Commitment to Service Updates

    By Office 365 MVP Brett Hill



    MVPs for Windows Phone 7: How To Sync Multiple Notebooks in OneNote Mobile on Windows Phone 7

    By Windows Phone MVP Adam Lein


    MVPs for Windows Phone 7: Syncing Outlook 2010 and Windows Phone

    By Windows Phone MVP Trent McMurray


    MVPs for Windows Phone 7: Quick Tips

    By Windows Phone Development MVP Kevin Wolf


    MVPs for Office 365: Organize Your Information Better in SharePoint Online Office 365 with Managed Metadata

    By Office 365 MVP Myles Jeffery



    Office 2010: Save Time and Create Incredible Content with Word 2010

    By Office System MVP Stephanie Krieger


    MVPs for Windows Phone 7: Out of Box Experience – Windows Live ID and the Benefits to Having One For Your Windows Phone 7

    By Windows Phone MVP Trent McMurray


    MVPs for Windows Phone 7: Blogging While on the Move with a Windows Phone

    By Windows Phone MVP Todd Ogasawara


    MVPs for Windows Phone 7: Building Location Service Applications in Windows Phone 7

    By Windows Phone Development MVP Wei-Meng Lee



    MVPs for SharePoint 2010: Debugging Techniques for SharePoint Online Applications

    By SharePoint Server MVP Corey Roth


    MVPs for SharePoint 2010: Rolling up News Articles in SharePoint Server 2010 with the Content Query Web Part

    By SharePoint Server MVP Randy Drisgill


    MVPs for SharePoint 2010: Modifying Ribbon Fonts and Styles for Publishing Page HTML Field Controls

    By SharePoint Server MVP Becky Bertram


    MVPs for SharePoint 2010: Office 365 – Enhance Productivity through SharePoint Online & Exchange Online

    By SharePoint Server MVP Razi bin Rais


    MVPs for SharePoint 2010: Quick Tips for Improving Search in SharePoint 2010

    By SharePoint Server MVP John Ross



    MVPs for Office 365: Lync Online Federation

    By Lync MVP David Lim


    MVPs for SharePoint 2010: Using Azure ACS v2 to Authenticate External Systems Users

    By SharePoint Server MVP David Martos


    MVPs for SharePoint 2010: Practical SharePoint Governance for Everyone

    By SharePoint Server MVP Kanwal Khipple


    MVPs for SharePoint 2010: Office 365: SharePoint Online & Instant Extranets

    By SharePoint Server MVP Kris Wagner



    MVPs for SharePoint 2010: Managing Metadata

    By SharePoint Server MVP Liam Cleary


    MVPs for SharePoint 2010: Why Social Networking on SharePoint 2010?

    By SharePoint Server MVP Natalya Voskresenskaya


    MVPs for SharePoint 2010: SharePoint Designer Workflow Tasks and InfoPath 2010

    By SharePoint Server MVP Laura Rogers


    MVPs for Office 365: How-To Administer Office 365 through the Exchange Control Panel

    By Exchange Server MVP J. Peter Bruzzese



    MVPs for Exchange Online: Calling Exchange Online PowerShell Cmdlets from C#

    By Exchange Server MVP Mike Pfeiffer


    MVPs for Office 2010: The Beauty of Transitions in PowerPoint 2010

    By PowerPoint MVP Glenna Shaw


    MVPs for Microsoft Dynamics: Microsoft Dynamics CRM – Building Consistency into Free Form Text Fields

    By CRM MVP Jerry Weinstock


    MVPs for Office 365: Establishing Calendar Sharing between Office 365 Customers

    By Office 365 MVP Loryan Strant



    Getting Up and Running with the TFS 2010 Object Model

    By Visual Studio ALM MVP Jeff Bramwell


    Getting Started with the Silverlight 5 Release Candidate

    By Silverlight MVP Michael Crump


    Using MVC as a REST Service that is Accessed by jQuery/JavaScript

    By Visual FoxPro MVP John Petersen


    Did You Miss This Top MVP Guest Post On SharePoint Governance?

    By SharePoint MVP Kanwal Khipple



    Getting the Most out of the Kinect SDK

    By Silverlight MVP Michael Crump


    Displaying Lync Online Meeting Appointment Details using the EWS Managed API

    By Exchange Server MVP Glen Scales


    Automated Build-Deploy-Test using TFS 2010

    By Visual Studio ALM MVP Anuj Chaudhary


    Applying Document Retention in SharePoint 2010

    By SharePoint Server MVP Becky Bertram



    PowerPoint 2010 and Excel 2010: Perfect Partners for Tracking Projects

    By PowerPoint MVP Glenna Shaw


    How DPM 2010 Could Protect Forefront TMG 2010 with a Minimum Opening of Feeds | Proteger Son Serveur Forefront TMG 2010 Avec DPM 2010

    By French Forefront: Architecture MVP Lionel Leperlier


    Basics of Touch Programming with WPF4 and the Surface 2.0 SDK | Bases de la Programmation Tactile avec WPF4 et le SDK

    By French Surface: Development MVP Nicolas Calvi


    Ever Heard of the ReportViewer Control?: Getting reacquainted with the ReportViewer for developers who don’t understand its potential

    By Data Platform Development: Training MVP William Vaughn



    Information Architecture in a SharePoint Context | Architecture d’Information dans un contexte SharePoint

    By French-Canadian SharePoint Server MVP Alain Lord


    Windows 7 Quick Tips | Petites Astuces Windows 7

    By French Windows Expert Consumer MVP Michel Martin


    Asynchronous Programming with the Reactive Extensions (while waiting for async/await) | Programmation Asynchrone avec les Reactive Extensions (en attendant async/await)

    By French Canadian Visual C# MVP Jerome Laban







    Welcome to the first MVP Friday Five for 2012! Each week, we highlight five MVP blogs that showcase some of the great tips, tricks and insights MVPs are known for—and that set them apart as community leaders. This week we have another great set of articles for your enjoyment.

    1. SharePoint & CRM Online Document Management

    By Dynamics CRM MVP Donna Edwards | @edwardsdna

    Donna takes us step-by-step through the process for integrating CRM Online with SharePoint Online to leverage SharePoint’s document management features.


    2. Enable Tab Order and Other Commands in Visual Studio Express Edition

    By Visual Basic MVP Rod Stephens

    In this how-to article, Rod shows you how to add a Tab Order command to Visual Studio Express Edition that lets you set the tab order of controls on a form by clicking them.


    3. Create a List of Rules in Outlook

    By Outlook MVP Diane Poremsky | @dicot

    Diane shares a VBA code sample that creates a text file containing a list of rules, in the order they are listed in Rules and Alerts.


    4. Layer 1 or Layer 2 Hypervisor? A Common Misconception of Hyper-V, and a Brief Explanation of the Parent Partition

    By Windows Expert- IT Pro MVP Mitch Garvis | @Mgarvis

    Mitch describes the difference between Layer 1 and Layer 2 hypervisors, clears up a common misconception about Hyper-V, and explains the parent partition using an illustrative story about how Montreal came to build a mall under a cathedral.


    5. SQL Server 2012 Windowing Functions Part 2 of 2: New Analytic Functions

    By SQL Server MVP Leonard Lobel

    In the second part of his two-part article on windowing functions in SQL Server 2012, Leonard explains the new T-SQL analytic functions.


    If you’re an MVP and would like your blog posts considered for our MVP Friday Five, please reach out to your MVP Lead or provide your URL in the comments section below!







    Editor's Note: The following MVP Monday post is by French SQL Server MVP David Barbarin. It was originally published in both English and French; the English version appears below.

    Why consolidate?

    Consolidation is a term everyone in IT knows by now. The number of consolidation projects has grown steadily in recent years, and databases are no exception. Is this just a passing fashion? No: consolidation brings real advantages. TCO reduction is certainly the strongest argument for financial managers, because the aim is to reduce infrastructure cost while making better use of hardware resources. Administration also becomes simpler for IT staff: the sprawl of SQL Server instances increases both administrative complexity and the total cost of licenses. The advent of virtualization has acted as a consolidation accelerator, since provisioning servers for production and other environments against a configuration standard becomes easy when the resources of one physical machine are shared among virtual hosts. Finally, reducing the number of physical servers also opens the way toward storage convergence, which allows more robust solutions in terms of fault tolerance, performance, and scalability.

    Consolidate onto one instance or several?

    When we talk about consolidation with SQL Server, the question of a single instance versus multiple instances comes up quickly. Keep in mind that a consolidated server hosts many databases, and therefore many applications. Several factors shape the target architecture: security, server collation, resource control, and the mix of SQL Server versions in the existing environment. This step is very important and should not be overlooked.

    Each application has its own security prerequisites. Some require server-level permissions, or even administrator privileges on the SQL Server instance. That set of privileges can be reduced only in certain cases, and granting it can compromise the security of the other databases. Setting up a dedicated instance can then be a good option: it gives the application control of its own SQL Server instance while isolating the security of the other databases.

    Furthermore, the server collation determines the collation used by the system databases such as model and tempdb. You must ensure that all databases on the consolidated instance that use tempdb have a compatible collation, to avoid conflicts during join operations, for example.
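
    To make the collation pitfall concrete, here is a hypothetical sketch (the table and column names are invented): a database whose collation differs from the server collation hits a conflict when joining its columns to a temporary table, because temporary tables inherit the tempdb (server) collation.

    ```sql
    -- #Lookup is created in tempdb, so its column uses the server collation;
    -- dbo.Customers uses the user database's collation. If the two differ,
    -- the join fails with "Cannot resolve the collation conflict".
    CREATE TABLE #Lookup (Name NVARCHAR(50));

    SELECT c.Name
    FROM   dbo.Customers AS c
    JOIN   #Lookup       AS l
           ON c.Name = l.Name COLLATE DATABASE_DEFAULT;  -- explicit workaround
    ```

    The COLLATE clause works around the conflict at query time, but choosing compatible collations up front avoids scattering such fixes across application code.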

    An assessment of the resources an application consumes on the database server must also be made before deployment to production. Each application consumes a certain amount of memory, processor, network, and disk resources, and the purpose of the assessment is to verify that no database, or set of databases, starves the others. In some contexts, SLAs exist that require a minimum threshold of resources to guarantee an adequate level of service for an application. SQL Server 2008 introduced an interesting feature that can meet this requirement: Resource Governor, which controls and limits an application's resource consumption through resource pools and workload groups. Unfortunately, this feature is available only in Enterprise Edition; when it is not available, a dedicated instance is the only alternative left.
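
    As a minimal sketch of the Resource Governor approach (the pool, group, and login names are invented for illustration; this runs in master on Enterprise Edition), one application's sessions can be capped like this:

    ```sql
    -- Cap CPU and memory for sessions belonging to one application
    CREATE RESOURCE POOL AppXPool
    WITH (MAX_CPU_PERCENT = 30, MAX_MEMORY_PERCENT = 25);
    GO

    CREATE WORKLOAD GROUP AppXGroup
    USING AppXPool;
    GO

    -- Classifier function: route sessions to a group by login name
    CREATE FUNCTION dbo.fn_classifier() RETURNS SYSNAME
    WITH SCHEMABINDING
    AS
    BEGIN
        DECLARE @grp SYSNAME = N'default';
        IF SUSER_SNAME() = N'AppXLogin'
            SET @grp = N'AppXGroup';
        RETURN @grp;
    END;
    GO

    ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_classifier);
    ALTER RESOURCE GOVERNOR RECONFIGURE;
    ```

    New sessions are then classified into AppXGroup and inherit the pool's limits, which keeps one noisy application from starving the other consolidated databases.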

    Finally, we will often face an environment with many different versions of SQL Server. In this case there are two options: upgrade the application to the version chosen as the consolidation standard, or add an instance with the required version. Several instances of different versions will then coexist on the same consolidated server, and each application is routed to the instance matching its version. Before installing a SQL Server instance of a specific version, however, the application vendor should be consulted to verify whether migration to the target version is supported. In most cases, the database compatibility level can be used to migrate gradually without the overhead of managing an additional instance and the resources it monopolizes.
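
    For example (the database name is invented), a database migrated from an older instance can keep its legacy T-SQL behavior on the consolidated instance by lowering its compatibility level:

    ```sql
    -- Host a former SQL Server 2005 database on a SQL Server 2008 instance
    -- while preserving SQL Server 2005 (level 90) language behavior
    ALTER DATABASE LegacyAppDB SET COMPATIBILITY_LEVEL = 90;
    ```

    The compatibility level affects T-SQL behavior only, so it is a stopgap for a gradual migration rather than a substitute for testing the application on the newer version.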

    Consolidation and High Availability

    High availability is a concept to keep in mind at all times when consolidating. The TCO savings across the SQL Server infrastructure can be reinvested in a high-availability infrastructure. RPO (Recovery Point Objective) and RTO (Recovery Time Objective) are generally the safety keys that determine the target architecture, along with, of course, the budget. The same reasoning applies to storage. The sprawl of standalone SQL Server instances on dedicated physical servers often involves local storage or JBODs (Just a Bunch Of Disks). Consolidating SQL Server instances is often the occasion to converge that local storage onto a more centralized, efficient, and scalable solution. Companies will generally opt for a SAN (Storage Area Network), which provides more advantages than local storage or DAS (Direct Attached Storage), although the raw performance advantage of SAN over DAS is not so obvious. For example, backing up local disks with backup agents over the network is much more complex with DAS, and during a hardware failure of a server, the storage can easily be reattached to a standby server with a SAN.

    Why integrate a high-availability solution into the consolidation? A consolidated infrastructure centralizes the hosting of all the databases of the information system, which makes it much more critical: stopping a consolidated instance has a far greater business impact, because all of its databases and applications become unavailable at once. Maintenance operations such as security updates or service packs become more complex, and can degrade the business for a long time (depending on the operation) if no failover solution is in place. It is therefore important to secure this environment and guarantee a certain level of service.

    Consolidation plan

    What are the elements to consider for a successful SQL Server consolidation project? Once the SQL Server instances to consolidate have been identified, we start by assessing the server resources needed, in order to size the target server. We must then identify, individually, the resources consumed by each database or set of databases of an application, to ensure they get a level of service similar to, or better than, the old environment. SQL Server 2008 R2 provides a good feature for this: the Utility Control Point (UCP), which centralizes information about the CPU and disk space actually used by SQL Server instances. Unfortunately, UCP can only manage instances at version 10.0.4 or later (SQL Server 2008 SP2 and later). 32-bit architectures are deprecated today, and consolidated environments should be built on 64-bit architectures, which remove the 4 GB memory limitation and increase the working set available to SQL Server.

    In addition, the tempdb database must be sized correctly. Keep in mind that there is only one tempdb per SQL Server instance; on a consolidated instance this matters all the more, because tempdb can be used heavily depending on the number of databases and their workload. A good practice is to place tempdb on dedicated, high-performance storage. Determining the number of data files to create is harder. A common rule of thumb is one file per processor, or one file per two processors, depending on the SQL Server version, but the right number ultimately depends on the overall workload, which monitoring will reveal.
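
    Following the rule of thumb above, presizing tempdb and adding equally sized data files might look like this (the file path, sizes, and growth increments are illustrative, not prescriptive):

    ```sql
    -- Presize the primary tempdb data file with a fixed growth increment
    ALTER DATABASE tempdb
    MODIFY FILE (NAME = tempdev, SIZE = 4096MB, FILEGROWTH = 512MB);

    -- Add a second, equally sized data file; repeat for as many files
    -- as the files-per-processor guidance and your monitoring suggest
    ALTER DATABASE tempdb
    ADD FILE (NAME = tempdev2,
              FILENAME = N'T:\tempdb\tempdev2.ndf',
              SIZE = 4096MB, FILEGROWTH = 512MB);
    ```

    Keeping the files the same size and growth increment matters, because SQL Server's proportional-fill algorithm only spreads allocation contention evenly across files of equal size.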

    Finally, we must also consider overlapping login accounts and security models. It can happen that several applications use the same login with different passwords, or that the security model used by an application's login is not compatible with the target architecture. If nothing can be changed, an additional instance will probably be needed to host the affected applications.



    In a perfect world, consolidating SQL Server instances would be a simple task. In reality we have to consider many things: understanding the current architecture, the applications the company uses, the security model, and more. Planning is an important phase of a consolidation project, because it ensures that the target architecture will meet the performance and availability requirements of the business.


    Author’s Bio

    David Barbarin is currently a database consultant for a Microsoft Gold Partner company in Switzerland, where he participates in developing value-added data management offerings around Microsoft SQL Server. David has spoken at several GUSS and SQLPass sessions in Switzerland since 2010 and has written several articles and blog posts about SQL Server. He is also actively involved in SQL Server communities such as TechNet SQL Server and Beyond Relational, and brings his experience to the various clients he works with.


    MVP Mondays

    The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager for Dynamics, Excel, Office 365, Platforms and SharePoint in the United States. She has been working with MVPs since her early days as a Microsoft Exchange support engineer, when MVPs would answer all the questions in the old newsgroups before she could get to them.







    Editor's Note: The following post is by MVP Lead Kari Finn.

    The 2012 International Consumer Electronics Show (CES) is in full swing at the Las Vegas Convention Center, and Jake and I are having a blast so far!  We kicked it off Monday night by watching Steve Ballmer’s last CES keynote after waiting in line for two hours to get in.  There was an incredible turnout, and by the time it got under way the house was packed and full of excited energy.  It launched with a pretty hilarious video montage of Microsoft keynotes delivered at CES over the years, which started with an Auto-Tuned Bill Gates and ended with Ryan Seacrest being introduced as the “host” of the night. This year’s keynote focused on previewing the trifecta of innovations being highlighted at CES: Windows Phone 7.5, Xbox Kinect and, of course, Windows 8.  There were a few things I had never seen before that were totally cool, one of which was an interactive Sesame Street TV show created specifically for Xbox Kinect.

    On Tuesday morning, CES officially opened its doors to 8 million square feet of exhibits and over 150K attendees. This is the first year I’ve attended and, as a newbie, I’m overwhelmed by how enormous and packed the event is. There are literally six maps to help guide you, and the vendors range from powerhouses like Lenovo and Microsoft to small start-ups that create things like educational software. Samsung has a gigantic, beautiful exhibit that features a huge 3D TV screen on the back wall and a Windows 8 tablet.

    We’ve also run into a lot of MVPs at CES!  We caught up with SQL Server MVP Andrew Karcher at his booth where he was giving demos of a very innovative cloud-based educational software program called Thuze that he’s helped develop.  We ran into Xbox MVP Rick Wallace hovering near the Microsoft booth, where he was scoping out the Xbox Kinect and Windows integration innovations.  Enterprise Security MVP and technical author Debra Shinder is at CES covering the latest security news and Windows Touch and Tablet MVP Linda Epstein is at CES covering the latest Tablet news.  We also ran into Windows Expert - Consumer MVP Joli Ballew, who is at CES doing research for her latest writing gig as the co-author of Windows 8 Step by Step. Visual Basic MVP Andrew Brust is here as an official member of the press, working on a piece for Visual Studio Magazine that covers the latest news, including Nokia’s 900 Lumia phone. 

    Thursday is packed with all kinds of fun CES events that we’re attending and we’re also hoping to hook up with Project MVP Tim Runcie and Access MVP Joe Anderson to find out more about what they’re doing at CES.  There’s so much more to do and share with you, so be sure to follow the #mvp hashtag and @karifinn and @jake_grey on Twitter to get the latest buzz from CES 2012 from your MVP community!

    Author Bio

    Kari Finn is an MVP Lead for Office Technologies in the United States. She is inspired every day by the power of communities and feels lucky to be able to work with one of the most thriving technology communities today. She enjoys engaging with MVPs and supporting them in their pursuits to share their knowledge and passion with the community. You can follow her on twitter @karifinn or find her on LinkedIn.


    0 0

    Thanks for tuning in to another MVP Friday Five! We have another set of great MVP-authored articles that showcase the expert knowledge we’ve come to expect from MVPs. In the articles featured this week you’ll find the great tips, how-to’s and insights that MVPs are known for.


    1. Nonpaged Pool Resource Allocation Error (SRV Error 2019)

    By Small Business Server MVP Robert Pearman | @titlerequired

    In this post Robert discusses a client's server that, every so often, experiences a condition where it stops responding to network requests.  He shares how he uses Microsoft-owned tools (SysInternals) to drill down and find the specific task, process or DLL that causes the problem, and shows you that there is an early warning of these events, if you look in the right place.


    2. Crawl Troubleshooting

    By SharePoint Server MVP Matthew McDermott | @MatthewMcD

    Matthew shares a way he discovered to troubleshoot the crawl when it behaves unexpectedly on sites you want to crawl by using the Fiddler Web Debugging Proxy. Using Fiddler, he shows how to configure SharePoint Search to crawl through Fiddler as a proxy to watch the traffic.


    3. Sharing Resources among Projects with MS Project 2010 – Part 1

    By Project MVP Nenad Trajkovski |@ntrajkovski

    Nenad explains how to share Resources among different Projects in MS Project 2010 step-by-step.


    4. Using the Network Dashboard Views in SCOM 2012

    By System Center Cloud and DataCenter Management MVP John Joyner |@john_joyner

    This article is an introduction to the four new dashboards that are part of the network monitoring engine for Microsoft System Center Operations Manager 2012. In it, John describes each dashboard, provides a screenshot, and discusses scenarios where you might use each dashboard for a specific task.



    5. Windows 7 Search Indexer Tips

    By Windows Expert-Consumer MVP Anand Khanse

    Anand tells you how to configure Windows Search and its indexing options, so that you can make the most of it!


    If you’re an MVP and would like your blog posts considered for our MVP Friday Five, please reach out to your MVP Lead or provide your URL in the comments section below!






    0 0

    Editor's Note: The following MVP Monday post is by Russian SharePoint Server MVP Marat Bakirov and is available in both English and Russian.

    How to create a LightSwitch Application that works with SharePoint Data.

    In this article we will work with Microsoft SharePoint data in Microsoft Visual Studio LightSwitch. We will take a SharePoint site with existing data and build an application for the end users who will enter the data. In this article I would like to point out the protocol support SharePoint requires and also explain how to work with lookup columns in the right way.

    In this article we will work with Microsoft SharePoint via Microsoft Visual Studio LightSwitch. I assume that my readers have a basic understanding of what SharePoint is. 

    However, LightSwitch is a relatively new technology, so at first it looks like one could explain it in a few words. This is true and false at the same time. There is a great book called “Framework Design Guidelines” and it clearly states that a good framework should give us simple ways to do our most popular tasks and at the same time provide flexibility and advanced customization methods.  All these statements are true about LightSwitch.

    As a user, you can drag, drop and click your way to the interface and application you want. As a developer, you can take parts of it to another level thanks to LightSwitch extensibility and the fact that it is built on the latest Microsoft application technologies.

    So let us start and make an application that works with SharePoint data. 

    First, we need to install a SharePoint extension that allows us to work with SharePoint via OData. OData is a standard protocol for manipulating data via REST (Representational State Transfer). The idea of OData/REST is simple – we use the standard HTTP GET/PUT/DELETE verbs and a variation of the RSS/Atom format to operate on our data over the web or network. For example, we can use the URL http://yoursite/_vti_bin/listdata.svc/IPPublic(3), which means that we want to get element number 3 in the IPPublic collection on your site. Likewise, we can PUT XML data to the same URL to save data, and we can add parameters to the URL to sort or filter the returned data, or even use data paging.
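    As a rough sketch of those URL conventions (Python here purely for illustration – the site URL and list name are the placeholders from the example above, not a real endpoint):

```python
from urllib.parse import quote

def listdata_url(site, entity, item_id=None, **options):
    """Compose a listdata.svc OData URL.

    `site` and `entity` stand in for your own site URL and list name;
    keyword options map onto OData query parameters ($filter,
    $orderby, $top, $skip, ...).
    """
    url = f"{site.rstrip('/')}/_vti_bin/listdata.svc/{quote(entity)}"
    if item_id is not None:
        # Address a single item, e.g. IPPublic(3) for element number 3.
        url += f"({item_id})"
    if options:
        # Each OData system query option is prefixed with '$'.
        url += "?" + "&".join(f"${k}={quote(str(v))}" for k, v in options.items())
    return url

# Element number 3 in the IPPublic collection, as in the article:
print(listdata_url("http://yoursite", "IPPublic", 3))
# -> http://yoursite/_vti_bin/listdata.svc/IPPublic(3)

# The same collection, sorted and paged:
print(listdata_url("http://yoursite", "IPPublic", orderby="Title", top=10))
# -> http://yoursite/_vti_bin/listdata.svc/IPPublic?$orderby=Title&$top=10
```

    Any HTTP client can issue GET/PUT requests against URLs composed this way; LightSwitch talks to this same listdata.svc endpoint when it attaches to SharePoint as a data source.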

    So we want to enable this technology for SharePoint. There is a quick way to see whether it is installed on your SharePoint or not. Just take the URL of any existing SharePoint site, add the magic suffix “_vti_bin/listdata.svc”, put the combined URL http://yourgreatsite/_vti_bin/listdata.svc into a browser, press Enter and see what happens.  Normally you will see an error message “Could not load type System.Data.Services.Providers.IDataServiceUpdateProvider from assembly System.Data.Services, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089” indicating that these data services are not installed.
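    That quick check can be sketched in a few lines (Python purely for illustration; the site URL is a placeholder, and actually fetching the URL of course requires a reachable SharePoint site):

```python
def probe_url(site_url):
    """The address to open in a browser (or fetch) to test the OData endpoint."""
    return site_url.rstrip("/") + "/_vti_bin/listdata.svc"

def data_services_missing(response_text):
    """True when the response is the well-known 'Could not load type
    ...IDataServiceUpdateProvider' error quoted above, i.e. the
    ADO.NET Data Services update has not been installed yet.  A
    working endpoint returns an Atom service document instead."""
    return "IDataServiceUpdateProvider" in response_text

print(probe_url("http://yourgreatsite/"))
# -> http://yourgreatsite/_vti_bin/listdata.svc
```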

    By default all the required pages and web files are already available in SharePoint, but the actual Data Services DLLs are missing from the Global Assembly Cache. There are several links for the ADO.NET Data Services update, but I found for myself that this link works for Windows 7 / Windows Server 2008 R2 and this one should work for Vista / Windows Server 2008. If your installation is successful, you should see something like this 

    In our company’s intranet solution we have a rather complex data structure. This structure includes information about job positions. When you create a new job position you need to enter information about a division first, because we have relations (lookup columns). Let us try to make an application that will allow us to enter and edit this data.

    Here is a part of our data schema that is used on the portal. The schema is prepared with a great Visual Studio plug-in called Open Data Protocol Visualizer. You can see that Job positions refer to other entities such as divisions and job statuses.

    Now let us start Visual Studio 2010 or Visual Studio LightSwitch 2011 (technically LightSwitch installs as a plug-in if you already have Visual Studio Professional version or higher). For this article I am using my Visual Studio 2010 version. Create a new Project, select LightSwitch \ LightSwitch Application (I am using C# for this demo).

    After creating a new app, Visual Studio starts with a fancy screen.

    Now we press the “Attach to external Data Source” button and select SharePoint.

    The next screen asks us for an address and credentials – we can choose Windows credentials to authenticate as the current Windows user (the option we choose now) or we can enter specific credentials. The last screen allows us to select the lists we would like to work with.  I have selected IPDivisions, IPEmployees and IPJobPositions. LightSwitch warns me that I have some extra dependencies on three other lists and a special list called UserInformationList, and I press Continue. The UserInformationList is a special list SharePoint uses to store information about site collection users, and all system fields such as Editor (the person who last edited the item) can be considered references to this list.

    As I cannot wait to see what happens, I usually try to run an app immediately after the wizard. This time I ran into an error like this. 

    “C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\LightSwitch\v1.0\Microsoft.LightSwitch.targets(157,9): error : An error occurred while establishing a connection to SQL Server instance '.\SQLEXPRESS'.

    A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)

    Done building project "LSAPplication.lsproj" -- FAILED.”

    This should not normally happen on your machine, but it happens on my geek machine. As I have SharePoint installed, I save as much memory as possible, so I only run the SQL Server instance I need each day (and that is .\SharePoint). I do have SQL Express installed, but do not run it automatically. But wait a moment… we are working with SharePoint data, so why do we need SQL Express for our application?

    The answer is that LightSwitch is intended to let you quickly build an application and quickly design a data schema. When the user designs a data schema and creates some tables, they are stored in SQL Express running in user instance mode. Moreover, when you deploy your solution as a multi-tier application with a layer hosted on IIS, you may need SQL Express for ASP.NET infrastructure such as user settings. However, if you have no data except SharePoint and deploy your application as a desktop application with the app services running on the local machine (one of the deployment options), the end user does not need SQL Express. But Visual Studio runs your application in full mode, with the web services running on a local web server, so you need SQL Express for running and debugging.

    So SQL Express must be installed and running with user instances enabled – here is a link with instructions. You may sometimes face the error “Failed to generate a user instance of SQL Server due to a failure in starting the process for the user instance” – here is a solution.

    So far we have added a SharePoint data connection to our application. Open the Solution Explorer. By default, the explorer may look confusing and offer little to a developer. But there are two views of a solution, which can be switched with a dropdown. One is called Logical View and the other File View. The first is simpler and presents the overall data and interface of your application. The second shows you all the files and all the custom logic in your application and is good for a developer seeking a deeper understanding of the application structure.

    Before we create our visual interface, let’s complete two simple but very important steps. Open each of your tables in your data source (just double-click it). You will see a list of columns. You can click the table itself or any column, and explore the settings in the standard Visual Studio property window.

    The first important setting is “Display by default”, defined at the column level. When it is checked, the column appears in new screens you create in Visual Studio. Visual Studio LightSwitch is smart enough to hide SharePoint system columns such as Owshiddenversion or Version, but it displays ContentType by default. In most cases, when our list contains only one content type, we do not need this column, so we deselect ContentType in all our data tables.

    The second important setting is the Summary property, defined at the table level. This setting is very important when the table data is displayed as a list.

    Consider an example: you need to edit a job position. When you edit its division, you see a drop-down list showing all divisions, and it is the summary column that is shown in this dropdown. By default, SharePoint shows the content type, so you see nothing meaningful in the list.

    That’s why I suggest changing the Summary property to Title for all our types and to Name in our UserInformationList.

    Now we are ready to create our screens. Open the logical view, right-click Screens and select Add Screen. For this sample we will use the List and Details screen type. Give the screen any name, and select the desired entity in the Screen Data drop-down. For this sample, I have selected IP Job positions.  Let us do the same for all entities referred to by our JobPositions – for my data structure this would be Divisions, Job Statuses and some other entities.

    Now the application is ready and we can press Ctrl+F5 to run it.

    To extend the application further, I suggest watching a 15-minute video by Beth Massi which shows some additional functions, such as querying data based on the current user. LightSwitch contains many more features than we have seen today, such as custom validation logic or event handling that makes it easy to fill in data.  LightSwitch is a quick way to build rich applications for end users and deploy them as desktop or web applications.


    Author's Bio

    Marat has worked in the IT industry since 1993, starting with Borland Pascal, Assembly language and MS DOS.

    He has worked on projects for companies like (DataObjects.NET project), .

    In 1996-2000 Marat worked on projects for Rocket Software and IBM. From 2007 to 2010 he worked at Microsoft Russia as a Community Developer Evangelist. He is currently the Chief Software Architect at UMSOFT.

    MVP Mondays

    The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager for Dynamics, Excel, Office 365, Platforms and SharePoint in the United States. She has been working with MVPs since her early days as a Microsoft Exchange Support Engineer, when MVPs would answer all the questions in the old newsgroups before she could get to them.



    Как создать приложение LightSwitch для работы с данными SharePoint.

    Марат Бакиров

    В данной статье мы будем работать с данными Microsoft SharePoint с помощью Visual Studio LightSwitch. Мы возьмем уже созданный сайт SharePoint и сделаем приложение для ввода данных конечными пользователями. В статье я укажу необходимые настройки SharePoint, а также покажу, как правильно работать с колонками типа Lookup. 

    В статье мы работаем с Microsoft Sharepoint с помощью Visual Studio LightSwitch. Я предполагаю, что читатели статьи имеют базовое понимание того, что такое Sharepoint.

    В то же время, LightSwitch является достаточно новой технологией, и на первый взгляд может показаться, что данное приложение можно было бы объяснить за две секунды. Это одновременно и правда, и неправда.  Есть замечательная книга “Framework Design Guidelines”, в которой сказано, что любая хорошая библиотека (в оригинале framework) должна предоставлять возможность быстро и просто решать самые распространенные задачи, но в то же время предоставлять гибкость и различные возможности по настройке. Все вышесказанное однозначно относится к LightSwitch.

    Действительно, LightSwitch содержит упрощенный режим, в котором с помощью drag and drop можно очень быстро настроить и сделать приложение. Но будучи разработчиком, можно сильно улучшить итоговое приложение благодаря тому, что LightSwitch предлагает расширяемость с помощью самых новых технологий Microsoft для построения приложений.

    Итак, давайте попробуем сделать приложение для работы с данными Sharepoint.

    Первым делом необходимо установить специальное расширения для Sharepoint , которое позволит нам работать с Sharepoint через интерфейс OData. OData это стандартный протокол для работы с данными через REST (Representational State Transfer). Идея REST очень проста – мы используем стандартные команды HTTP GET/PUT/DELETE , а также некую разновидность формата RSS/Atom для того, чтобы читать и писать данные. Если мы, например, обращаемся к данным по URL http://yoursite/_vti_bin/listdata.svc/IPPublic(3), то это означает, что мы хотим на сайте yoursite в списке или библиотеке IPPublic получить элемент номер 3. Также мы можем записать данные с помощью команды PUT или с помощью добавления параметров к URL команды GET получить отфильтрованные или отсортированные данные.

    Мы хотим включить на нашем сайте SharePoint данную технологию. Можно очень быстро проверить, установлена ли у вас поддержка OData – для этого достаточно взять URL любого сайта, добавить к нему строчку _vti_bin/listdata.svc и открыть то, что получилось, с помощью веб-браузера. Обычно на свежеустановленном SharePoint вы при этом получаете сообщение об ошибке «Could not load type System.Data.Services.Providers.IDataServiceUpdateProvider from assembly System.Data.Services, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089». Это происходит потому, что все необходимые страницы уже присутствуют в SharePoint, но необходимые сборки не установлены в Global Assembly Cache. Существует несколько ссылок для скачивания ADO.NET Data Services, и в них можно запутаться; я лично записал для себя, что данная ссылка работает в Windows 7/2008R2, а вот эта должна работать в Windows Vista/2008. Если вам удалось все правильно установить, то вы должны увидеть что-то вроде этого:

    Я решил попробовать для примера решение нашей компании для интранет-порталов. У нас в решении достаточно сложная структура данных, содержащая, в частности, данные о должностях сотрудников. Но поскольку у нас есть связи между данными (lookup columns), то прежде чем вводить данные о сотруднике, приходится вводить данные о структуре предприятия. Именно поэтому мы хотим создать Windows-приложение для быстрого ввода данных.

    Ниже приведена часть схемы данных нашего портала. Кстати, есть совершенно замечательный плагин для визуализации структур OData под названием Open Data Protocol Visualizer, картинка подготовлена с помощью него.  Видно, что данные о должностях содержат ссылки на другие данные.

    Запустите Visual Studio 2010 или Visual Studio LightSwitch 2011 (по сути, LightSwitch устанавливается в качестве дополнения, если у Вас уже есть Visual Studio Professional или выше). Выберите новый проект, далее выберите LightSwitch \ LightSwitch Application (я обычно выбираю C#, но у Вас есть варианты). После создания Вы получите стартовый экран с двумя опциями «Create New table» и «Attach to external Data Source». Выбираем вторую опцию, выбираем SharePoint.

    Далее необходимо будет выбрать вариант аутентификации – либо Windows (то есть наше конечное приложение будет соединяться с сайтом SharePoint от имени текущего пользователя), либо указать конкретные учетные данные. На последнем экране можно выбрать необходимые нам списки из имеющихся на сайте. Я выбрал три списка. LightSwitch предупреждает, что существует зависимость относительно служебного списка UserInformationList и предлагает его тоже добавить. Тут остается только согласиться и нажать Continue. Список UserInformationList – это специальный служебный список, в котором хранится информация о всех пользователях, которые хотя бы раз появлялись на сайтах из указанной коллекции (авторизовались либо указывались в списках или списках прав), и технически очень многие системные поля, такие как Editor (пользователь, последним изменивший запись), можно считать ссылками на данный список.


    После этого можно уже пробовать запускать приложение. Если Вы ставили все по умолчанию, то все будет хорошо, но я при запуске столкнулся со следующей ошибкой.

    C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\LightSwitch\v1.0\Microsoft.LightSwitch.targets(157,9): error : An error occurred while establishing a connection to SQL Server instance '.\SQLEXPRESS'.

    A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)

    Done building project "LSAPplication.lsproj" -- FAILED.

    Ну ясно, нет или не запущен SQL Express.  Хотя секунду… мы работаем с данными Sharepoint, почему же нам нужен SQL Express?

    Дело в том, что одной из основных возможностей LightSwitch является быстрое создание схемы данных. Пользователь создает таблицы схемы данных, и данные хранятся в  SQL Express  в режиме User Mode. Более того, если вы будете впоследствии разворачивать приложение в режиме ASP.NET, SQL Express может понадобиться для хранения системных данных ASP.NET – например для данных о пользователях и ролях.  Тем не менее, если Вы не используете дополнительных  данных кроме Sharepoint, то после сборки и публикации приложения в режиме Desktop SQL Express будет необязателен. Но Visual Studio всегда запускает Ваше приложение в полном режиме с веб сервисами , запускаемыми на локальном веб сервере, и поэтому во время отладки и запуска из Visual Studio нужен SQL Express.

    Итак, допустим, что у нас запущен SQL Express, включен режим User Mode (вот  ссылка  на инструкцию по настройке). Иногда Вы можете столкнуться с ошибкой “Failed to generate a user instance of SQL Server due to a failure in starting the process for the user instance” – тут можно найти решение данной проблемы. 

    Итак, мы пока только добавили в приложение соединение с данными. Теперь нам необходимо добавить пользовательский интерфейс. Откройте solution explorer. По умолчанию окно выглядит немного странно для разработчика. Но дело в том, что в данном окне есть два режима просмотра, между которыми можно переключаться. Один из режимов называется Logical View, а другой — File View. Первый – более простой и показывает данные и интерфейс Вашего приложения. Второй показывает все имеющиеся файлы и больше подходит уже для более профессиональной доработки приложений, а также хорош для понимания общей структуры проекта с точки зрения разработчика. 

    Перед тем как создать наш интерфейс, давайте сделаем два простых, но очень важных шага. Для каждой таблички необходимо проделать следующее – откройте таблицу в разделе Data Source окна Solution Explorer (делается простым двойным кликом). Появится список колонок. Можно нажать мышью на таблицу или на конкретную колонку, и в стандартном окне Properties появятся важные интересные настройки.

    Первая интересующая нас настройка это “Display by default” (определена для каждой колонки). Если ее выбрать, то колонка будет появляться на всех вновь созданных «экранах». Visual Studio LightSwitch самостоятельно распознает большинство системных колонок SharePoint, например таких, как Owshiddenversion или Version, но почему-то по умолчанию показывает также ContentType, который обычно не нужен. Необходимо отключить использование ContentType по умолчанию.

    Вторая важная настройка (уровня таблицы данных) — Summary. Эта настройка очень важна в случае, когда у нас есть ссылки на другие таблицы.

    Рассмотрим пример. Нам необходимо редактировать должность, но должность содержит ссылку на подразделение, и мы хотим, чтобы подразделение показывалось в выпадающем списке. В выпадающем списке будет показываться именно колонка, указанная в настройке Summary. По умолчанию выбирается ContentType, поэтому я рекомендую сразу поменять настройку Summary на что-то более подходящее, например Title.

    Теперь мы готовы создать «экраны». Открываем logical view, кликаем правой кнопкой мыши на Screens и выбираем Add Screen. Для данного примера я использовал тип экрана List and Details. Выбираем произвольное имя, выбираем необходимую нам таблицу – здесь я использовал сначала IP Job positions, а потом подразделения (Divisions) и другие таблицы.

    Теперь приложение готово и его можно запускать с помощью Ctrl F5.

    Я рекомендую также посмотреть это видео для понимания дополнительных возможностей по расширению приложения. В этом видео Бет Масси (Beth Massi) показывает некоторые дополнительные интересные возможности – например, возможность использовать запросы (query) и указывать в запросе текущего пользователя. Вообще LightSwitch содержит огромное количество возможностей – например, возможность добавлять логику проверки данных или обработку событий.  LightSwitch – это быстрый способ создать приложение и в дальнейшем установить его в Web или Desktop режиме.


    Author's  Bio

    Марат работает в индустрии ИТ с 1993 года, начиная еще со времен Borland Pascal, ассемблера и MS DOS.

    Работал в таких компаниях, как (проект DataObjects.NET),

    В 1996-2000 работал над проектами для компаний Rocket Software и IBM. В 2007-2010 работал в компании Майкрософт в должности Community Developer Evangelist. В данный момент работает в компании Умсофт ( в должности главного архитектора и иногда пишет в блог по адресу

    0 0

    After seven years as a valuable leader in the MVP community, Community Engagement Director Nestor Portillo is moving on to a new role at Microsoft. Nestor’s passion for and expertise in community and social media will be put to good use as he takes on a new global leadership role within Microsoft’s support organization, where he will focus on delivering great value to Microsoft customers in community forums, micro-blogs and other next-generation social channels.

    “Nestor leaves a strong seven-year legacy in our organization, having played a critical role in shaping the Most Valuable Professional Award in recent years,” explained Toby Richards, General Manager of Microsoft’s Community and Online Support.

    Starting as a community lead in Latin America, Nestor’s contributions to the community grew to leading the MVP Award in recent years. He helped to strengthen how Microsoft identifies, awards and engages with top community leaders, and he clocked tens of thousands of air miles in the process as he developed strong personal relationships with MVPs around the world.

    Nestor wished to share his sincere gratitude for all Microsoft’s MVPs, past and present, and the MVP program’s staff with these words… “During my years working for the MVP award program I’ve had the privilege to contribute in its evolution to become the industry best practice; this has only been possible because its key components: the MVPs, our vibrant communities and the passion of the people running it are deeply committed to helping users. I cannot foresee anything less than success in the coming years because the passion involved.”

    Continuing and growing upon Nestor’s great work will be Mike Hickman and Lourdes Orive.  Mike and Lourdes share Nestor’s and the whole MVP program’s passion for our MVPs and each bring rich histories and experiences within the MVP and community spaces.  Mike will manage all community engagement and Lourdes will drive the program and operations side of the MVP experience.  Mike and Lourdes are not new to our MVPs, but will reach out to more of our MVPs in the coming months and look forward to connecting personally with many MVPs at the upcoming Global MVP Summit in late February and early March.

    Please join us in expressing our great thanks to Nestor for his exceptional leadership and personal commitment to the MVP community over the years and welcome Mike Hickman and Lourdes Orive.

    0 0

    We have another great MVP Friday Five for you all this week. This week’s articles cover a wide variety of expertise, and each article is filled with expert how-to’s and step-by-step information from MVPs.  These articles are just a few examples of the incredible content MVPs provide to the tech community every day!


    1. Windows Azure Storage Client Library for Windows Phone: Part 1

    By Connected System Developer MVP Dhananjay Kumar | @debug_mode

    Dhananjay discusses some of the ways you can work with Windows Azure storage client library for Windows Phone.


    2. Filtering by Columns

    By Excel MVP Tom Urtis | @TomUrtis

    In this article Tom explains two methods for filtering rows by individual columns including one with Data Validation and another with a UserForm interface.


    3. Using SCOM as a basic configuration audit system – Part 6

    By System Center Cloud and Datacenter management MVP Daniele Grandini | @DanieleGrandini

    This is the final post in a series Daniele has published about using SCOM. In this post he discusses the compliance checking monitor component.


    4. Building a Multi-Touch Photo Viewer Control

    By Silverlight MVP Morten Nielsen | @dotMorten

    Morten shares a simple reusable control that lets users pinch-zoom and drag with their fingers on a Windows Phone touch screen, using very little XAML.


    5. Windows Phone 7 DB Connection Settings Helper

    By ASP.NET/IIS MVP Lohith Nagaraj | @kashyapa

    In this post Lohith has put together a helper class which will do all the magic under the hood but expose pre-defined parameters for the connection string for Windows Phone 7 as properties. At the usage level, you just have to instantiate this and set some properties and call a method to extract the connection string.


    If you’re an MVP and would like your blog posts considered for our MVP Friday Five, please reach out to your MVP Lead or provide your URL in the comments section below!






    0 0

    Editor's Note: The following MVP Monday post is by PowerPoint MVPs Luc Sanders and Glenna Shaw.

    When PowerPoint was first released in 1990, the educational community immediately realized the benefits of using it as a learning tool.  And more than 20 years later it’s still going strong as the medium of choice for educators worldwide.  This is in large part because, as the world of education delivery has changed from the classroom to virtual, PowerPoint has continued to keep pace with changes in technology and the area of interactive online tutorials is no exception.  While there are a plethora of third party offerings to put your presentations on the web, for those who like their options integrated with PowerPoint, it’s hard to beat using SkyDrive.

    The first thing you’ll want to do is make sure you have a SkyDrive account.  It’s easy and free and explained here.  With your SkyDrive account you get a whopping 25 GB of free cloud space and the ability to post and share files with whomever, whenever you want.  Use the Explore Windows Live Center to learn all about the features.

    Once you have your SkyDrive account, it’s time to create your tutorial.  You have two options: use the PowerPoint Web App directly on your SkyDrive, or use your desktop version of PowerPoint and upload the presentation to SkyDrive later.  Each method has its advantages and disadvantages.  If you use the Web App, you’re assured the presentation will run exactly as you’ve designed it, but you won’t be able to add animations and other features that run in SkyDrive but aren’t available for editing in the browser.  If you use the desktop version of PowerPoint, some features won’t run in SkyDrive, so you’ll want to make sure you don’t use them when designing your presentation.  Specifically, transitions are all changed to a fade transition, animations are “smoothed out” in the SkyDrive viewer so they may appear a little different, Audio/Video/VB is not supported, and Loop until ESC doesn’t work.  On the plus side, hyperlinks, most Action Settings and Animation Triggers do work, so you can still incorporate a lot of interactivity in your presentation.  Decide how you want to handle navigation before you create your tutorial, because it will make a difference to how you choose to share your presentation later.  Your options are to add your own navigation within the presentation or use the navigation that’s automatically included with the SkyDrive reader.

    Setting Permissions on SkyDrive

    Before you start creating your tutorial, you need to decide who your audience is and know how to set the permissions on SkyDrive.  The new enhancements to SkyDrive allow you to set permissions directly on a file, but if you plan to post several tutorials for the same audience, you may want to create a folder and set the permissions there.  All files you save to that folder will then inherit the folder’s permissions, and you won’t need to set them separately.  See the Explore Windows Live Center for instructions on how to create a folder and set or change permissions for a file or folder.

    Using the PowerPoint Web App

    The images below show how to create and edit a presentation using the PowerPoint Web App on the Flemish version of SkyDrive.  Even if you don’t speak Flemish, the images are easy to follow and serve to demonstrate the worldwide availability of SkyDrive.

    Upload from PowerPoint 2010

    You’ve created your presentation in PowerPoint and now you want to upload it to your SkyDrive. The image below shows how to save your presentation to a public folder on SkyDrive directly from PowerPoint 2010.  If you’re using PowerPoint 2003/2007 you’ll need to upload the file separately into SkyDrive.

    You can learn more about saving files to SkyDrive and editing in the PowerPoint Web App at Introduction to the PowerPoint Web App.

    Once you’ve saved your presentation, make sure the permissions are set correctly for your intended audience.

    Sharing Your Tutorials

    You’ve created and saved your tutorial, set the correct permissions for your intended audience, and now you’re ready to get the word out.  SkyDrive gives you a number of options for sharing your presentation, and directions for each are provided on the Explore Windows Live Center.  However, each of these options automatically links to the Reader view of your presentation.  This is fine if it’s your preferred method of display, but if you’ve built your own navigation into your tutorial you’ll probably prefer the full-screen view.  To get the link to the full-screen view, simply click the Start Slide Show menu item and copy the link from there.  You’d then share this link with your audience.  In the example shown below, the navigation was created with the ActivePrez add-in from GMark, making it a better candidate for the full-screen view since all navigation is included in the presentation itself.

    One of the coolest new features for presentations is the ability to embed them in a web page.  Directions for embedding presentations are provided here.  The image below shows the Gestalt of Slides tutorial embedded on the Visualology.Net blog.
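For context, the embed option works by giving you a snippet of HTML markup to paste into your page.  The snippet below is illustrative only — the real code, including the actual URL and dimensions, is generated for you by SkyDrive when you choose the embed option (the domain and IDs here are placeholders, not real values):

```html
<!-- Illustrative SkyDrive embed snippet; the src URL and IDs are placeholders -->
<iframe src="https://skydrive.example.com/embed?cid=YOUR-FOLDER-ID&amp;resid=YOUR-FILE-ID"
        width="402" height="327" frameborder="0" scrolling="no">
</iframe>
```

You simply paste the generated markup into your blog or web page’s HTML, and the presentation renders inline with its own navigation controls.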


    PowerPoint is a stellar tool for creating interactive tutorials and, although there are many options available, SkyDrive provides a near-perfect vehicle for publishing and sharing your tutorials online.

    Author Bios

    Luc Sanders has been a teacher/instructor for over 25 years in the Flemish Public Employment Service (VDAB). His ultimate goal is to provide the best possible assistance to every jobseeker in their search for an appropriate job. He teaches word processing and presentation software to jobseekers and corporate users.


    Glenna Shaw is a Most Valued Professional (MVP) for PowerPoint and the owner of the PPT Magic Web site and the Visualology blog. She is a Project Management Professional (PMP) and holds certificates in Accessible Information Technology, Graphic Design, Cloud Computing and Professional Technical Writing.


    MVP Mondays

    The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager for Dynamics, Excel, Office 365, Platforms and SharePoint in the United States. She has been working with MVPs since her early days as a Microsoft Exchange Support Engineer, when MVPs would answer all the questions in the old newsgroups before she could get to them.


    Windows Azure MVP Anton Staykov

    From: Bulgaria

    Time as an MVP:  10 months



    Which technical communities are you most active in?

    MSDN Azure related forums and StackOverflow Azure tags


    How did you first start in community?

    By very actively participating at MSDN technical forums (with questions and mostly answers)


    What's the best technical tip you have for implementing Cloud deployment?

    It’s actually about hybrid deployment. When using Azure Storage from non-Azure applications – such as WinForms/WPF or console apps – the default project template for those project types targets the .NET Framework Client Profile, which cannot use the Storage Client Library. You can learn more in my post on Windows Azure Storage Tips.
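As an illustration of the fix (a sketch only, based on the standard Visual Studio project file format), switching such a project from the Client Profile to the full framework is a one-line change in the .csproj file:

```xml
<!-- Before: the default WinForms/WPF/console template targets the Client Profile -->
<TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
<TargetFrameworkProfile>Client</TargetFrameworkProfile>

<!-- After: remove or empty the profile element to target the full .NET Framework,
     which the Storage Client Library requires -->
<TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
<TargetFrameworkProfile></TargetFrameworkProfile>
```

The same change can be made inside Visual Studio under Project Properties > Application > Target framework, by selecting “.NET Framework 4” instead of “.NET Framework 4 Client Profile”.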


    What do you tell people who are considering using the Cloud, but aren't sure about making the move?

    I ask them: Are you (or your on-premises deployment) ISO/IEC 27001:2005 certified? Do you have geo-replicated storage for major disaster recovery? Can you provision a SQL Server instance with a single click, within the blink of an eye? You don’t have to shut down your IT Pro department; you just need to educate them to use the Cloud. There is nothing more secure and reliable than the Cloud.


    What words of advice do you have for new MVPs?

    Don’t keep the knowledge for yourself. If you have solved a tough issue – share the experience! Help others.


    This morning we were thrilled to host the 4th Official MVP Twitter Chat. The @MVPchat twitter handle hosted the chat from 9-10 AM PT, and it was an hour filled with great conversation and networking between new MVPs, veteran MVPs and aspiring MVPs. In fact, almost 170 people joined in the conversation as we welcomed new MVPs and asked the community to help answer questions about making the most of being an MVP, the MVP award program and more. In the end we had almost 800 pearls of wisdom in the form of tweets of 140 characters or less.

    We’ve included a few of our favorite responses from MVPs to the questions below and will include a link to a full transcript from MVP Tony Champion sometime in the next week!

    Thank you again to everyone who joined us today and provided such great advice. We hope you enjoyed the chat as much as we did, and we look forward to more MVP Twitter Chats in the future.

    What advice would you give to new MVPs for ways to best engage with other MVPs?

    • Get out and chat with folks, don't hide in the corners checking e-mail. Attend the social events.
    • Like I've encouraged in the past, get to know MVPs outside of your expertise. You never know when it'll help!
    • Connect, converse, follow, ask questions. Join chats like this one. Ask us questions. BE HUMAN!
    • Get into mailing lists, follow on Twitter, engage on Facebook, attend summits
    • Seek out the blogs and sites run by other MVPs and participate in other MVP communities. You can learn a lot.
    • Look at blogs and social media sites. And at Summit, talk with others! Remember - we're all geeks about our expertises. :)


    After receiving the award, what are the 1st things you should do to make the best use of the MVP Award Program?

    • Take advantage of the resources, explore MSDN/TechNet, visit the private NGs & sign up for the lists in your product.
    • 1) sign up for PG email list; 2) book tickets to Global Summit
    • Go put the award on your resume /LinkedIn/other profiles :)
    • Connect with your MVP Lead and fellow MVPs in your expertise. You'll already have something in common.
    • Get to know your lead-they can be an amazing resource (Thanks @hekost!) Then take advantage of new resources for networking.


    We often hear the most valuable part of being an MVP is the other MVPs & relationships you form… any best memories to share?

    • About 3 summits ago, @adefwebserver and I spent every night coding new changes into my website
    • My fav MVP memories are of events like the annual PPT dinner at Summit. Great times!
    • My MVP Lead and wife secretly coordinating haveng my Navy retirement flag flown at MS HQ. Meant a lot to me!
    • Lots of great memories of MVP friends like Shauna Kelly and Nate Oliver.
    • Mine was meeting my fellow MVPs for the first time at last year's Summit. It was fun. Plus, @kilaMOMjaro make awesome fudge
    • When our OZ MVP practiced to say his name as American as he could and we all spontaneously corrected him in his Oz accent. ;)
    • The time @patricia_eddy and @jaydeflix surprised the other Outlook MVPs by announcing they were a couple. (now married)
    • Honestly for me theres not really memories, it's more of a feeling of safety. I can trust my life to these people!


    What advice do you have for fellow MVPs looking for opportunities to speak/present?

    • Shameless self promotion works for me! Pursue every opportunity no matter how small to build up your speaking skills.
    • Connect with local users groups or community centers. Good subjects are privacy, online security and basic PC knowledge.
    • Make sure local groups know you're an MVP and willing to speak. Lots of groups always looking for speakers.
    • User groups, Rotary/Kiwanis, Chamber of Commerce, business groups...
    • Engage with communities and microsoft employees. Make people notice you're around whenever you visit a different place
    • As said, local user groups. I've also found local community colleges love the interaction!


    What is the most important learning resource for MVPs?

    The overwhelming response to this question: Other MVPs

    • Your fellow MVPs and the community at large.
    • For me it's been the product groups themselves and other MVPs.
    • There are so many! Product team blogs are great. The MS Virtual Academy is tremendous too!
    • Fellow MVP's in any medium and then take advantage of MSDN/Technet to learn even more. Then exercise knowledge by sharing.


    What is the most important advice you can give to someone that would like to receive the MVP Award?

    • Be an expert, share your knowledge and experience, and be positive and professional at all times.
    • Learn as much as you can about the product and share that knowledge freely with the community.
    • Speak with other developer and msft evangelist
    • Keep going, if you're passionate and love to share, you'll get it
    • Do NOT go after being an MVP for your own glory. You SERVE the public. We're here to humbly help where ever we can.
    • Do what you like most, inspire and SHARE
    • Do something in the community that you love to do, and then do it (a lot!)
    • Be visible. Be good at what you do. Don't contribute just to be an MVP, contribute to help. (My two cents)
    • Make sure to have fun while helping! Enjoying yourself helps keep you engaged and not burnt out




    It’s Friday again, which means it’s time for another MVP Friday Five. This week the posts featured are filled with technical content, including tips, how-to’s and step by step solutions.  These articles are just a few examples of the incredible content MVPs provide to the tech community each week!


    1. Compile code entered at run time, execute it, and get the return result in C#

    By Visual Basic MVP Rod Stephens | @CSharpHelper

    In this post, Rod uses CodeDomProvider to compile text entered by the user and execute it, passing it a parameter and receiving a return result.


    2. Dynamically Updating the CQWP ItemXslLink Property to Point to the Local Site Collection

    By SharePoint Server MVP Becky Bertram | @beckybertram

    Becky shows you how you can dynamically modify the content query web part's ItemXslLink property to point to an XSL file in your local site collection.


    3. Geospatial Support for Circular Data in SQL Server 2012

    By SQL Server MVP Leonard Lobel | @lennilobel

    In this blog post, Leonard explores one of the enhancements SQL Server 2012 adds to the spatial support: support for curves and arcs (circular data).


    4. Where’s my Tracking Toolbar?

    By Project MVP Sam Huffman | @Sam_Huffman

    Sam discusses some ways you can utilize the tracking tools available in Project 2010 to replace the Tracking Toolbar that was in Project 2007 including creating a tracking tab or tracking group.


    5. A Smarter Infrastructure: Automatically filtering an EF 4.1 DbSet

    By Windows Phone Development MVP Matt Hidinger | @matthidinger

    Matt shows how to solve an issue that would commonly be solved with a repository pattern using EF 4.1 code and building a smarter infrastructure.


    If you’re an MVP and would like your blog posts considered for our MVP Friday Five, please reach out to your MVP Lead or provide your URL in the comments section below!

    01/30/12: Keeping Your Documents Safe

    Editor's Note: The following MVP Monday post is by Enterprise Security MVP Debra Littlejohn Shinder.

    We use our computers for many purposes: to browse the web, listen to music, watch videos. One of the most important is the creation, sending/receiving and/or storage of documents. Some of these documents are fairly trivial, easy to reproduce and non-sensitive. Others – such as our financial and tax documents, personal correspondence, original fiction or non-fiction writing or our work product – represent many hours of our time, would be difficult to replace, and/or are highly confidential. Those in the latter category deserve extra effort to keep them safe from intentional or inadvertent modification, destruction or prying eyes.

    Luckily, there are a number of technologies that you can use to protect your important documents, whether you’re storing them on your hard drive, storing them in the cloud, or sending them to someone else via email. You’ll find that many of these technologies are built into Microsoft’s operating systems and applications, so you don’t even have to buy or download extra software.

    Encrypt documents on disk with EFS

    The Encrypting File System (EFS) was introduced as part of NTFS v.3 in Windows 2000 Professional and Windows 2000 Server. It has evolved over the years as it’s been included in the Professional/Business, Enterprise and Ultimate editions of Windows XP, Vista and Windows 7, and in Windows Server 2003/2003 R2 and Windows Server 2008/2008 R2. EFS is used to encrypt files that are stored on disk.

    Note that the files to be encrypted must be on an NTFS-formatted volume and they must not be compressed. Best practice is to encrypt at the folder level rather than encrypting individual files. To encrypt a folder and its contents, do the following:

    1. Navigate in Windows Explorer to the folder you want to encrypt, right click it and select Properties.
    2. Click the Advanced button.
    3. In the Advanced Attributes dialog box, check the box labeled “Encrypt contents to secure data” as shown in Figure 1 below.


    Figure 1


    4. Click OK twice to exit the dialog boxes.


    EFS uses public/private key cryptography. It’s important to export your EFS certificate and private key to removable storage, such as a USB key, and store them securely, because you won’t be able to decrypt your files if the key is lost. For best security, store the keys only on removable media and remove the media from the computer when not in use.

    For instructions on how to back up your EFS certificate, see
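For readers who prefer the command line, the built-in cipher.exe utility can perform both operations — encrypting a folder and backing up the EFS key material. This is a sketch only; the folder and file paths below are illustrative, and the /x switch applies to recent versions of Windows:

```
REM Encrypt a folder and everything beneath it
REM (the command-line equivalent of the checkbox shown in Figure 1)
cipher /e /s:C:\Users\Me\Documents\Private

REM Back up the EFS certificate and private key to a
REM password-protected .pfx file (you will be prompted for a password)
cipher /x C:\Backup\efs-key-backup
```

Copy the resulting .pfx file to removable media and store it away from the computer, as described above.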

    Encrypt a whole volume with BitLocker

    BitLocker whole volume encryption was introduced in Windows Vista. In its first iteration, you could only encrypt the data stored on the volume where Windows was installed (unless you wanted to use WMI scripts). Vista Service Pack 1 added the ability to easily encrypt other volumes, so if you have a partition set up for storing your documents and other personal data, you can encrypt it with BitLocker, too. BitLocker prevents an unauthorized person from accessing your data without booting into Windows (for example, by installing a second instance of Windows or another OS). It can be used in conjunction with EFS, which protects your data from other users after they’ve booted into Windows.

    BitLocker uses the AES algorithm and can be used with or without a Trusted Platform Module (TPM), which is a hardware chip built into many modern laptops. If you don’t have a TPM, you can use a PIN (user authentication mode) or removable media (USB key mode) for authentication. For better security, you can also combine the authentication methods (for example, use the TPM and USB key, or even all three together). Using BitLocker to encrypt volumes requires a number of steps. For step-by-step guidance, see
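For those comfortable with the command line, the manage-bde tool included with Windows 7 and Windows Server 2008 R2 can report status and turn on encryption for a data volume. A sketch only — the drive letter is illustrative, and your choice of protector switches will depend on the authentication methods discussed above:

```
REM Show BitLocker status for all volumes on the computer
manage-bde -status

REM Encrypt the data volume D:, protecting it with a
REM numerical recovery password (printed when the command runs)
manage-bde -on D: -RecoveryPassword
```

Record the recovery password somewhere safe; as with the graphical wizard, it is the only way back in if the other protectors are lost.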

    Encrypt documents on removable storage with BitLocker to Go

    If you have the Enterprise or Ultimate edition of Windows 7, you can use a new feature, BitLocker to Go, to encrypt your documents when they’re stored on a removable USB drive or a flash memory card. You set a password that has to be entered to read the data on the drive. You don’t have to have Windows 7 to decrypt and read the documents on another Windows computer, either. When you encrypt the removable drive, a reader application is installed on it that will prompt for the password when you connect the USB drive or memory card to an XP or Vista computer.

    To encrypt a USB drive or memory card with BitLocker to Go, insert it and right click its icon in Windows Explorer. Then follow these steps:

    1. Select Turn on BitLocker. It will take a moment for BitLocker to initialize the drive.
    2. Next choose how you want to unlock the drive. You can use a password or a smart card. If you use a smart card, you’ll need to insert it and you’ll use the smart card PIN when you unlock the drive. If you select to use a password, type it in and confirm it, as shown in Figure 2.


    Figure 2


    3. Click Next.
    4. Select how you want to store the recovery key. The recovery key can be used to access the drive if you forget the password or lose the smart card. You can save it to a file or print it. The recovery key is a 48-digit number (Example: 414260-501435-535601-423313-623887-246961-490193-040821).
    5. If you select to save the recovery key to a file, select a location and click Save. The default location is your Documents folder, but this may not be the most secure location. Remember that anyone who has the recovery key will be able to unlock the drive without knowing the password or possessing the smart card.
    6. Click Next.
    7. On the next page, click Start Encrypting. It may take a while to encrypt the drive, depending on its size. A progress bar will appear, as shown in Figure 3. You can pause the encryption process. Don’t remove the drive during the encryption process without pausing, or the files could be damaged.

    Figure 3


    When encryption is complete, close the dialog box and your files will be protected. If you remove and reinsert the USB drive or memory card, and click on it in Windows Explorer, you’ll get the message shown in Figure 4, that the drive is not accessible and access is denied.

    Figure 4


    To unlock the drive, you must right click it and select Unlock Drive … . This will display the dialog box that asks you for your password (or smart card and PIN). If you want the drive to be locked when used on other computers, but don’t want to have to go through the unlock process every time you use it on this computer, you can check the box that says Automatically unlock on this computer from now on.

    If you forget your password or don’t have your smart card, select the I forgot my password link and you can either type in the recovery key or get it from a USB flash drive (if you’ve stored it there).

    After you’ve unlocked the drive, you can manage BitLocker options by right clicking the drive name in Explorer and selecting Manage BitLocker… . This provides you with the ability to change the password, remove the password, add a smart card, save or print your recovery key again, or set the drive to automatically lock on this computer, as shown in Figure 5.

    Figure 5

    You can only remove the password if you first add a smart card to unlock the drive.

    Things work a little differently if you insert the BitLocker-protected drive in a Windows XP or Windows Vista computer. In that case, you get a dialog box that gives you the option to install or run the BitLocker to Go Reader program. After you do that, you’ll see the prompt to enter your password. You won’t have the option to automatically unlock the drive on this computer.

    The BitLocker to Go Reader interface displays the files on the drive much like Windows Explorer, but you can’t open them there. Instead, you’ll be asked if you want to copy them to your desktop; you can also drag and drop them from the Reader window. You can’t save or change files on the protected drive while it’s in a computer that isn’t running Windows 7.

    Other ways to protect your documents

    There are several other mechanisms in Microsoft operating systems and applications by which you can protect the confidentiality, integrity and authenticity of your documents, including the following:

    Password protect documents in Office applications

    You can set passwords on Word documents, Excel workbooks and PowerPoint presentations using the encryption feature in Office applications. Find out how to do that here:

    Digitally sign your documents

    You can protect your documents’ contents by adding a digital signature, so that if someone makes changes to the document after you sign it, you (and recipients of the document) will know that it has been changed. Find out how to add digital signatures here:

    Encrypt email messages in Outlook

    If you send your document in the body of an email message, using Outlook, you can encrypt the message contents using your private key so that it can be read only by others with whom you have shared your public key certificate. Find out how to send encrypted messages with Outlook here:

    Encrypt data sent between Outlook and an Exchange server

    If you have an Exchange account, you can select to encrypt the data that is sent to and from the Exchange server from Outlook to protect it while in transmission. Find out how to do that here:

    Document protection on a business network

    On a company network, there are additional protective mechanisms that can be employed to keep documents safe, such as IPsec encryption to protect data as it travels across the network and Rights Management Services/Information Rights Management to prevent legitimate recipients from forwarding, copying or printing your documents.


    Author's Bio

    DEBRA LITTLEJOHN SHINDER, MCSE, MVP is a technology consultant, trainer, writer and analyst who has authored, edited or contributed to over 25 books on computer operating systems, networking, and security. She edits GFI Software’s weekly WinNews newsletter and writes a weekly column called Microsoft InSights for TechRepublic/CNET, as well as a monthly column on cybercrime and twice-monthly blogs on smart phone and mobile technology. Her articles on various tech issues are regularly published in online and print magazines. She has spoken at various technology conferences and presented web-based talks on various security and technology topics. Deb currently specializes in security issues and Microsoft products.


    MVP Mondays

    The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager for Dynamics, Excel, Office 365, Platforms and SharePoint in the United States. She has been working with MVPs since her early days as a Microsoft Exchange Support Engineer, when MVPs would answer all the questions in the old newsgroups before she could get to them.


    ASP.NET/IIS MVP Eduardo Zabat Lorenzo

    From: Philippines

    Time as an MVP: 3 years




    Which technical communities are you most active in?

    MSPinoy: Microsoft Philippines Users Group


    How did you first start in community?

    Officer of .Net User Group


    What's the best technical tip you have for implementing Cloud deployment?

    DO NOT allow devs to use Visual Studio to deploy. An ideal deployment process should be similar to a SharePoint team, where there is a designated deployment manager who can keep track of changesets.


    What do you tell people who are considering using the Cloud, but aren't sure about making the move?

    Most of the time, when I ask why they are not sure yet I often get vague answers, but most revolve around "I don't know that much about it to make a decision,” so I go ahead and try to add to their knowledge by bringing up a few items.

    • The 99.95% SLA – Not only does the Cloud offer such a high promise of uptime, it also offers extra services for recovery. For ISVs and startups, I show them how, with the cloud (Azure in particular), they will have a full data center and all its capabilities at their disposal without a large investment in hardware.
    • Opex vs. Capex (operating expenses versus capital expenditures) – This area is still relatively new to the Philippines, so I try my best to shed some light. In the classic model, a company needs to invest a huge amount of money to put up its data center, and even more to maintain it. With the Cloud, the data center becomes a service or subscription, eliminating the need to purchase the machinery and saving money. And if this is not enough, I point out the pains of having to upgrade, maintain and then dispose of deprecated data center hardware and software.
    • The global market – Most ISVs here in the Philippines still offer their applications as standalone, installable software. Very few have ventured into offering their apps as a service. With the Cloud, and some modifications to the application, ISVs can start to offer their products to a global market in a much simpler way, without heavy marketing efforts or the need to put up offices abroad.


    What words of advice do you have for new MVPs?

    In general, although MVPs are not “open source,” MVPs should still be open-minded. For Cloud/Azure in general, never talk about something you have not dealt with in real life.


    Another week has passed, and we have another Friday Five to sum up some of the great contributions MVPs have made recently through articles and blog posts! This week, our MVP Friday Five is filled with even more expert knowledge that we’ve all come to expect from MVPs.


    1. SSRS: Display User Data Fields for a SharePoint List

    By SharePoint Server MVP Laura Rogers | @WonderLaura

    Laura shows you how you can create a SQL Server Reporting Services (SSRS) report to create a better user experience for site users who fill out a list item, form or survey in SharePoint.


    2. Identity crisis: Why won’t Outlook for Mac open its own messages!?

    By Macintosh MVP William Smith | @meck

    William addresses an issue that is common in the Microsoft Answers Outlook for Mac forum: the “Outlook cannot open the file because it is not associated with the default identity” message.


    3. Big Data and SQL Server: Disruption or Harmony?

    By Visual Basic MVP Andrew Brust | @andrewbrust

    Andrew discusses the disruption that Big Data, Hadoop and its MapReduce distributed-computing approach have caused SQL Server, and how Microsoft is working to create harmony between them.


    4. Disabling Network Pop Ups

    By Outlook MVP Diane Poremsky | @dicot

    In this article, Diane explains step by step how to disable the annoying popup messages about Exchange server issues, network warnings and network connectivity.


    5. Removing Duplicate Dimension Rows in SSIS

    By SQL Server MVP Michael J Swart | @MJSwart

    In this article Michael takes you step by step through his method for solving the problem of creating a data flow transformation which removes duplicate key values in SQL Server.


    If you’re an MVP and would like your blog posts considered for our MVP Friday Five, please reach out to your MVP Lead or provide your URL in the comments section below!







    Editor's Note: The following MVP Monday post is by Windows Expert MVP Mike Halsey.

    When we think of using a cloud service such as Office 365 in our business, our initial thoughts usually drift towards comforting ones: Microsoft taking the strain of providing the hardware, the manpower to keep the system maintained, and a backup strategy that helps keep your system working and online.  All of this is true, and indeed many companies have adopted Office 365 not just for the cost savings but also for the peace of mind that comes with not having to maintain the hardware, operating systems and other software yourself.  Even so, there are other considerations, closer to home, that you will need to address.

    If we look at Office 365 holistically, which is essential in a modern business, it will soon become apparent that there’s much more to it than this.  Microsoft can and does maintain all the security for the central Office 365 system.  What they don’t control is your own security, and I’m not just talking about maintaining up-to-date anti-virus and malware protection on your PCs (though obviously this is important).

    The reason you need to maintain good security is that the information and data you store in Office 365 is valuable and not immune to theft by staff, hackers or malware writers.  It is also sensitive and private, not only for the work you undertake at your company but especially if your clients are individuals who entrust you with personal details.  You need to make certain that you protect this properly, and simply relying on Microsoft to secure the back-end Office 365 servers and Internet portal isn’t enough.

    So what are these security concerns, and do they need to cause you a headache?  The answer to the second question is that, with a bit of planning and observation, they needn’t cause you a headache at all.  As for the first, they are many and varied.

    The security of your company’s Internet and computer infrastructure is the first of these.  Do you have a properly secured router, for example, with a non-guessable password for both employee Wi-Fi access and the administrator interface?  Are you keeping your computers up to date with Windows Updates and malware protection?

    In addition, what are your company’s policies regarding the use of removable media devices, such as USB pen drives, external hard disks and burnable CDs and DVDs?  With the standard Group Policy controls on a PC running the Professional edition of Windows or above, you can block these devices at the computer or user level without needing to manage your own Windows Server.
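As an illustration, denying access to all removable storage classes comes down to a single policy setting, found in the Local Group Policy Editor under Computer Configuration > Administrative Templates > System > Removable Storage Access.  The registry fragment below is a sketch of the value that setting is commonly understood to write (set it through Group Policy where possible rather than editing the registry directly):

```
Windows Registry Editor Version 5.00

; "All Removable Storage classes: Deny all access" policy
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\RemovableStorageDevices]
"Deny_All"=dword:00000001
```

The same key supports finer-grained, per-device-class restrictions if blocking everything is too blunt for your environment.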

    When it comes to users, do your staff have the correct permissions for both their PCs and their Office 365 access?  It’s very easy to let user permissions become unmanageable, which is why you should have just a couple of people within your company responsible for overall permissions in Office 365 itself, and why all staff should be Standard users in Windows.

    There are other devices we now use with our computers at work, including smartphones; Windows Phone in particular includes tight integration with Office 365.  Do you know whether your staff have passwords set on the smartphones they use to access business information?

    This also extends to the computers people use outside of the office to access and store work documents and information.  If staff are using their own computers and laptops, you will have very little control, if any, over the security they choose to employ.  Are these computers secure and up to date?  Do children and other non-employees have access to the same user account that is used to access your Office 365 system?

    Much of this falls within the remit of staff training and company policy, but the easier that products such as Office 365 make collaboration and data sharing, the more aware we all need to be of the responsibilities we have in doing so, which include maintaining compliance with data protection and privacy laws.

    The good news is that Office 365 does make managing its own security pretty simple and straightforward, especially when it comes to the process of managing users and their permissions.  You don’t need to be an IT Pro to sort users into the pre-defined and clearly labelled groups.  Nor do you need to be an IT Pro to advise staff against using their own computers for work, or to advise them to make sure their computers are up to date and protected.

    In my book, Need2Know: Office 365 Security Essentials, I work through all of these aspects and take a completely holistic view of the security required to use Office 365 in a trouble-free way.  It is neither complicated nor difficult to manage, but it can often be seen as initially daunting to identify all the components involved, and to explain these and their importance to employees.

    What you’ll find yourself doing is entering into a valuable partnership with your employees where you will, probably inadvertently, be helping to raise their own awareness of computer security and how they can protect themselves and their families.  This will come through the trust that you’re demonstrating you have in them with your company’s data and the roles you provide them for accessing and managing this on your behalf.

    With the correct policies, training and outlook, all of which is based on nothing more complicated than common sense, you will find that using Office 365 becomes a truly worry-free process, where you can rest safe in the knowledge that every angle is covered, and that the future of your business is secure.  Not bad for something that won’t cost you anything, is it?


    Author's Bio

    Mike Halsey is a Microsoft MVP (Windows Expert) and the author of Need2Know: Office 365 Security Essentials from Fair Trade DX.  He writes regularly on security subjects and is also the author of Troubleshooting Windows 7 Inside Out from Microsoft Press and several forthcoming books on Windows 8.  You can follow Mike on Facebook, Twitter and at his website TheLongClimb.


    MVP Mondays

    The MVP Monday Series is created by Melissa Travers. In this series we work to provide readers with a guest post from an MVP every Monday. Melissa is a Community Program Manager for Dynamics, Excel, Office 365, Platforms and SharePoint in the United States. She has been working with MVPs since her early days as a Microsoft Exchange Support Engineer, when MVPs would answer all the questions in the old newsgroups before she could get to them.



    SharePoint Server MVP Patrick Yong

    From: Malaysia

    Time as an MVP: 3 years

    Blog: Patrick's Bytes


    Which technical communities are you most active in, and how did you first start in community?

    I started as an active forum participant in a local user group.

    What's the best technical tip you have for implementing Cloud deployment?

    Cloud app deployments require you to rethink your storage and database usage. You can no longer take your SQL Server and storage for granted; you have to plan where best to store data and how to optimize it for download. One strategy I use is to store static pictures with local web hosting for better latency and cost, and to have ASP.NET sessions stored on Windows Azure Storage. For apps with a global audience, I would consider using the Windows Azure CDN to reduce latency.

    What do you tell people who are considering using the Cloud, but aren't sure about making the move?

    As a developer, you can get an enterprise-level web app and database running without needing to know how to deploy them in a highly available, scalable configuration.

    What words of advice do you have for new MVPs?

    The MVP Award Program is the venue for you to meet like-minded professionals with a passion like yours. I have grown a lot in terms of user group leadership.  Take part in local MVP meet-ups organized by your MVP Lead and sign up for the MVP Summit when possible!


    The 2012 MVP Global Summit is just weeks away, and we can't wait to see all of the MVPs here in the Seattle/Bellevue/Redmond area! I grew up around here, so I decided to take our camera and show everyone some of my favorite places around town. I've included some great places to grab a bite to eat, places to hang out, pubs and bars to get a drink, the Hyatt and more in this video.

    I really can't wait to meet all of the MVPs who are traveling near and far to join us for this year's Summit, and I hope you all enjoy the area as much as I do!


    Also, don't forget to stay connected with all things MVP Summit on our MVP Global Summit blog and by joining the Twitter conversation at #mvp12.


    We have a great MVP Friday Five in store for you today! We’ve selected some expert technical content in the five articles we’ve chosen this week. As always, these articles are filled with how-to’s, step by step instruction, and expert tips that MVPs are known for.


    1.  A Custom High-Availability Cache Solution

    By Windows Azure MVP Brent Stineman | @BrentCodeMonkey

    Brent walks you through his approach to a business need for an easy-to-manage session state service that is highly available and low latency but not persistent, handling small session caches under a high end-user load.


    2.  Transitioning Between Sprints/Iterations with TFS

    By Visual C# MVP Mark Michaelis | @MarkMichaelis

    Mark walks you through the steps to eliminate the step of modifying all “current sprint” queries when transitioning from one Sprint to another by creating a “release” called “Current”.


    3. Silverlight ComboBoxItem IsEnabled, SL5 Style

    By Silverlight MVP Tony Champion | @tonychampion

    In this article, Tony shows you how to disable a ComboBoxItem so that users can view all options but select only those that are enabled.


    4. Another Way to Kick-Start F# WPF Apps

    By Visual F# MVP Daniel Mohl | @dmohl

    Daniel shows you how to access his F# Empty WPF project template, which can be used to kick-start F# WPF apps and more easily leverage the many available templates.


    5. Creating E-Mail Alerts for Team Members in TFS

    By Visual Studio ALM MVP Ed Blankenship | @edblankenship

    Ed shows you around the administrator options for creating alerts for team members in TFS.


    If you’re an MVP and would like your blog posts considered for our MVP Friday Five, please reach out to your MVP Lead or provide your URL in the comments section below!


    Also, as the 2012 MVP Global Summit is quickly approaching, we’re looking for blog posts from MVPs who are attending, covering their experiences meeting and learning from other MVPs. If you plan on writing about your experiences at this year’s summit, please leave your blog URL in the comments section below!


    Editor's Note: The following MVP Monday post is by Forefront MVP Jordan Krause.

    I have been working with Microsoft DirectAccess for about two years now, and I typically find myself writing or speaking about a deep-dive description of “this” or a technical write-up of “that”. Today I wanted to take a step back and cover DirectAccess at a higher level, both because there are some real world scenarios that anyone, not only the network security team, would be interested in hearing about, and also because speaking with new individuals and organizations almost daily over the past two years has brought me to realize that the majority of the IT population is still unaware of this amazing new technology. So here’s to spreading the word…

    1.  Userless VPN

    I almost titled this one “Users will consider you a hero” but it looked silly on paper. Apparently not too silly as I just typed it anyway. Think of DirectAccess as a completely automatic VPN connection. Around the office here, we like to call it “userless”. A DirectAccess laptop is connected to the corporate network automatically, without user input, the moment that it receives internet connectivity. One of the reasons that I love working with DA so much is the feedback I receive from, well, everyone. Users love it because their workflow processes are exactly the same whether they are sitting in the office or sitting in a coffee shop, IT loves it because those laptops are always available and managed (more on that later), and executives love it not only for their own use, but also because of the reduced helpdesk costs that it brings to the table (also more on this later).

    2. Reduced support costs - ROI

    In the majority of my implementations, a reduction in support and helpdesk costs is a bonus side-effect that is often not realized until months after the rollout of DirectAccess. In most companies, a high percentage of helpdesk calls are from remote users struggling with a VPN connection. Here are some of the things you will no longer need to worry about:

    Forgotten passwords – There aren’t many good options for an employee who has forgotten their password and isn’t going to be back in the office in the near future. Nor for a user who reset an expired password on their desktop at the office, only to find out that this password change was not reflected on the laptop they are now trying to use from home. If you can get logged into the laptop with an old cached password, you stand a decent chance of getting this situation straightened out, though it’s still going to be a headache and time consuming for the helpdesk. On the other hand, I have seen far too many cases where the password was forgotten and the only recourse is for the helpdesk to reset the password in Active Directory. In this situation, until that laptop is plugged back into the corporate network, the only purpose it’s going to serve is to emit a friendly glow while it sits on the login screen. As you may have guessed by now, these problems are non-existent on a DirectAccess laptop. When the helpdesk resets a password in Active Directory, that new password is available for the user to type into their login screen in real time. The user can literally call the helpdesk (“I forgot my password”), the helpdesk resets the password, the user logs in with the new password, and everyone is off the phone in less than a minute.

    Port restricted firewalls – We have all been in a hotel room or connected to a public WiFi only to discover that we have internet access, but our VPN will not connect. I won’t get into the technical nitty-gritty here, but will simply state that DirectAccess is able to work around these kinds of firewalls that prohibit traditional VPNs from connecting.
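The short version of that nitty-gritty: when UDP-based transition technologies are blocked, DirectAccess falls back to IP-HTTPS, which tunnels its IPv6 traffic inside an HTTPS session on TCP 443 – the one outbound port that hotel and public Wi-Fi firewalls almost always allow. As a rough illustration (Python here is just for demonstration, not part of DirectAccess), a probe like this shows whether a network permits that kind of outbound connection:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if an outbound TCP connection to host:port succeeds.

    On a port-restricted network, a probe to TCP 443 will typically
    succeed even when traditional VPN ports (e.g. UDP 500/4500 for
    IPsec, TCP 1723 for PPTP) are blocked -- which is exactly the gap
    that IP-HTTPS slips through.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Running `can_reach` against your DirectAccess gateway's public name on port 443 from a coffee-shop network will usually succeed even where a traditional VPN refuses to connect.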

    VPN software not working – Having a VPN means you have VPN software installed on the client computer. Sometimes software breaks; it’s inevitable. DirectAccess has no client software. The componentry for DA is baked right into the Windows 7 operating system. There’s nothing to install, nothing to break, and therefore nothing to worry about.

    3. “Always-on” access for management and patching of your remote devices

    Many of you probably realized this benefit after reading above about the always-on user experience. A seamless, self-connecting tunnel to the corporate network not only enables users to have a continuous connection to the network, but also allows the network to have a continuous connection to the laptops. Even before the user authenticates to the machine, as soon as that machine gets internet access an IPsec tunnel is established that we like to call the “Management Tunnel” or “Infrastructure Tunnel”. This means that if the device is turned on and has an internet connection, even if still sitting at the login screen, the IT department and management servers have the ability to push patches, push SCCM, push Group Policy objects, and even remotely control that remote computer from the corporate network. There’s no more waiting around for users to connect their VPN before patches and antivirus definition files can be updated; with the implementation of DirectAccess, organizations see patch application rates skyrocket immediately. This always-on management capability is actually the sole reason that many of the customers I work with decide to use DirectAccess. While they all have plans to move to the “two-way street” with DirectAccess, enabling the users to access applications in the future, for the present time they may be happy with whatever remote access solution they currently have, and instead of scrambling to train all of the users on something new, DirectAccess is implemented as a “one-way street”, allowing only this management access and using it solely for the continuous updating of their remote devices. Even in this limited one-way-street/manage-only kind of installation, you still get the password reset benefits that I mentioned earlier.

    4. The Branch Office Scenario

    Now that you have a grasp on what DirectAccess is and how it could benefit both your remote users and your management systems, let’s expand the playing field a little. In most cases when referencing a “DirectAccess client computer” we are talking about a laptop that is roaming the earth, connecting back to the corporate datacenter automatically whenever that machine gets an internet connection. Another less obvious way to gain benefit from DirectAccess as a technology is what I call the Branch Office Scenario. Many, many companies have multiple physical locations. There is commonly a main office and one or more branch offices which contain a lesser number of personnel. I speak with companies all the time who have branch offices all around the country or the world, and in most cases these branch offices are connecting back to the main office by either a semi-finicky site-to-site VPN, or by an expensive MPLS circuit. I used to work for such a company where we had hundreds of offices, many of them with only 2-5 people, and each had a dedicated frame relay circuit that was a lot of money for very little bandwidth. The monthly cost combined with the equipment cost and the stack of networking equipment piled up in the corner of these mostly single-room offices made the whole thing seem silly at times. How would you like to dump all of those expensive lines for regular internet connections? Enter DirectAccess. With DA running in your main office, you can trade in the dedicated circuits in these remote sites for regular internet connections, giving you much more bandwidth for a fraction of the cost. Then, provided your computers in that remote office are Windows 7, you simply make those computers DirectAccess connected computers and voila, they are all connected back to the corporate datacenter over secure IPsec tunnels 24x7x365. What about that local file server that might be sitting in one of your larger remote offices? Got that covered as well. 
    Not only can Windows 7 machines run DirectAccess; a Server 2008 R2 machine can also be a DA client and connect seamlessly back to the corporate network.

    5.  You already have it, why not start using it?!

    I don’t want this to be misleading: you do not currently own EVERYTHING that you need to turn DirectAccess on, but if you have already accomplished, or are planning to accomplish, a Windows 7 rollout like so many companies are right now, you are awfully close. As stated earlier, there is no client software that needs to be installed to run DirectAccess. All of the components necessary to run this technology are baked right into the operating system of Windows 7 Enterprise, Windows 7 Ultimate, or Server 2008 R2. All you need is the DirectAccess “gateway”, for which you have a number of options. There are two different flavors of DirectAccess today. The first is native DA, for which you only need a simple Server 2008 R2 server in your network to be the gateway. Native DirectAccess comes with some particular requirements and limitations that make it harder to justify, like needing IPv6 inside your network and requiring all of your application servers to be Server 2008 R2. However, by far the more common flavor of DirectAccess is that provided by Microsoft’s Unified Access Gateway (UAG) platform. UAG is available as software that you can install on your own Server 2008 R2 box, or from Microsoft OEM system builders as specialized, hardened, turn-key networking appliances. UAG brings so many advantages to the table that I will list just a few of them here. When running UAG for DirectAccess:

    No IPv6 requirements – The need for IPv6 and all Server 2008 R2 inside your network goes out the window. IPv6 is still an integral part of the way that DirectAccess works, but UAG contains translation technologies known as NAT64/DNS64 that will make all of the appropriate translations for you, so that you don’t need to change your internal infrastructure to take advantage of DirectAccess. In fact, I have a demo environment running an IPv4-only network and Server 2003 application servers (not the UAG gateway, that is a DirectAccess Concentrator appliance built by IVO Networks) and running Active Directory 2000, and everything works perfectly.
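To make the NAT64/DNS64 idea concrete: when a DirectAccess client asks for a name that only has an IPv4 address, DNS64 synthesizes an IPv6 (AAAA) answer by embedding the IPv4 address in the low 32 bits of a /96 translation prefix, and NAT64 later unpacks it to reach the real server. The sketch below (Python for illustration only; UAG's actual prefix is deployment-specific, so this uses the RFC 6052 well-known prefix) shows the arithmetic:

```python
import ipaddress

# RFC 6052 well-known NAT64 prefix; real deployments may use their own /96.
WELL_KNOWN_PREFIX = ipaddress.IPv6Network("64:ff9b::/96")

def synthesize_aaaa(ipv4: str) -> str:
    """Embed an IPv4 address in the low 32 bits of the NAT64 /96 prefix,
    the same way DNS64 builds a synthetic AAAA record."""
    v4 = ipaddress.IPv4Address(ipv4)
    v6 = int(WELL_KNOWN_PREFIX.network_address) | int(v4)
    return str(ipaddress.IPv6Address(v6))
```

For example, an A record of 192.0.2.33 becomes the synthetic AAAA 64:ff9b::c000:221; the IPv6-only client connects to that address and the translator forwards IPv4 traffic to the real server.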

    Array and load balancing capabilities – Native DirectAccess does not provide you a way to run multiple gateways for redundancy. UAG provides the ability to join multiple gateways together in configuration arrays so that you need not make changes on each gateway individually, and also provides a Network Load Balancing mechanism that allows you to join multiple gateways together in active/active for both growth and redundancy purposes.

    Security – Native DirectAccess means plugging a regular, general purpose server into the edge of your network. UAG contains Threat Management Gateway, Microsoft’s robust firewall software, so that your gateway (and everything behind it) is protected from the www.

    Web portals – UAG is not only an engine for DirectAccess, but also contains full-fledged SSLVPN functionality. With UAG you can simultaneously provide a DirectAccess entry point and one or more web portals that offer browser-based access to applications and even full SSLVPN connectivity at the same time. Maybe one of your employees has a DirectAccess laptop but left it at the office and needs to check email or pull a document out of SharePoint from home. With a UAG portal running, you have a secure entry point that they can jump into to grab what they need even without their corporate machine handy. The technical capabilities of UAG could fill (and have filled) a book, so I will leave it at this for now – UAG is designed to be a one-stop shop for remote access. In many cases an implementation of UAG/DirectAccess on a single appliance (or array of appliances) equates to shutting down multiple vendor remote access solutions such as VPN, SSLVPN, virtual desktop solutions, etc. Consolidation of remote access makes life easier for the users, cuts down on administration time, and is good for the budget.

    So there you have it, my summary of what I believe to be the future of remote access. I am fortunate enough to be immersed in these technologies daily so if you have any questions, or if there are any particular areas of DirectAccess that you would like to see expanded upon in subsequent articles, please feel free to reach out to me.

    Author's Bio

    Jordan Krause is a Microsoft Forefront MVP and enjoys working “on the edge”. As a Senior Engineer at IVO Networks he spends most of his days designing and implementing the integration of Forefront technologies for enterprises around the world. Jordan’s primary focuses are Unified Access Gateway and Threat Management Gateway, his favorite technology without a doubt being DirectAccess provided by UAG. Committed to continuous learning, Jordan holds multiple certifications including Microsoft Certified IT Professional in Enterprise Administration (MCITP: EA). He posts Forefront-related articles and tech notes online and can be found on Twitter @jokra.

