Channel: The Microsoft MVP Award Program Blog

MVP Friday Five: March 16, 2012


We've seen some great posts from our MVPs and selected a few of them to share.  Great work guys!

 1. Analyze server performance with Microsoft System Center Advisor: How to get started

By System Center Cloud and Datacenter Management MVP John Joyner | @john_joyner

John explains what System Center Advisor does and how the goal of this Software Assurance benefit is
increased uptime for servers and applications.

 

2. Five Steps to Create a Custom SharePoint FBA Login Page

By SharePoint Server MVP Elczar Pelarta Adame

This blog post explains how to add a digital signature to an InfoPath 2010 form through the SharePoint 2010
browser, without opening it in InfoPath 2010. It also shows how to set the workflow status to Approved
after the form is signed.


3. Understanding the Portable Library by Chasing ICommand (1 of 3)

By Silverlight MVP Jeremy Likness | @jeremylikness

Jeremy shows us how to directly create portable class libraries and reuse them without having to
recompile.

 

4. How to Backup and Restore Exchange 2010 Using Symantec Backup Exec

By Exchange Server MVP Mahmoud Magdy Solima | @_busbar

In this blog series Mahmoud explores the options and methods for backing up and restoring Exchange
2010, either as a single server or as a DAG, using Symantec Backup Exec 2010.


5. jQuery Tip #6 – Creating a Custom jQuery Selector

By Silverlight MVP Dan Wahlin | @danwahlin

Dan walks us through the creation of a custom jQuery selector and just how easy it is to
implement.

 


SharePoint Development and Application Lifecycle Management


SharePoint Development and Application Lifecycle Management

The topic of application lifecycle management (ALM) has been close to my heart since I entered the
development industry more than 10 years ago. I started off in ASP web development, which quickly
matured into ASP.NET web development. As I gained more experience, I found myself on larger
team projects – it was there that the value of ALM really came to the fore, and the topic itself piqued
my interest.

If you do a quick Bing search on ALM, you’ll get lots of definitions, but these two stuck out for me and
need no further explanation at a high level:

“Application Lifecycle Management (ALM) is a continuous process of managing the life of an
application through governance, development and maintenance."
-Wikipedia

“ALM is the marriage of business management to software engineering made possible by tools
that facilitate
and integrate requirements management, architecture, coding, testing, tracking, and
release management.”

-Wikipedia

David Chappell has written some great whitepapers on ALM also and defines the three main aspects
of ALM: Governance, Development, and Operations.


The diagram above really highlights where these three aspects come into play within the overall
lifecycle of an application. You can gain more detail and insight into those three areas in his white paper.

Scoping an Application in SharePoint

In SharePoint terms, an application in my mind is any solution that is built on top of SharePoint. Often
people will say the “intranet” is the solution. Now when you develop an intranet on SharePoint, it is made
up of many parts such as:

  • Branding of master pages/page layouts
  • Web parts that sit on the homepage
  • Content types defined for the information published
  • Many more artifacts that are typically put together

Each of these things should be included in scope of the lifecycle of the solution and, therefore, the
application.

I also find that each of the aforementioned parts will have a different development lifecycle and are
typically released into Production environments at different frequencies. For example, the initial
branding might be deployed, and then over the period of the first six months there are lots of
changes frequently rolled out on a weekly basis. Alternatively, the definition of the content types may
get deployed as part of the go-live of the project and not require any changes for a year until the
information architecture is reviewed. As such, I recommend to development teams that they understand
these nuances of their application and subsequently treat the lifecycle differently.

The other issue with scoping at the intranet level is that these artifacts often overlap. For instance, it is
common for other business “solutions” to be released inside the intranet and to re-use the branding
elements and content types, which introduces dependencies between releases and often forces changes.
I try, and recommend to teams, to think more granularly when developing these artifacts, so that the
lifecycle can be handled at a lower level, rather than getting tied into a business mindset about
“solutions” instead of an artifact development perspective.

SharePoint Designer vs. Visual Studio Development

The term “SharePoint Development” always brings up some difficult discussions when you have
“Developers” who use Visual Studio, write managed code (C#/VB.NET), and generate Solution Packages
in the same room as “Developers” who use SharePoint Designer and write client side code
(HTML/JavaScript/XSLT) to build their applications. Essentially these are both forms of SharePoint
Development, but it’s important to remember that each has its pros and cons when it comes to
application development.

Automated Deployment

The Visual Studio developers have a much better automated deployment story across their
development, test, and production environments, because they can package artifacts in Visual Studio
and “F5 deploy” them into an environment. Because the artifacts are written in Visual Studio, they can
easily be put into source control and pulled down to a fresh SharePoint development environment for
easy deployment.

I have found in the past that a lot of SharePoint Designer developers tend to work directly in
production due to the pains caused by having to manually cut and paste changes among separate
SharePoint Designer windows.
 

Team-based Development

Team-based development is also easier with a Visual Studio approach, because developers can change
artifacts without directly affecting one another: each developer will likely have their own SharePoint farm
environment, since Visual Studio SharePoint development requires SharePoint to be installed on the
development machine. On a shared farm, deployments of solution packages often require an IISRESET
and therefore affect every developer using that farm, and debugging code will also prevent others from
executing code.

SharePoint Designer developers tend to use a shared SharePoint farm development environment
and can leverage check-in and check-out on artifacts to try to prevent their changes from affecting
others, but certain artifact types will immediately affect everyone – such as changes to content types
or list instances.

The Application V.next Issue

Another important topic to address is that although we have “artifacts”, we also have the concept of
content. Version 1.0 of the application gets pushed into production and is heavily used by the
business. Version 1.1 is pushed to test, and in test the change management process typically just
overwrites the entire sub site in which the application sits. When you get to production, though, you
can’t simply overwrite it, because it contains production content. In the scenario of an “annual leave
application”, there will be all of the leave applications submitted since v1.0, which cannot be lost. In
SharePoint, the separation of artifacts and content is very blurred, because both are stored in
lists/libraries in the sub site. Compare this with other application approaches, such as an ASP.NET web
app, where the content layer is typically a SQL database with a specific schema and is more clearly
separated.

Maturity Levels of ALM SharePoint Development

As stated in the Wikipedia quotations, there are lots of activities to take into account for the lifecycle of an
application: requirements management, architecture, coding, testing, tracking, and release management.
If you have been to many conferences, you have likely heard people focus on “requirements management”
and “architecture” guidance. Unfortunately, you do not see many focusing on the other guidance areas of
ALM in SharePoint development.

 

One of the first things I introduce as a concept when I present on ALM is the ALM SharePoint
Development maturity model, which is really geared toward Visual Studio developers. As you can see above,
the least mature teams are still not even using source control for their applications, while the most mature
are automatically deploying their applications.

I’m often asked, “Where can I start to improve my ALM maturity?” First, it’s imperative to understand where
you are in the model and address the recommended step at that point. Below, I’ve quickly outlined some
resources to help you progress to the next stage.  

Source Control

Obviously, if you aren’t even using source control then you need to start here. If you’re a SharePoint Designer
developer, you may be using source control without even realizing it: as SharePoint artifacts are created and
modified in various lists and libraries, versioning will be enabled in the master page and page layout
galleries. The only problem with this story is that it is contained within the SharePoint environment and not
easily pulled out at the list level and moved into other SharePoint Designer environments if that model is
being followed. Because of this complexity, it isn’t common to have multiple environments, and SharePoint
Designer developers share environments.

With Visual Studio developers, source control hooks are integrated into the IDE, but unfortunately I’ve
found that many developers do not use them. The compelling aforementioned reasons for using source control
for team development, along with the ability to have a backup of your source control off of your
development environment, make this extremely important!

The easiest approach is to convince your team to install Team Foundation Server, start creating TFS projects,
and check in your code directly from Visual Studio. There are varying degrees of licensing for this, which can
be found on the TFS website.

Static Code Analysis

Static code analysis was something we used a lot in our large-team web application development projects,
running FxCop rules to ensure we were following the correct naming (CamelCase) and formatting standards
and to enforce consistency. This is now built into certain SKUs of Visual Studio 2010, and typically the rule sets
are tweaked to meet the needs of the team – sometimes rules are unchecked if they are deemed too fussy.

A perfect addition to this is to run SPDisposeCheck to analyze the code to see whether all SPWeb and
SPSite objects are disposed correctly. More information on why you need to do this is available on MSDN.
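To make the disposal point concrete, here is a minimal sketch (not from the original post) of the pattern
SPDisposeCheck looks for: SPSite and SPWeb objects that your own code creates implement IDisposable and
should be wrapped in using blocks. The class name and URL below are placeholders.

using Microsoft.SharePoint;

public static class WebTitleReader
{
    public static string GetWebTitle(string siteUrl)
    {
        // These objects are created here, so this code is responsible for disposing them.
        using (SPSite site = new SPSite(siteUrl))
        using (SPWeb web = site.OpenWeb())
        {
            return web.Title;
        } // site and web are disposed here, even if an exception is thrown
    }
}

Objects handed to you by the framework, such as SPContext.Current.Web, should not be disposed, which is
exactly the kind of distinction SPDisposeCheck helps catch.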

Both of these can be run after each build within Visual Studio, and any errors or warnings are shown in the
message window.

Automated Builds

Often the phrase “it works on my machine” comes up when classes are referenced in other classes but the
files aren’t checked in and therefore don’t make it to the build machine. Automated builds really help here:
every check-in to source control triggers the build server to pull down the latest source code, compile it,
run any static analysis rules, and report any errors along the way. This ensures that the static analysis rules
are run even if developers don’t run them locally, and it also ensures that all source code is checked in.
TFS has very simple ways to configure all of this, and SharePoint MVP Chris O’Brien has covered it in great
detail on the SharePoint Developer blog.

Automated Deployment

Automated deployment can also be triggered as part of the automated build process; often you will want to
deploy the solution packages that are constructed inside Visual Studio 2010. On a development machine,
these are generated automatically by the Package command. This command can also be triggered on the
build server, and PowerShell can then be called to deploy the packages to a SharePoint environment. This is
a great way to ensure that all the declarative code is correct and works at runtime, as opposed to the
managed code, which can be checked at compile time.
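As a rough illustration (not part of the original post), the PowerShell step might look something like the
sketch below, using the standard SharePoint 2010 cmdlets; the package name, drop path, and web
application URL are placeholders for your own environment.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$wspPath = "C:\Builds\Drop\ContosoIntranet.wsp"   # output of the Package step (placeholder path)
$webApp  = "http://intranet.contoso.local"        # placeholder web application URL

# First-time deployment: add the package to the farm solution store, then deploy it.
Add-SPSolution -LiteralPath $wspPath
Install-SPSolution -Identity "ContosoIntranet.wsp" -WebApplication $webApp -GACDeployment

# For later builds of the same package, update the deployed solution in place instead:
# Update-SPSolution -Identity "ContosoIntranet.wsp" -LiteralPath $wspPath -GACDeployment

Deployment runs as a timer job, so a real build script would normally wait for the job to finish before running
any validation steps.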

Automated Testing

Automated testing is another step that can be triggered as part of the automated build. SharePoint unit
testing is typically an extremely complex process because not all SharePoint classes are interfaced, so there
is no easy way to mock them without third-party tools. Because of this, teams tend to focus on integration
testing or web testing instead. Certain SKUs of Visual Studio 2010 include automated web testing tools that
can record a path through SharePoint and replay it at a later date.
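As a small illustration (not from the original post), an integration-style test written with MSTest against a
local development farm might look like the sketch below. The site URL and list name are hypothetical, and
the test project would need to target .NET 3.5 to load the SharePoint 2010 assemblies.

using Microsoft.SharePoint;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ProvisioningTests
{
    [TestMethod]
    public void LeaveRequestsList_IsProvisioned_InTestSite()
    {
        // Integration test: talks to a real (non-production) farm instead of mocking SPSite/SPWeb.
        using (SPSite site = new SPSite("http://localhost/sites/test")) // placeholder URL
        using (SPWeb web = site.OpenWeb())
        {
            SPList list = web.Lists.TryGetList("Leave Requests"); // hypothetical list name
            Assert.IsNotNull(list, "Expected the provisioning code to have created the list.");
        }
    }
}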

The reason all testing is complicated in SharePoint is that it is particularly hard to set up an environment,
run the tests, and then reset the environment for the next test run, especially with replicated live data.

Has this all helped you? Have more questions? Please feel free to reach out with any comments you have.

Author's Bio

As AvePoint’s Enterprise Architect, Jeremy utilizes more
than 10 years of experience in the software development industry and his
expertise in Microsoft technologies to educate the global SharePoint community,
as well as work directly with enterprise customers and AvePoint’s research
& development team to develop solutions that will set the standard for the
next generation of collaboration platforms. Follow him on @jthake or see his
articles at https://www.NothingButSharePoint.com/

MVP Mondays

The MVP Monday Series is created by Melissa Travers. In this series we work to
provide readers with a guest post from an MVP every Monday. Melissa is a
Community Program Manager for Dynamics, Excel, Office 365, Platforms and
SharePoint in the United States. She has been working with MVPs since her
early days as a Microsoft Exchange Support Engineer, when MVPs would answer
all the questions in the old newsgroups before she could get to them.

Talking Cloud with Connected System Developer MVP Alan Smith


Connected System Developer MVP Alan Smith

From: Sweden

Time as an MVP: 8 years

How did you first get started in community?
I published a free e-book, “The Bloggers Guide to BizTalk”, compiled from content by the best bloggers in
the BizTalk development community.

As there was very little documentation on using BizTalk Server at the time, the guide became a great
success. I updated it every month or two, and it evolved to over 300 articles from about 50 contributors.

Which technical community or communities are you most active in?
I run the Sweden Windows Azure Group (SWAG). I tweet as @alansmith, host a website at
www.cloudcasts.net, and blog at geekswithblogs.net/asmith. I am a regular speaker at conferences and
user groups in Sweden.

What’s the best technical tip you have today for implementing a cloud
deployment?
Whatever experience you have in .NET development, you will find a way to apply your skills to developing
with Windows Azure. The best tip I have is to get your hands dirty: create a trial or MSDN account and work
through some hands-on tutorials.

I started with a simple tutorial to create an Azure ASP.NET application for uploading and displaying photos.
After a couple of quick code changes I modified it to upload and display videos. Three weeks later I launched
an Azure-hosted community webcast site (www.cloudcasts.net).

When considering using the Cloud, what do you tell people if they aren’t sure about
moving to the Cloud?
I tell them that there is always the option to “Extend to the Cloud” rather than “Move to the Cloud”.

Extending to the cloud involves developing hybrid applications that leverage the capabilities of cloud-based
platforms. Many of the Windows Azure solutions developed today involve extending the capabilities of
existing applications and infrastructure to leverage the capabilities of Azure.

These hybrid applications allow developers to combine the best of both worlds. Existing IT infrastructure
can be utilized, and the capabilities of cloud platforms used to extend the applications. This can provide
enhanced functionality at minimal cost and low risk.

I have delivered sessions on “Extending to the Cloud” at conferences and user groups, and have heard
feedback from attendees that they have started to use Azure in their projects after seeing my presentations.

I’ve recorded some of the demos I use in these sessions as webcasts.

3D Animation Rendering using Azure Worker Roles
http://www.cloudcasts.net/ViewWebcast.aspx?webcastid=2521151757355602379

Source Control Provider using Azure Storage
http://www.cloudcasts.net/ViewWebcast.aspx?webcastid=2521140395426551779

Large File Transfer using Azure Storage
http://www.cloudcasts.net/ViewWebcast.aspx?webcastid=2521176611509322274

All three of these are scenarios where on-premise applications extend their functionality using capabilities
in Windows Azure.

Do you have a blog/website link to Cloud related Tips or deployment stories you
would like to share?
I published a free e-book covering development using the Azure Service Bus:

http://www.cloudcasts.net/devguide/Default.aspx.

I launched and maintain an Azure hosted community webcast site: www.cloudcasts.net.

I have blogged a number of walkthrough articles on using the Windows Azure Service Bus on my blog at
geekswithblogs.net/asmith.

Speaking from your experience, what words of advice do you have for new MVPs?
Welcome to a great community! Make the most of the MVP Summit; it’s the best place to
make contact with others in your area of expertise who share your passion.

MVP Friday Five: March 23, 2012


We've received some great suggestions for blog posts this past week and selected a few for you, covering the gamut from Exchange Server to SQL Server. Read, enjoy, and reach out to us on your channel of choice (Facebook, Twitter, etc.) with your comments or just to say hi!

 

1. Update and Populate LegacyExchangeDN Attribute for Exchange Mail-Enabled Contacts

By Exchange Server MVP Zahir Hussain Shah | @zhshah

Zahir walks through the steps for troubleshooting your Exchange mail-enabled contacts and populating the necessary LegacyExchangeDN attribute on those mail-enabled objects so that email flows to them properly.


2. RDS in Windows Server 8, What’s New, the Summary

By Remote Desktop Services MVP Freek Berson | @fberson

A series of blog posts and feature highlights to show you some of the new features related to Remote Desktop Services on Windows Server 8 (Beta).


3. Recovery Advisor Feature in SQL Server 2011

By SQL Server MVP Suherman STP | @emantin34

This is a video tutorial about a new visual timeline in SQL Server Management Studio to simplify the database restore process. The timeline can be used to specify backups to restore a database to a point in time.

 

4. SQL Server Data Tools (SSDT) Lengthens Life of SQL Server 2005!

By SQL Server MVP Neil Glenn Barbilla Gamboa

SQL Server 2012 is now Microsoft’s flagship product when it comes to handling data … but with SSDT’s help, SQL Server 2005 is here to stay for a while yet.

 

5. WCF Async Queryable Services Architecture Overview Video

By Data Platform Development MVP Matthieu Mézil | @MatthieuMEZIL

Matthieu shares his video covering the WCF Async Queryable Services Architecture.

 

 

An Introduction to New Features in C# 5.0


 

An Introduction to New Features in C# 5.0

 

1. C# Evolution Matrix

Microsoft has just published a new version of C#: 5.0 beta, with CLR version 4.5 (Visual Studio 11 beta).
To give a big picture of the whole evolution of the C# language, I have summarized all the key features
into a C# Evolution Matrix, shown in the diagram below:



In C# version 5.0, there are two key features: Async Programming and Caller Information.

2. Async Feature

The Async feature introduces two new keywords: the async modifier and the await operator. A method
marked with the async modifier is called an async method. This new feature will help us a lot in
asynchronous programming. For example, in WinForms programming, the UI thread is blocked while we use
HttpWebRequest to synchronously request a resource on the Internet. From the user-experience
perspective, we cannot interact with the form before the request is done.

private void btnTest_Click(object sender, EventArgs e)
{
    var request = WebRequest.Create(txtUrl.Text.Trim());
    var content = new MemoryStream();

    using (var response = request.GetResponse())
    {
        using (var responseStream = response.GetResponseStream())
        {
            responseStream.CopyTo(content);
        }
    }

    txtResult.Text = content.Length.ToString();
}




In the above example, after we click the Test button, we cannot make any change to the form before the
txtResult textbox shows the result.

In the past, we could also use the BeginGetResponse method to send an async request, as this MSDN
sample shows:
http://msdn.microsoft.com/zh-cn/library/system.net.httpwebrequest.begingetresponse(v=vs.80).aspx. But it
takes a lot of effort to implement.

Now, we can simply use the code below to make the request asynchronously:

private async void btnTest_Click(object sender, EventArgs e)
{
    var request = WebRequest.Create(txtUrl.Text.Trim());
    var content = new MemoryStream();

    Task<WebResponse> responseTask = request.GetResponseAsync();

    using (var response = await responseTask)
    {
        using (var responseStream = response.GetResponseStream())
        {
            Task copyTask = responseStream.CopyToAsync(content);

            // The await operator suspends execution of the method until the task is
            // completed. In the meantime, control is returned to the UI thread.
            await copyTask;
        }
    }

    txtResult.Text = content.Length.ToString();
}

The await operator is applied to the returned task. The await operator suspends execution of the
method until the task is completed. Meanwhile, control is returned to the caller of the suspended
method.

3. Caller Information

Caller Information can help us with tracing, debugging, and building diagnostic tools. It helps us avoid
duplicating code that is invoked in many methods for the same purpose, such as logging and tracing.

We can obtain the following information about the calling method:

CallerFilePathAttribute: the full path of the source file that contains the caller
CallerLineNumberAttribute: the line number at which the method is called
CallerMemberNameAttribute: the method or property name of the caller

The example below shows a common practice prior to the Caller Information feature:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace ConsoleApplicationTest
{
    class Program
    {
        static void Main(string[] args)
        {
            InsertLog("Main");
            MethodB();
            Console.ReadLine();
        }

        static void MethodA()
        {
            InsertLog("MethodA");
            MethodB();
        }

        static void MethodB()
        { }

        static void InsertLog(string methodName)
        {
            Console.WriteLine("{0} called method B at {1}", methodName, DateTime.Now);
        }
    }
}

In both the Main and MethodA methods, InsertLog is invoked for logging. Now we can change the code as
follows:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Text;
using System.Threading.Tasks;

namespace ConsoleApplicationTest
{
    class Program
    {
        static void Main(string[] args)
        {
            //InsertLog("Main");
            MethodB();
            Console.ReadLine();
        }

        static void MethodA()
        {
            //InsertLog("MethodA");
            MethodB();
        }

        static void MethodB(
            [CallerMemberName] string memberName = "",
            [CallerFilePath] string sourceFilePath = "",
            [CallerLineNumber] int sourceLineNumber = 0)
        {
            InsertLog(memberName);
        }

        static void InsertLog(string methodName)
        {
            Console.WriteLine("{0} called method B at {1}", methodName, DateTime.Now);
        }
    }
}

4. Summary

The new features in C# 5.0 will help us code more easily and improve our productivity. Have a nice
experience with the new release of Visual Studio 11!

 

Author's Bio

Fahao Tang, Visual C# MVP, majored in Environmental Engineering. In 2008 he joined the Microsoft MSDN
forums and became a moderator one year later. Through these forums he has helped many users resolve
all kinds of questions and issues. In 2012 he joined one of the GSO hubs of the Australia and New Zealand
Banking Group in Chengdu. His new role as a Senior MIS Developer, and as the youngest senior developer
in the department, brings him plenty of challenges and opportunities. Beyond his day-to-day work, he also
helps organize offline technical sharing meetings. He is currently a programmer and an active participant
in, and initiator of, information sharing.

MVP Mondays

The MVP Monday Series is created by Melissa Travers. In this series we work to
provide readers with a guest post from an MVP every Monday. Melissa is a
Community Program Manager for Dynamics, Excel, Office 365, Platforms and
SharePoint in the United States. She has been working with MVPs since her
early days as a Microsoft Exchange Support Engineer, when MVPs would answer
all the questions in the old newsgroups before she could get to them.

 

The C# Evolution Matrix


C# has now evolved to version 5.0, with CLR version 4.5, released alongside Visual Studio 11. I have put together an evolution matrix of the language for your reference.


C# 5.0 mainly adds two features, Async Programming and Caller Information, which are introduced in turn below.

Async Feature

C# 5.0 adds the async modifier and the await operator; a method marked with async is called an async method.
Asynchronous programming brings us great convenience. For example, in WinForms programming, if we use
HttpWebRequest to request a network resource synchronously and the response takes too long, our UI thread
is blocked; to the user, the form appears unresponsive and no UI interaction is possible.

private void btnTest_Click(object sender, EventArgs e)
{
    var request = WebRequest.Create(txtUrl.Text.Trim());
    var content = new MemoryStream();

    using (var response = request.GetResponse())
    {
        using (var responseStream = response.GetResponseStream())
        {
            responseStream.CopyTo(content);
        }
    }

    txtResult.Text = content.Length.ToString();
}



 

After clicking the Test button, we cannot perform any operation on the form until the result is shown in txtResult.

Before async, we could also use the BeginGetResponse method for asynchronous operations, as shown in the
MSDN sample, but we had to write a lot of code to achieve the asynchronous effect:
http://msdn.microsoft.com/zh-cn/library/system.net.httpwebrequest.begingetresponse(v=vs.80).aspx

Below, we rework the form above to use the new asynchronous programming feature; the code is as follows:

private async void btnTest_Click(object sender, EventArgs e)
{
    var request = WebRequest.Create(txtUrl.Text.Trim());
    var content = new MemoryStream();

    Task<WebResponse> responseTask = request.GetResponseAsync();

    using (var response = await responseTask)
    {
        using (var responseStream = response.GetResponseStream())
        {
            Task copyTask = responseStream.CopyToAsync(content);

            // The await operator suspends execution of the method until the task is
            // completed. In the meantime, control is returned to the UI thread.
            await copyTask;
        }
    }

    txtResult.Text = content.Length.ToString();
}

With await, we can read response semantically as simply the result of the asynchronous call, while the
compiler takes care of all the code generation; we no longer have to deal with complicated callbacks. This
saves a lot of time in asynchronous programming. In practice, with the code above, we can still interact with
the form after clicking the Test button.

Caller Information

As the name suggests, the called method can obtain information about its caller, which is particularly useful
when building tracing, debugging, and diagnostic tools. Previously, we had to do the related work in the
calling method ourselves, for example inserting log entries recording which method was executed. I vividly
remember a project I worked on that was required to use Microsoft's Enterprise Library: before calling each
Data Access method we had to write the caller's information to the log for tracing and investigation, so every
calling method ended up with the same logging code. With Caller Information, we can obtain the following
in the called method:

CallerFilePathAttribute: the path of the source file that contains the caller
CallerLineNumberAttribute: the line number at which the method is called
CallerMemberNameAttribute: the name of the calling method

An example follows.

Previously, we might have used something like the following:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace ConsoleApplicationTest
{
    class Program
    {
        static void Main(string[] args)
        {
            InsertLog("Main");
            MethodB();
            Console.ReadLine();
        }

        static void MethodA()
        {
            InsertLog("MethodA");
            MethodB();
        }

        static void MethodB()
        { }

        static void InsertLog(string methodName)
        {
            Console.WriteLine("{0} called method B at {1}", methodName, DateTime.Now);
        }
    }
}

Both Main and MethodA call the InsertLog method. With the new feature, we can change the code to:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Text;
using System.Threading.Tasks;

namespace ConsoleApplicationTest
{
    class Program
    {
        static void Main(string[] args)
        {
            //InsertLog("Main");
            MethodB();
            Console.ReadLine();
        }

        static void MethodA()
        {
            //InsertLog("MethodA");
            MethodB();
        }

        static void MethodB(
            [CallerMemberName] string memberName = "",
            [CallerFilePath] string sourceFilePath = "",
            [CallerLineNumber] int sourceLineNumber = 0)
        {
            InsertLog(memberName);
        }

        static void InsertLog(string methodName)
        {
            Console.WriteLine("{0} called method B at {1}", methodName, DateTime.Now);
        }
    }
}

Summary

The new features in C# 5.0 improve our programming productivity while reducing the amount of code. The
Visual Studio 11 IDE also adds many new capabilities for you to explore.

 

Author's Bio

Fahao Tang, Visual C# MVP, majored in Environmental Engineering. In 2008 he joined the Microsoft MSDN
forums and became a moderator one year later. Through these forums he has helped many users resolve
all kinds of questions and issues. In 2012 he joined one of the GSO hubs of the Australia and New Zealand
Banking Group in Chengdu. His new role as a Senior MIS Developer, and as the youngest senior developer
in the department, brings him plenty of challenges and opportunities. Beyond his day-to-day work, he also
helps organize offline technical sharing meetings. He is currently a programmer and an active participant
in, and initiator of, information sharing.

 

MVP Mondays

The MVP Monday Series is created by Melissa Travers. In this series we work to
provide readers with a guest post from an MVP every Monday. Melissa is a
Community Program Manager for Dynamics, Excel, Office 365, Platforms and
SharePoint in the United States. She has been working with MVPs since her
early days as a Microsoft Exchange Support Engineer, when MVPs would answer
all the questions in the old newsgroups before she could get to them.

Talking Cloud with Office 365 MVP David Greve


Office 365 MVP David Greve

How long have you been an MVP?
6 Months

How did you first start in community?
I’ve been an active part of the community for many years now. In the past few years, I’ve helped run a
SharePoint user group as well as founded a UC user group. My core skills are primarily UC focused, which
has led me to both BPOS and Office 365.

Which technical community or communities are you most active in (where can people find
you)? 
You can find my blog at http://blogs.pointbridge.com. I’m also active on Office 365 community sites
and I often speak at cloud events.

What’s the best technical tip you have today for implementing a cloud deployment?
Business objectives and technical constraints often drive the goals of a migration, and the focus on user
experience is minimized. When the user experience is minimized, migration teams find themselves
struggling to handle the large number of support issues that arise. If you improve your focus on user
experience and prioritize measuring it as highly as your business and technical goals, you will be well on
the way to a successful migration.

When considering using the Cloud, what do you tell people if they aren’t sure about moving
to the Cloud?
Where would you rather focus your time? Keeping the infrastructure lights on or focusing on the
real issues, which are improving efficiencies within the business? Managing a server is not terribly
exciting. Improving someone’s way of doing business, coupled with costs savings, is an
accomplishment.

Do you have a blog/website link to Cloud related Tips or deployment stories you would like to
share?
I speak about the user experience in this 2 part video blog.

Part 1: http://blogs.pointbridge.com/Blogs/greve_david/Pages/Post.aspx?_ID=30

Part 2: http://blogs.pointbridge.com/Blogs/greve_david/Pages/Post.aspx?_ID=31


Speaking from your experience, what words of advice do you have for new MVPs?
Other MVPs have a wealth of experience and information.  Leverage these resources to help
customers, your business and the community!

MVP Friday Five: March 30, 2012


We're going a little more international with our blog authors this week to give you a feel for what our worldwide MVPs are talking about. 

1. Here's What's Right With Windows 8

By ASP.NET MVP Chris Love | @chrislove

Chris provides his perspective on the approach taken with Windows 8.

 

2. HDFS and File System Wanderlust

By Visual Basic MVP Andrew Brust | @andrewbrust

The Hadoop Distributed File System (HDFS) is a pillar of Hadoop. But its single-point-of-failure topology, and its ability to write to a file
only once, leaves the Enterprise wanting more. Some vendors are trying to answer the call.

 

3. New feature in C# 5.0 – [CallerMemberName]

By Visual C# MVP Bahrudin Hrnjica | @bhrnjica

Bahrudin introduces us to the [CallerMemberName] attribute within C#.

 

4. Windows Phone Performance

By Silverlight MVP András Velvárt | @vbandi

This is the first in a series of posts derived from a Windows Phone Developer book he co-authored with his Hungarian peers.

 

5. TechNet Radio Community Corner: Microsoft MVP Sai Kev on Hyper-V, System Center 2012 and Cloud Security

By Directory Services MVP Sainath KEV

This TechNet Radio edition covers the business and technical benefits that you should consider when thinking about Hyper-V and System Center 2012.

 

 


Configure Remote BLOB Storage (RBS) with the FILESTREAM provider (SharePoint 2010)


In this article I describe how to install and configure Remote BLOB Storage (RBS) with the FILESTREAM
provider on a Microsoft SQL Server 2008 database server that supports a Microsoft SharePoint Server 2010
system. RBS is typically recommended when content databases are 4 gigabytes (GB) or larger.

SharePoint stores files (BLOBs) in the content database. The advantage of that approach is that it
simplifies the backup and restore process: we can back up all of a site collection's data in one file. The
disadvantage is that the database can grow enormous if the site contains a large number of files. This can
become a critical problem if you use SQL Server Express Edition, which has a limit on database file size.
RBS is a library API set that is incorporated as an add-on feature pack for Microsoft SQL Server 2008 and
Microsoft SQL Server 2008 Express. RBS is designed to move the storage of binary large objects (BLOBs)
from database servers to commodity storage solutions. RBS ships with the RBS FILESTREAM provider,
which uses the RBS APIs to store BLOBs.

We can configure SharePoint to store BLOBs in the file system instead of in the database by leveraging
the FILESTREAM feature of SQL Server 2008 together with Remote BLOB Storage (RBS). With SharePoint
2010, the RBS functionality allows documents to be placed in the file system instead of in the database
itself; each content database maps to a specific location in the file system where all of its documents
are stored.

Enable File Stream in SQL Server

  1. Connect to SQL Server:
         Start -> All Programs -> Microsoft SQL Server 2008 -> Configuration Tools -> SQL Server
         Configuration Manager
  2. In the Services list, click "SQL Server Services"
  3. Choose your SQL instance (for me, "SQL Server (SHAREPOINT)"), right-click it, and select Properties
  4. Switch to the FILESTREAM tab and check all the available checkboxes
  5. Click Apply -> OK

         
                
  6. Now start SQL Server Management Studio
  7. Open a query window and execute the SQL statement below (running it against the master database is fine)

         EXEC sp_configure filestream_access_level, 2

         RECONFIGURE
          
  8. Next, we need a web application and a content database for it. If you haven't already, create a
    content database for your web application. I have created a content database named SP2010_Blob
  9. You can use PowerShell to create a new content database with the following command

         New-SPContentDatabase -Name SP2010_Blob -WebApplication <web application URL>
          
  10. Now we have to prepare our new content database to use FILESTREAM
  11. Start -> All Programs -> Microsoft SQL Server 2008 R2 -> SQL Server Management Studio
  12. Right-click on SP2010_Blob, open a new query window, and execute the statements below.
    Replace "C:\BlobFiles" with your storage path

         use [SP2010_Blob]

         if not exists (select * from sys.symmetric_keys where name =
         N'##MS_DatabaseMasterKey##') create master key encryption by
         password = N'Admin Key Password !2#4'

         use [SP2010_Blob]

         if not exists (select groupname from sysfilegroups where
         groupname = N'RBSFilestreamProvider') alter database [SP2010_Blob] add
         filegroup RBSFilestreamProvider contains filestream

         use [SP2010_Blob]

         alter database [SP2010_Blob] add file (name = RBSFilestreamFile, filename
         = 'C:\BlobFiles') to filegroup RBSFilestreamProvider

Install RBS on Web Server
      

  1. You must install RBS on the database server and on all Web servers and application
    servers in the SharePoint farm. You must configure RBS separately for each
    associated content database.
  2. On any Web server, go to http://go.microsoft.com/fwlink/?LinkID=177388 to download the RBS.msi file.
  3. Click Start and then type cmd in the text box. In the list of results, right-click cmd, and
    then click Run as administrator. Click OK.
  4. Copy and paste the following command at the command prompt.
  5. Make sure you run this command from the folder where you downloaded RBS.msi

         msiexec /qn /lvx* rbs_install_log.txt /i RBS_X64.msi
         TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY
         DBNAME="<ContentDbName>"
         DBINSTANCE="<DBInstanceName>"
         FILESTREAMFILEGROUP=RBSFilestreamProvider
         FILESTREAMSTORENAME=FilestreamProvider_1
          
  6. Where:

         1. <ContentDbName> is the content database name (in my case, SP2010_Blob).

         2. <DBInstanceName> is the SQL Server instance name (in my case, DB
         SERVERNAME\SHAREPOINT). Note that you should give the full name, as I have here.

         To install RBS on all additional Web and application servers:
  7. Click Start and then type cmd in the text box. In the list of results, right-click cmd,
    and then click Run as administrator. Click OK.
  8. Copy and paste the following command at the command prompt: 

          msiexec /qn /lvx* rbs_install_log.txt /i
         RBS_X64.msi DBNAME="ContentDbName"
         DBINSTANCE="DBInstanceName"
         ADDLOCAL="Client,Docs,Maintainer,ServerScript,FilestreamClient,FilestreamServer"
          
  9. You should repeat this procedure on all Web servers and application servers. If you do 
    not install RBS on every Web and application server, users will encounter errors when
    they try to write to the content databases.
  10. The rbs_install_log.txt log file is created in the same location as the RBS.msi file. Open
    the rbs_install_log.txt log file with a text editor and scroll toward the bottom of the file.
    Within the last 20 lines of the end of the file, an entry should read as follows:
    "Product: SQL Remote Blob Storage - Installation completed successfully".
  11. The script has created some tables in the database; the following query helps us
    check that
        

         
         
         use SP2010_Blob

         select * from dbo.sysobjects

         where name like 'rbs%' 


To enable RBS 
      

  1. On the Start menu, click Programs, click Microsoft SharePoint 2010 Products, and then
    click SharePoint 2010 Management Shell.
  2. At the Windows PowerShell command prompt, type each of the following commands. 

         $cdb = Get-SPContentDatabase -WebApplication <your web application URL>

         $rbss = $cdb.RemoteBlobStorageSettings

         $rbss.Installed()

         $rbss.Enable()

         $rbss.SetActiveProviderName($rbss.GetProviderNames()[0])

         $rbss

          
  3. We are almost done with the configuration. Next, create a site collection that uses
         the content database "SP2010_Blob".
  4. Connect to WFE
  5. Go to SharePoint 2010 Central Administration
  6. Under Application Management -> Manage Content Databases, choose the applicable
    Web Application from the drop-down at the top
  7. Configure the content databases so that the next site collection is placed in "SP2010_Blob"
  8. Go back to the main site of Central Administration
  9. Application Management - Create Site Collection
  10. Check the content databases to make sure the site collection we created is in the
    content database that we configured

To test the RBS data store

  1. Connect to a document library on any Web server.
  2. Upload a file that is at least 100 kilobytes (KB) to the document library.
  3. Documents smaller than 100 KB will not be put in the file system but will be stored in the
    database.
  4. On the computer that contains the RBS data store, click Start, and then click
    Computer.
  5. Browse to the RBS data store directory.
  6. Browse to the file list and open the folder that has the most recent modified date
    (other than $FSLOG). In that folder, open the file that has the most recent modified date.
    Verify that this file has the same size and contents as the file that you uploaded

Author's Bio

Destin Joy is a Microsoft MVP for SharePoint Server. He is an author, speaker, and blogger on Microsoft
technology. Destin is currently in the final phase of writing his second eBook, “SharePoint 2010 Capacity
Planning”, which will be published on C# Corner.

MVP Mondays

The MVP Monday Series is created by Melissa Travers. In this series we work to
provide readers with a guest post from an MVP every Monday. Melissa is a
Community Program Manager for Dynamics, Excel, Office 365, Platforms and
SharePoint in the United States. She has been working with MVPs since her
early days as a Microsoft Exchange Support Engineer, when MVPs would answer
all the questions in the old newsgroups before she could get to them.

Talking Cloud with Exchange Server MVP Jorge Diaz


Exchange Server MVP Jorge Diaz

How long have you been an MVP?
Three months

How did you first start in community?
Answering technical questions in the BPOS and Exchange TechNet forums, as well as blogging about my
trials and tribulations with both Exchange and Office 365.

Which technical community or communities are you most active in (where can people find
you)? 

The Office 365 Community

My Blog: jorgerdiaz.wordpress.com

TechNet Forums

What’s the best technical tip you have today for implementing a cloud deployment?
Plan, plan, plan! All phases of a migration to the cloud build off of each other, so establishing a solid
approach to the migration is critical to the success of the project. From a technical perspective, the best
way to approach a cloud implementation on the Exchange side is to understand a cross-forest migration;
a migration to the cloud is virtually identical, and the limitations are the same. This is key, especially when
configuring hybrid mode between on-premises and cloud solutions. Exchange 2010 SP2 provides an
excellent hybrid configuration wizard, but without understanding the concepts behind what the wizard is
doing, an engineer won’t be able to effectively troubleshoot the problem if things don’t go as planned.

When considering using the Cloud, what do you tell people if they aren’t sure about
moving
to the Cloud?
It gives you all the benefits of having the systems in house, with the additional benefits of ultra-high
availability and no hardware or software to maintain. That, in addition to Software Assurance, gives you
the best Microsoft has to offer without having to worry about upgrading every time a new version
comes out.

Do you have a blog/website link to Cloud related Tips or deployment stories you
would like to share?
My blog: http://jorgerdiaz.wordpress.com

Speaking from your experience, what words of advice do you have for new MVPs?
I am new, so it is hard to say, but continuing to participate and getting to know other MVPs is critical to
success in the MVP world. The wealth of resources and knowledge goes up by an exponential factor as
soon as you become an MVP, so be sure to utilize those resources and get everything you can out of it!

Through Co-creation, MVPs Help Build New Answers NNTP Bridge


Last month, Microsoft Answers launched its NNTP Bridge—in large part in response to requests
by the community and the contributions of two passionate MVPs.

Visual C++ MVP Jochen Kalmbach and Internet Explorer MVP Kai Schaetzl volunteered to work
with Microsoft to bring this popular feature back to the Answers platform, after it was not included
in the major upgrade of Microsoft Answers last year. The lack of an NNTP reader was highlighted
by MVP power users, who lauded its simple ability to review and answer a large number of posts
in a short amount of time.

As Jochen explained, “After building the community bridge for MSDN/TechNet, I was hot to do
such an adoption for Answers.”

The Microsoft Answers team has a commitment to the co-creation philosophy, recognizing that
community is an ecosystem where there are many constituents, including MVPs, Microsoft
Community Contributors, other forum power users, new users, browser/lurkers, service delivery
agents, and business representatives. With this development approach, all voices are heard:
the NNTP Bridge is a manifestation of that.

To make the NNTP Bridge development possible, Microsoft provided a CPS Lite Service, on
which Jochen and Kai built the NNTP Bridge application. Kai said, “I'm very happy that there's
finally an NNTP bridge for the new Answers website available. It's taken quite a long time and
I had given up hope, almost. I've learned how long decision making and funding processes can
take in a large company like Microsoft, but I'm also amazed how quickly the technical side of it
was done once a decision was made.”

The new NNTP Bridge application is optimized for Answers 2.X and serves as a channel for
NNTP newsreaders to read and write content to Answers forums.  Through the NNTP Bridge,
users who want to participate in the Answers forum through a more traditional newsreader style
interface may once again do so.

Congratulations April MVP Awardees!


Join us in congratulating this quarter’s MVP Awardees!

On April 1, 913 exemplary community leaders from around the world were notified they had
received the MVP Award. Of those, 121 were first-time recipients!

For nearly 20 years, Microsoft has been saying “thank you” to individuals who demonstrate
their deep commitment to helping others in the community with the MVP Award. Today there
are more than 100 million social and technical community members, and only a small portion
are selected to be recognized as MVPs.

MVPs voluntarily share their passion and real-world knowledge of Microsoft products, helping
others make the most of their technology. They are recognized each quarter for this annual
award, and each year around 4,000 MVPs are honored.

MVPs are nominated by Microsoft, other community individuals, or in some cases themselves.
Candidates are rigorously evaluated for their technical expertise, community leadership, and
voluntary community contributions for the previous year. They come from more than 90
countries, speak over 40 different languages, and are awarded in more than 90 Microsoft
technologies. Together, they answer more than 10 million questions a year!

Congratulations to the new MVPs, and welcome back to renewed MVPs. We are very excited
to recognize your amazing accomplishments!

Please visit our nomination page to nominate a community leader to be considered for an
MVP Award.

If you’re a new MVP, maybe you’d like to take this opportunity to introduce yourself and tell
us about your interests in the comments below.

If you’re a renewing MVP, please share your tips for getting the most out of the MVP Award
experience with your new colleagues!

MVP Friday Five: April 6, 2012


We're running the gamut in this week's blog collection, ranging from Microsoft Excel and Windows Phone Development to Windows Azure and Microsoft Project.  Want your blog posts highlighted in future editions of the Friday Five?  Share your favorite blog posts with your Community Program Manager and we'll see about getting them in.  We are always looking for great content to share with our MVPs.

1. Excel: Customizing Your Right Click Menu to List & Run Macros

By Microsoft Excel MVP Tom Urtis | @tomurtis

Tom shows us how to leverage the right click menu to effectively run macros within Excel.


2. Declaring VARCHAR Without Length

By SQL Server MVP  Vadivel Mohanakrishnan | @vmvadivel

Vadivel shares with us how to specify the length of a string without fail within SQL Server.


3. Timeline View in MS Project 2010

By Microsoft Project MVP Nenad Trajkovski 

Nenad runs through the timeline view feature found in Project 2010.


4. How to put .Net Framework 4.5 Beta & ASP.NET MVC 4 Beta on Windows Azure

By Windows Azure MVP Magnus Martensson | @noopman

Magnus reviews the steps you need to take in order to put .NET Framework 4.5 Beta and ASP.NET MVC 4
on Windows Azure and run it in a Web Role.

5. Visualizzare immagini GIF in un’applicazione Windows Phone

By Windows Phone Developer MVP Matteo Pagani | @qmatteoq

Learn how to use GIF images within the Windows Phone environment.

 

 

 

 

Use Digital IDs (Certificates) to prove your identity in Outlook email transactions


A digital signature is an encryption-based, secure stamp of authentication on a macro or document. The
signature confirms that the macro or document originated from the signer and has not been altered. A
digital ID includes a digital certificate (a digital means of proving your identity: when you send a digitally
signed message, you are sending your certificate and public key; certificates are issued by a certification
authority and, like a driver’s license, can expire or be revoked) and a public key (the key a sender gives to
a recipient so that the recipient can verify the sender’s signature and confirm that the message was not
altered; recipients also use the public key to encrypt, or lock, email messages to the sender). Using a
digital ID to sign the contents of an email message proves to the recipient that you are not an imposter.

Note: You must get a digital ID before you can digitally sign an email message.

In this article, I will guide you through the steps involved in signing email messages and verifying the
validity of a signed message in Outlook 2010.

How to ‘Digitally sign’ an Email message in Microsoft Outlook?

Note: You should purchase a digital ID from a trusted third-party certificate provider, or get a digital ID
from your organization’s CA. You may go here to find some of the digital ID providers. Copy the .pfx file to
your computer before proceeding with the following steps.

The following steps were prepared using Outlook 2010 Professional.

Open Outlook 2010 and perform the following operations to add a digital ID to Outlook and send email
using it:

  1. Open the ‘Email Security’ section in Outlook 2010
  • Click the File tab and click Options
  • Click Trust Center from the left side menu of the window shown.
  • Click Trust Center Settings button
  • Select E-mail Security from the left side menu of the window shown:

This will open a window as shown below:
 

2. Under Encrypted e-mail section, select the Add digital signature to outgoing messages check box.

3. If available, you can select one of the following options:

    • If you want recipients who don't have S/MIME security (S/MIME, Secure Multipurpose Internet Mail
      Extensions, is a specification for secure email messages that uses the X.509 format for digital
      certificates and various encryption algorithms such as 3DES) to be able to read the message,
      select the Send clear text signed message when sending signed messages check box.
    • To verify that your digitally signed message was received unaltered by the intended
      recipients, select the Request S/MIME receipt for all S/MIME signed messages check
      box. You can request notification telling you who opened the message and when it was
      opened. When you send a message that uses an S/MIME return receipt request, this
      verification information is returned as a message sent to your Inbox.

4. Now we want to import the digital ID stored on your computer into Outlook.

    • Click the Import/Export button in the Trust Center window
    • Browse to the location on your computer where the digital ID (.pfx file) is stored. Enter the
      password used for encryption and give it a friendly name. Press ‘OK’ to import the digital ID
      into Outlook.
    • You will be prompted with the ‘Importing a new Private Exchange key’ window. You can set
      the security level for use of the certificate there. Press ‘OK’ after making any necessary changes.

5. Now we want to choose the Digital ID for our email signing purposes

    • Click the ‘Settings’ button in the ‘Trust Center’ window and you will be prompted with the
      ‘Change Security Settings’ window as shown below:

    • Give a friendly security settings name
    • In the ‘Certificates and Algorithms’ section, press ‘Choose’ to select the certificate imported for
      this purpose. You will see the certificate as shown in the screen shot below:


    • Press ‘OK’ to select the certificate
    • Now you may set the ‘Hashing Algorithm’ and ‘Encryption Algorithm’ according to your
      security requirements in the ‘Change Security Settings’ window

6. Accept the changes in the open windows by pressing the ‘OK’ buttons. You will now be able to use the
    digital ID to sign email sent from the corresponding email address configured in your Outlook
    application.

7. You can see the following highlighted changes in the sent email as shown below:

8. The recipient can check the ‘digital signature’ of a received email by clicking the rightmost
     security icon, which opens the following screen:

9. In this window you may now edit the trust, view certificate details, or choose to trust the certification authority.

 

 

Author's Bio

Manu works as an Associate Consultant in the IT Infrastructure division of UST Global in Technopark,
Kerala, India. Before joining UST Global, Manu worked with several multinational IT companies, including
JDA Software Inc., RM PLC, and Visionics (India), maintaining their IT infrastructure and data center
resources.

MVP Mondays

The MVP Monday Series is created by Melissa Travers. In this series we work to
provide readers with a guest post from an MVP every Monday. Melissa is a
Community Program Manager for Dynamics, Excel, Office 365, Platforms and
SharePoint in the United States. She has been working with MVPs since her
early days as a Microsoft Exchange Support Engineer, when MVPs would answer
all the questions in the old newsgroups before she could get to them.

Talking Cloud with Windows Azure MVP Yves Goeleven



Windows Azure MVP Yves Goeleven

How long have you been an MVP?
I’ve been awarded for the second time now.

How did you first start in community?
I started out in the community in 2005, when the first and largest Belgian .NET-oriented user group,
Visug, was founded by some colleagues of mine.

Which technical community or communities are you most active in (where can people find you)?
Today I run the Belgian Windows Azure user group, Azug; you can always find me at any of our events.
But as we have a lot of user groups on Microsoft-related development topics here in this small country,
you can also find me at those.

What's the best technical tip you have today for implementing a cloud deployment?
The most important thing is to unlearn what you know from on-premise software development.

When considering using the Cloud, what do you tell people if they aren't sure about moving to the
Cloud?
Most people I meet are daunted by the architectural implications for their applications when they have to
be deployed in a distributed environment; especially since Belgium is a very small country, most people
were not aware of those implications. Then I kindly give them a demo of the open source framework I’m
contributing to, NServiceBus, which drastically simplifies the development of distributed applications. You
can read more about the Azure support for NServiceBus at
http://cloudshaper.wordpress.com/category/nservicebus/

Do you have a blog/website link to Cloud related tips or deployment stories you would like to
share?
My blog is full of those :-), http://cloudshaper.wordpress.com/

Speaking from your experience, what words of advice do you have for new MVPs?
First of all, enjoy it; it’s an amazing experience! And secondly, make sure that you build your network
with other MVPs and the Microsoft product group.


MS NETWORK strikes back with the help of CEE MVPs!


The beautiful city of Mostar, in Bosnia and Herzegovina, held the Microsoft Network 2.0 event on
April 4th and 5th. The conference presented a great opportunity for all local IT-oriented people
to get together, connect with each other and attend many interesting, high-quality technical
sessions. Although this conference is still young (this was its second year), it is very well
organized and has much to offer.

In the words of Country Manager Layla Zukic-Krivdic:

“Like last year, Microsoft Network 2.0 gathered a respectable representation of the BiH private
and public sectors, as well as local and international IT professionals, with the aim of presenting
the latest Microsoft and partner solutions and their use for business efficiency. With a large
number of presentations and case studies for IT professionals, developers and managers in
the fields of business productivity, infrastructure and security, and the application development
platform, Microsoft Network 2.0 is the conference not to be missed. It is a unique event where you
get to know the current Microsoft and partner solutions and IT trends, exchange
experiences, enhance knowledge and create new business contacts.

With 40 speakers (of which 13 are MVPs!), 50 technical sessions, 4 different tracks (including
a Community one for community peers only), Microsoft Network 2.0 is the biggest annual event
ever held in Bosnia and will probably have 25% more attendees compared to last year.

Here are some pictures and quotes from the local MVPs attending as speakers:

Jasmin Azerovic
Bahrudin Hrnjica
Damir Dizdarevic
“I’m an MVP for Management Infrastructure, and I
usually deliver presentations at all the conferences in
the region. Since I live in Bosnia, it was a special
pleasure for me to deliver some very interesting
sessions about Windows Server 8 and System
Center 2012 during the MS Network Conference
this year. As in the year before, we had a
whole track of sessions devoted to lectures
delivered by the MSCommunity Bosnia
members. I’m very happy to see that this year
we had a large number of MVPs from the whole
region presenting at this conference.”


Adis Jugo
“The first Microsoft NetWork conference, which was held last year in Banja Luka, was the first
Microsoft technology, business and community conference in Bosnia and Herzegovina. The
impressions and outcome of NetWork 1.0 were actually much better than anyone had expected,
and the praise came from all sides. Awareness of Microsoft technologies, Microsoft partners
and the Microsoft Community has strongly increased in Bosnia and Herzegovina after that
conference.

I expect Microsoft NetWork 2.0 to continue to drive that growth in awareness, this year especially with
the Community Track being positioned equally with the other three tracks (Business, Development, IT Pro).
Since six MVPs had sessions in this year’s community track, we have a good chance of meeting that goal:
to present the Microsoft Community as a pool of talented, enthusiastic people who are creating some great
things on Microsoft technologies, sharing knowledge, having fun and enjoying being a part of the community.

For me personally, this conference has a special importance. One year ago, at Microsoft NetWork 1.0, we
created the SharePoint User Group Bosnia and Herzegovina. This year, 12 months, 6 meetings and 10
tech sessions later, we have an established group of people, meeting on a regular basis both in person and
online. Microsoft NetWork 2.0 is the SharePoint User Group’s anniversary, but also an opportunity to look
back at the 12 months behind us, and especially to look forward into the future. My session in the
community track (“Why is the SharePoint community such a cool thing?”) also served as a SharePoint User
Group meeting, where we discussed the group’s activities, lessons learned from the last year, and set some
expectations for the coming year. We wanted to give the group members an opportunity to say what
kind of SharePoint user group they would like to have, what kind of sessions they want to hear, and where
we should go in the future.

All four Microsoft MVPs from Bosnia and Herzegovina presented sessions at Microsoft NetWork,
as well as numerous MVPs from the region (Croatia, Serbia, Romania, Slovenia, Germany). First of all,
it was great to see the support of local and regional MVPs for such a conference, and of course, it
guaranteed high quality. This definitely was a “who-is-who,” top-level conference in our region.”

And have a look here to see which other Microsoft MVPs attended the Conference as speakers this year:
it was a memorable event!

About the Author:

Alessandro Teglia is a Community Program Manager responsible for the MVP
Award for Central, Eastern Europe & Italy for Microsoft. 

You can find him on Twitter at @alead or his blog at
http://belead.me

 

Once Again, the MVP Friday Five: April 13, 2012.


We're highlighting some of the great posts our MVP Awardees have written over the past few weeks. This
week's selection concentrates on Windows Azure, Visual C#, ASP.NET and SQL Server!

 

1. Exceeding the SLA-It's About Resilience

By Windows Azure MVP Brent Stineman | @brentcodemonkey

Brent walks through the fundamentals of SLAs after finding a lack of properly architected solutions that
take advantage of cloud computing.
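
As a quick illustration of designing for resilience rather than leaning on the platform SLA alone, here is a
minimal C# sketch of my own (it is not taken from Brent's post); the exception type, retry count and the
operation being retried are all hypothetical placeholders.

using System;
using System.Threading;

// Minimal resilience sketch: retry an operation that may fail transiently,
// backing off a little longer after each attempt. The TimeoutException filter
// and the attempt count are illustrative assumptions, not a prescription.
static class TransientRetry
{
    public static T Execute<T>(Func<T> operation, int maxAttempts = 3)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                return operation();
            }
            catch (TimeoutException)
            {
                if (attempt >= maxAttempts)
                    throw; // give up after the final attempt

                // Back off 2s, 4s, 8s... before trying again.
                Thread.Sleep(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
            }
        }
    }
}

A call such as TransientRetry.Execute(() => QueryAzureTable()) (where QueryAzureTable is a hypothetical
data-access method) then survives brief outages instead of failing on the first hiccup.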

 

2. Passing Parameters to SQL Server

By Visual C# MVP David Giard | @davidgiard

SQL injection is one of the most frequently exploited vulnerabilities in the software world. It refers to
user-entered data making its way into commands sent to back-end systems. It is common because so
many developers are unaware of the risk and how to mitigate it.
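
The core of the mitigation is to send user input to SQL Server as parameters rather than concatenating
it into the command text. The sketch below is not taken from David's post; it is a minimal ADO.NET
example, and the connection string, table name and column name are hypothetical.

using System.Data.SqlClient;

// Minimal parameterized-query sketch: the user-entered value travels to SQL
// Server as data, never as part of the SQL text, so it cannot alter the command.
// "Customers"/"LastName" and the connection string are illustrative only.
static class CustomerLookup
{
    public static int CountByLastName(string connectionString, string userEnteredName)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT COUNT(*) FROM Customers WHERE LastName = @LastName", connection))
        {
            command.Parameters.AddWithValue("@LastName", userEnteredName);
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }
}

Contrast this with building the WHERE clause by string concatenation, where input such as x' OR '1'='1
changes the meaning of the query itself.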

 

3. DocShare: Illustrating the CQRS Pattern with Windows Azure and MVC4 Web API

By Windows Azure MVP David Pallmann | @davidpallmann

The Command-Query Responsibility Separation pattern (CQRS) is a recent pattern that is gaining
popularity. In this post David briefly explains the CQRS pattern and shows an example of it in a Windows
Azure solution that also uses MVC4 & ASP.NET Web API.
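
For readers new to the pattern, the fragment below is a minimal, framework-free sketch of the separation
CQRS calls for; it is not code from David's DocShare sample, and all type names are invented. Commands
that change state flow through one path, while queries return a read model shaped purely for display.

using System;
using System.Collections.Generic;

// Write side: a command describes an intended change and is handled on its own path.
public class ShareDocumentCommand
{
    public Guid DocumentId { get; set; }
    public string RecipientEmail { get; set; }
}

public interface ICommandHandler<TCommand>
{
    void Handle(TCommand command);
}

// Read side: a query interface returns a model shaped for display, not for updates.
public class DocumentSummary
{
    public Guid DocumentId { get; set; }
    public string Title { get; set; }
    public int TimesShared { get; set; }
}

public interface IDocumentQueries
{
    IEnumerable<DocumentSummary> GetDocumentsSharedWith(string email);
}

In an ASP.NET Web API front end, the command handler and the query interface would typically sit behind
separate controllers or actions, which is the shape David explores in his Windows Azure sample.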



4. Using Log Parser to List All Blocked IP Requests

By ASP.NET/IIS MVP Scott Mitchell | @scottonwriting

IIS makes blocking a series of IP addresses easy with its IPv4 Address and Domain Restrictions feature.
This lets the webmaster specify individual IP addresses, or ranges of them, that are either allowed or
denied access to the website. Scott shows us how.



5. Analytic Functions – They’re Not Aggregates

By SQL Server MVP Rob Farley | @rob_farley

SQL 2012 brings us a bunch of new analytic functions, together with enhancements to the OVER clause.
Rob is a big fan of the OVER clause and the types of things that it brings us when applied to aggregate
functions, as well as the ranking functions that it enables.  This post is going to look at a particular aspect
of the analytic functions though (not the enhancements to the OVER clause).

 

SharePoint 2010 External Content Type to read data from SQL Server using SQL Authentication and Secure Store Service


Editor's Note: The following MVP Monday post is by SharePoint MVP Destin Joy

SharePoint 2010 External Content Type to read data from SQL Server using SQL Authentication and
Secure Store Service

Most articles on BCS explain how to read data from a database using Windows authentication. However,
Windows authentication won't work in many scenarios, and we have to use SQL authentication to read or
manipulate data in SQL Server. To use SQL authentication, we have to use the Secure Store Service.

Here are the five steps to achieve this:

  • Create a dummy table
  • Configure the Secure Store Service
  • Configure the BCS service
  • Create an external content type
  • Create an external list
  1. Before starting, you should have a SQL Server table. I have created a database named
     Contact_List.
  2. Right-click the Tables node of Contact_List and click New Table.
  3. I have created a table with the fields below and saved it with the name Contact_Table.

 

No   Field Name   Type
1    Name         nchar(10)
2    Address      nchar(10)
3    Phone        nchar(10)

 

4. Now go to your Central Administration screen. Click Application Management and select
Manage service applications.

Configuring the Secure Store Service

5. From the list of service applications select Secure Store Service

 

6. Click the Generate New Key button on the ribbon at the top

 

7. Enter a pass phrase and confirm it

8. Click New on the ribbon once you are done with the key generation


9. On the next screen, enter the details below.

No   Options                  Value
1    Target Application ID    Any Name
2    Display Name             Any Name
3    Contact Email            Any Email
4    Target Application       Individual

10. Click Next. On the screen that appears, enter the details as shown below.

11. Note that you should select User Name and Password as the field types.


12. Once you are done with the details, click Next.

13. Enter the service account name on this screen, and grant permission to any users who may need to edit these details later.

14. We are almost done with the Secure Store Service. Click the SQL_BCS menu and select Set Credentials.


15. Enter the credential owner, and the SQL user name and password that will be used to retrieve data from SQL Server.

 

Creating External Content Type

16. This assumes you already have a web application where we can create the external content type.

17. Open your web application in SharePoint Designer and select External Content Type from the ribbon.

18. Once it is open in SharePoint Designer, click External Content Type as shown below.

19. Give it a proper name and display name; I have used BCS_Contacts.

20. Select an Office item type if you want to integrate the same data with Outlook.

21. Click on “Click here to discover external data source and define operations” link from External System


22. You will get the screen below; click Add Connection. Note that you should select the last option.

23. On the screen that appears, enter the details below.

No   Options                                     Value
1    Database Server Name                        Your database server name or IP
2    Database Name                               Your database name
3    Name                                        Any Name
4    Connect with Impersonated Custom Identity   The Secure Store target application ID created earlier (SQL_BCS)

24. You will be prompted for a user name and password. Enter the same credentials we entered earlier to connect to
the SQL Server database.

25. You will get the screen below, populated with the database details.

26. Right-click the table and select Create All Operations if you want to perform all operations on the database.

27. You will be prompted with the screen below.

28. Click Next.

29. If you want to add any filter, add it on the screen below.

30. Click Save at the top to save the external content type.

31. Go to Manage service applications, select the Business Connectivity service, and then click Administration.


32. Enter the ID of the user who can use this service.

33. Then click Permissions and grant permission to everyone who needs to use this service.

34. Click the Secure Store Service, then select the Metadata Store Permissions option from the ribbon.

35. Give permission to all the users who need to use this service to read data from the database.

Creating External List

36. Click on View all site content from your web application

37. Click on create

38. Select External List from the menu

39. Give proper name and click Create

40. For the external content type, select the one we created using SharePoint Designer, then click Create.


41. You can see your list populated with the data from the database.
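
As an optional extra (not part of Destin's walkthrough), once the Secure Store and BCS pieces above are in
place the external list behaves like any other SharePoint list to server-side code. The sketch below reads it
with the regular SharePoint server object model; the site URL and the list title "BCS Contacts" are
assumptions for illustration, so substitute the values used in your environment.

using System;
using Microsoft.SharePoint; // requires a reference to Microsoft.SharePoint.dll, run on the farm

// Minimal server object model sketch for reading the external list created above.
// The site URL and the list title are assumed values, not ones fixed by the article.
class ReadExternalContacts
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://yourwebapplication"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList contacts = web.Lists["BCS Contacts"];
            foreach (SPListItem item in contacts.Items)
            {
                // Columns come from the Contact_Table fields exposed by the external content type.
                Console.WriteLine("{0} | {1} | {2}",
                    item["Name"], item["Address"], item["Phone"]);
            }
        }
    }
}

Because the Secure Store target application was created with the Individual type, the SQL credentials
mapped for the user running this code are the ones used to reach the database.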

 

Author's Bio

Destin Joy is a Microsoft MVP for SharePoint Server. He is
an author, speaker and blogger on Microsoft technology. Destin is currently
in the final phase of writing his second eBook, “SharePoint 2010 Capacity
Planning,” which will be published on C# Corner.

MVP Mondays

The MVP Monday Series is created by Melissa Travers. In this series we work to
provide readers with a guest post from an MVP every Monday. Melissa is a
Community Program Manager for Dynamics, Excel, Office 365, Platforms and
SharePoint in the United States. She has been working with MVPs since her
early days as a Microsoft Exchange Support Engineer, when MVPs would answer
all the questions in the old newsgroups before she could get to them.

 

Talking Cloud Tuesday: Windows Azure MVP Ming Chung Chu


Windows Azure MVP Ming Chung Chu

How long have you been an MVP?
I have been an MVP for 9 years (2004-2012)


How did you first start in community?

My community experience started with the Programmer-Club, where I managed the “Training and
Certification” and “.NET group” forums in 2001. In 2003, I joined the Microsoft online
community newsgroups to answer questions about .NET, ASP.NET, ADO.NET, Web
Services, SQL Server, Windows Server, etc. I was awarded as an MVP for the first time,
in SQL Server, in October 2003. Currently, I am a Windows Azure MVP.


Which technical community or communities are you most active in (where can people find
you)? 
MSDN and TechNet Forum, and my blog: http://www.dotblogs.com.tw/regionbbs

What’s the best technical tip you have today for implementing a cloud deployment?
A Cloud solution differs from an on-premises solution in terms of high availability, state
management, storage and deployment strategy. Good design of the solution architecture,
and good planning, are very important for a Cloud deployment.


When considering using the Cloud, what do you tell people if they aren’t sure about moving to the
Cloud?
The Cloud environment is highly useful, extensive and low cost. Enterprise resources
can be provided by the Cloud, which makes it well suited to newly founded companies or
organizations that do not need an IT team.


Do you have a blog/website link to Cloud related Tips or deployment stories you
would like to share?
Yes. http://www.dotblogs.com.tw/regionbbs


Speaking from your experience, what words of advice do you have for new MVPs?
Enjoying sharing knowledge will bring happiness and richness.

MVP’s User Group Receives CompTIA National Award for Excellence


Congratulations to Data Center Management MVP Dave Sanders and the Carolina IT Pro Group
(CITPG) on being awarded the CompTIA National Award for Excellence. For 12 years, this
exemplary user group has contributed to helping countless people—within the technical
community and beyond.

“Our motto is Make a Difference,” explained Dave, the group’s founder and president. Their
formula is simple: offer valuable content and great door prizes and ask everyone who participates
to contribute an article of food or clothing at the door. And it’s working. Last year, CITPG’s
membership grew by nearly 750 and it contributed around 14 tons of food and clothing to the
homeless in their community.

User groups are a vital part of technical communities around the world. They provide a place
for those with an interest in technology to meet face-to-face and learn from each other. CITPG
is no different, established as a place welcoming to all—where everybody has the opportunity
to learn.

Dave tells the story of a woman who began coming to their group’s meetings, who sat in the
back and never participated. Eventually she confided to Dave that she knew nothing about
computers but would like to join the group—an idea he fully supported. As a door prize one
month, a company had contributed a full MCSE training package and, as luck would have it,
the woman who knew nothing about technology won it. “She asked me what she was going to
do with it,” Dave said. “I told her—you’re going to study and become an MCSE.” She failed the
exam twice but on her third attempt she passed. She began helping Dave out on a fairly large
contract he was working on at the time and, when her family moved to Texas, he introduced
her to some people in the local technical community. Now she’s a network administrator for a
large company.

Dave is hoping to expand CITPG’s reach by creating a recurring regional event. They’re off to
a good start. Last month they held an IT Pro Appreciation Day which attracted 200 attendees.
It boasted five other MVPs as speakers, as well as a keynote presentation by Scott Davidson,
General Manager of Microsoft Developer and Platform Evangelism, US East Region.

 


Congratulations again to Dave and the CITPG membership!

You can find out more about CITPG here: www.carolinait.org.
