SMS and email Two-Factor Authentication in ASP.NET MVC 5

Create an ASP.NET MVC app

Start by installing and running Visual Studio Express 2013 for Web or Visual Studio 2013, then install Visual Studio 2013 Update 3 or higher.

Warning: You should complete Create a secure ASP.NET MVC 5 web app with log in, email confirmation and password reset before proceeding. You must install Visual Studio 2013 Update 3 or higher to complete this tutorial.
  1. Create a new ASP.NET Web project and select the MVC template. Web Forms also supports ASP.NET Identity, so you could follow similar steps in a web forms app.
  2. Leave the default authentication as Individual User Accounts. If you’d like to host the app in Azure, leave the check box checked. Later in the tutorial we will deploy to Azure. You can open an Azure account for free.
  3. Set the project to use SSL.

Set up SMS for Two-factor authentication

This tutorial provides instructions for using either Twilio or ASPSMS but you can use any other SMS provider.

  1. Creating a User Account with an SMS provider

    Create a Twilio or an ASPSMS account.

  2. Installing additional packages or adding service references

    Twilio:
    In the Package Manager Console, enter the following command:
    Install-Package Twilio

    ASPSMS:
    The following service reference needs to be added:

    Address:
    https://webservice.aspsms.com/aspsmsx2.asmx?WSDL

    Namespace:
    ASPSMSX2

  3. Figuring out SMS Provider User credentials

    Twilio:
    From the Dashboard tab of your Twilio account, copy the Account SID and Auth token.

    ASPSMS:
    From your account settings, navigate to Userkey and copy it together with your self-defined Password.

    We will later store these values in the web.config file under the keys "SMSAccountIdentification" and "SMSAccountPassword".

  4. Specifying SenderID / Originator

    Twilio:
    From the Numbers tab, copy your Twilio phone number.

    ASPSMS:
    Within the Unlock Originators Menu, unlock one or more Originators or choose an alphanumeric Originator (Not supported by all networks).

    We will later store this value in the web.config file within the key "SMSAccountFrom".

  5. Transferring SMS provider credentials into app

    Make the credentials and sender phone number available to the app. To keep things simple, we will store these values in the web.config file. When we deploy to Azure, we can store the values securely in the app settings section on the web site's Configure tab.

    </connectionStrings>
       <appSettings>
          <add key="webpages:Version" value="3.0.0.0" />
          <!-- Markup removed for clarity. -->
          <!-- SendGrid-->
          <add key="mailAccount" value="account" />
          <add key="mailPassword" value="password" />
          <add key="SMSAccountIdentification" value="My Identification" />
          <add key="SMSAccountPassword" value="My Password" />
          <add key="SMSAccountFrom" value="+12065551234" />
       </appSettings>
      <system.web>
    Security Note: Never store sensitive data in your source code. The account and credentials are added to the code above to keep the sample simple. See Best practices for deploying passwords and other sensitive data to ASP.NET and Azure.
  6. Implementation of data transfer to SMS provider

    Configure the SmsService class in the App_Start\IdentityConfig.cs file.

    Depending on which SMS provider you use, uncomment either the Twilio or the ASPSMS section:

    public class SmsService : IIdentityMessageService
    {
        public Task SendAsync(IdentityMessage message)
        {
            // Twilio Begin
            // var Twilio = new TwilioRestClient(
            //   System.Configuration.ConfigurationManager.AppSettings["SMSAccountIdentification"],
            //   System.Configuration.ConfigurationManager.AppSettings["SMSAccountPassword"]);
            // var result = Twilio.SendMessage(
            //   System.Configuration.ConfigurationManager.AppSettings["SMSAccountFrom"],
            //   message.Destination, message.Body
            // );
            // Status is one of Queued, Sending, Sent, Failed or null if the number is not valid
            // Trace.TraceInformation(result.Status);
            // Twilio doesn't currently have an async API, so return success.
            // return Task.FromResult(0);
            // Twilio End
    
            // ASPSMS Begin 
            // var soapSms = new MvcPWx.ASPSMSX2.ASPSMSX2SoapClient("ASPSMSX2Soap");
            // soapSms.SendSimpleTextSMS(
            //   System.Configuration.ConfigurationManager.AppSettings["SMSAccountIdentification"],
            //   System.Configuration.ConfigurationManager.AppSettings["SMSAccountPassword"],
            //   message.Destination,
            //   System.Configuration.ConfigurationManager.AppSettings["SMSAccountFrom"],
            //   message.Body);
            // soapSms.Close();
            // return Task.FromResult(0);
            // ASPSMS End

            // With both provider sections commented out, the method has no return
            // statement and will not compile; remove this line after you uncomment
            // one of the sections above.
            return Task.FromResult(0);
        }
    }
  7. Update the Views\Manage\Index.cshtml Razor view. (Note: don’t just remove the comments in the existing code; use the code below.)
    @model MvcPWy.Models.IndexViewModel
    @{
       ViewBag.Title = "Manage";
    }
    <h2>@ViewBag.Title.</h2>
    <p class="text-success">@ViewBag.StatusMessage</p>
    <p>Change your account settings</p>
    <hr />
    <dl class="dl-horizontal">
        <dt>Password:</dt>
        <dd>[ @if (Model.HasPassword) { @Html.ActionLink("Change your password", "ChangePassword") } else { @Html.ActionLink("Create", "SetPassword") } ]</dd>
        <dt>External Logins:</dt>
        <dd>@Model.Logins.Count [ @Html.ActionLink("Manage", "ManageLogins") ]</dd>
        <dt>Phone Number:</dt>
        <dd>@(Model.PhoneNumber ?? "None") [ @if (Model.PhoneNumber != null) { @Html.ActionLink("Change", "AddPhoneNumber") @:  |  @Html.ActionLink("Remove", "RemovePhoneNumber") } else { @Html.ActionLink("Add", "AddPhoneNumber") } ]</dd>
        <dt>Two-Factor Authentication:</dt>
        <dd>
            @if (Model.TwoFactor)
            {
                using (Html.BeginForm("DisableTwoFactorAuthentication", "Manage", FormMethod.Post, new { @class = "form-horizontal", role = "form" }))
                {
                    @Html.AntiForgeryToken()
                    <text>Enabled <input type="submit" value="Disable" class="btn btn-link" /></text>
                }
            }
            else
            {
                using (Html.BeginForm("EnableTwoFactorAuthentication", "Manage", FormMethod.Post, new { @class = "form-horizontal", role = "form" }))
                {
                    @Html.AntiForgeryToken()
                    <text>Disabled <input type="submit" value="Enable" class="btn btn-link" /></text>
                }
            }
        </dd>
    </dl>
  8. Verify the EnableTwoFactorAuthentication and DisableTwoFactorAuthentication action methods in the ManageController have the [ValidateAntiForgeryToken] attribute:
    //
    // POST: /Manage/EnableTwoFactorAuthentication
    [HttpPost,ValidateAntiForgeryToken]
    public async Task<ActionResult> EnableTwoFactorAuthentication()
    {
        await UserManager.SetTwoFactorEnabledAsync(User.Identity.GetUserId(), true);
        var user = await UserManager.FindByIdAsync(User.Identity.GetUserId());
        if (user != null)
        {
            await SignInAsync(user, isPersistent: false);
        }
        return RedirectToAction("Index", "Manage");
    }
    //
    // POST: /Manage/DisableTwoFactorAuthentication
    [HttpPost, ValidateAntiForgeryToken]
    public async Task<ActionResult> DisableTwoFactorAuthentication()
    {
        await UserManager.SetTwoFactorEnabledAsync(User.Identity.GetUserId(), false);
        var user = await UserManager.FindByIdAsync(User.Identity.GetUserId());
        if (user != null)
        {
            await SignInAsync(user, isPersistent: false);
        }
        return RedirectToAction("Index", "Manage");
    }
  9. Run the app and log in with the account you previously registered.
  10. Click on your User ID, which activates the Index action method in the Manage controller.
  11. Click Add.
  12. The AddPhoneNumber action method displays a page where you can enter a phone number that can receive SMS messages.
    // GET: /Manage/AddPhoneNumber
    public ActionResult AddPhoneNumber()
    {
       return View();
    }
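
    For context, the POST counterpart in the default template is what actually generates the verification token and hands it to the SmsService configured above. The sketch below follows the shape of the template-generated code; names such as AddPhoneNumberViewModel and VerifyPhoneNumber come from that template:

    ```csharp
    // POST: /Manage/AddPhoneNumber (sketch of the template-generated action)
    [HttpPost]
    [ValidateAntiForgeryToken]
    public async Task<ActionResult> AddPhoneNumber(AddPhoneNumberViewModel model)
    {
        if (!ModelState.IsValid)
        {
            return View(model);
        }
        // Generate the token and send it through the SmsService
        // configured in App_Start\IdentityConfig.cs.
        var code = await UserManager.GenerateChangePhoneNumberTokenAsync(
            User.Identity.GetUserId(), model.Number);
        if (UserManager.SmsService != null)
        {
            var message = new IdentityMessage
            {
                Destination = model.Number,
                Body = "Your security code is: " + code
            };
            await UserManager.SmsService.SendAsync(message);
        }
        return RedirectToAction("VerifyPhoneNumber", new { PhoneNumber = model.Number });
    }
    ```

    You normally don't need to write this yourself; it ships with the project template, but it shows where the SmsService from step 6 is invoked.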

  13. In a few seconds you will get a text message with the verification code. Enter it and press Submit.
  14. The Manage view shows your phone number was added.

Enable two-factor authentication

In the template generated app, you need to use the UI to enable two-factor authentication (2FA). To enable 2FA, click on your user ID (email alias) in the navigation bar.

Click on enable 2FA.

Log out, then log back in. If you’ve enabled email (see my previous tutorial), you can select either SMS or email for 2FA.

The Verify Code page is displayed where you can enter the code (from SMS or email).

Checking the Remember this browser check box exempts you from needing 2FA to log in from the browser and device where you checked it. As long as malicious users can’t gain access to your device, enabling 2FA and checking Remember this browser gives you convenient one-step password access on trusted devices, while retaining strong 2FA protection for all access from non-trusted devices. You can do this on any private device you regularly use.

 

Reference: http://www.asp.net/mvc/overview/security/aspnet-mvc-5-app-with-sms-and-email-two-factor-authentication

ID-Less URL Structure Demystified – How Does nopCommerce Resolve ID-Less URLs?

I believe many of you nopCommerce pro users and developers are aware that nopCommerce 2.70 and 2.80 have employed cleaner URLs compared to the previous versions. From URLs suffixed with ‘.aspx’ in versions 1.XX, to extensionless but rather verbose URLs in versions 2.65 and below, we have seen a lot of changes in the URL structure in nopCommerce. However, none of them are as mysterious as the URLs in 2.70 and 2.80. Why? Because nopCommerce seems to know the magic to convert any arbitrary text to an integer ID.

For example, the link for my NopLite – nopCommerce Responsive Theme is http://www.pronopcommerce.com/noplite-nopcommerce-responsive-theme. You don’t see ANY integer in the URL, but nopCommerce somehow knows how to map from the URL to the appropriate ID. On the other hand, the nopCommerce 2.65 URL for my NopLite theme would have been: http://www.pronopcommerce.com/p/7/noplite-nopcommerce-responsive-theme. Note the ‘7’ somewhere in between the URL; that’s the integer Product ID.

So the question is, how do nopCommerce 2.70 and 2.80 know the ID when it never appears in the URL?

The UrlRecord Database Table

Well, the information is actually stored in a database table called UrlRecord. The table stores the slugs of entities to be mapped. A slug is any URL-friendly text and must be unique per nopCommerce installation. Then there is the EntityId column, which maps back to the actual entity represented by the slug. Last but not least, the EntityName column tells nopCommerce the actual entity type (Category, Product, BlogPost, etc.) that an EntityId represents.

The nopCommerce database table UrlRecord stores the information of URL Slugs
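
In code, a row of this table surfaces as a simple entity class. The sketch below shows roughly what nopCommerce's UrlRecord entity looks like; property names follow the table columns described above, and any details beyond those columns (such as IsActive) are my reading of the nopCommerce source, not guaranteed to match your version exactly:

```csharp
// Rough sketch of the entity mapped to the UrlRecord table.
public class UrlRecord
{
    public int Id { get; set; }

    // Which entity this slug points at, e.g. the product with Id 7.
    public int EntityId { get; set; }

    // The entity type: "Product", "Category", "BlogPost", etc.
    public string EntityName { get; set; }

    // The URL-friendly text, unique per installation,
    // e.g. "noplite-nopcommerce-responsive-theme".
    public string Slug { get; set; }

    // Old slugs can be kept but marked inactive.
    public bool IsActive { get; set; }
}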

This table, although useful, is only one part of the equation. We have stored the information, then there must be a way to connect the dots to somehow retrieve the information from the database, and map it with the URLs. The next part of the “magic” lies in the code.

Connecting the Dots – GenericUrlRouteProvider, GenericPathRoute and GenericPathRouteExtensions

First of all, let’s open Nop.Web.Framework.Seo.GenericPathRoute.cs, and you’ll see something like below:

Code snippet from GenericPathRoute.cs showing how it retrieves UrlRecord from the database

Basically what the GenericPathRoute class does is retrieve the RouteData information from the HttpRequest, extract the slug, and compare it with the database record (remember our UrlRecord database table?). If it finds an active existing record, it then provides additional values to the RouteData (see figure below) such as the Controller, the Action and the ID. In short, GenericPathRoute.cs encapsulates the logic that glues together the three pieces: the UrlRecord database table, the actual Controller & Action responsible for producing the HTML result, and any other parameters required for the Action to perform correctly.

Code snippet showing how GenericPathRoute.cs class adds the required RouteData
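
In outline, the route override looks something like the sketch below. This is a simplified illustration of the idea, not the exact nopCommerce source; the _urlRecordService field and the "generic_se_name" route value name are my placeholders for what the real class uses:

```csharp
// Simplified sketch of GenericPathRoute.GetRouteData.
public override RouteData GetRouteData(HttpContextBase httpContext)
{
    RouteData data = base.GetRouteData(httpContext);
    if (data == null)
        return null;

    // 1. Extract the slug from the incoming ID-less URL.
    var slug = data.Values["generic_se_name"] as string;

    // 2. Look it up in the UrlRecord table.
    var urlRecord = _urlRecordService.GetBySlug(slug);
    if (urlRecord == null || !urlRecord.IsActive)
        return data;

    // 3. Fill in the controller, action and id so MVC can dispatch the request.
    switch (urlRecord.EntityName.ToLowerInvariant())
    {
        case "product":
            data.Values["controller"] = "Catalog";
            data.Values["action"] = "Product";
            data.Values["productid"] = urlRecord.EntityId;
            break;
        case "category":
            data.Values["controller"] = "Catalog";
            data.Values["action"] = "Category";
            data.Values["categoryid"] = urlRecord.EntityId;
            break;
        // ... BlogPost, NewsItem, Topic, and so on.
    }
    return data;
}
```

The important point is the shape: slug in, UrlRecord lookup, RouteData values out.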

But we are still missing one thing – we need to actually tell MVC to map the ID-less URLs to our freshly baked GenericPathRoute class. In other words, we have to let the MVC routing engine know that when any ID-less URL comes in, GenericPathRoute will do the heavy lifting of determining which Controller and Action to call and with what parameters. The figure below shows the GenericUrlRouteProvider class (found in Nop.Web.Infrastructure.GenericUrlRouteProvider.cs) doing exactly this job. See the lines around the MapGenericPathRoute() method. The MapGenericPathRoute() method can be found in Nop.Web.Framework.Seo.GenericPathRouteExtensions.cs.

GenericUrlRouteProvider doing the mapping between ID-less URLs and GenericPathRoute class

Conclusion – There is Actually No Magic in nopCommerce ID-less URLs

Yep, the whole architecture behind nopCommerce ID-less URLs is pretty clever, but there is really no magic in it. To recap, here is what makes up the ID-less URL architecture:

  • UrlRecord database table – to store the mapping between a slug and the actual entity
  • GenericPathRoute class – to map a slug with the actual entity with the help of UrlRecord, thereby providing the RouteData to MVC’s routing engine
  • GenericUrlRouteProvider class – to tell MVC’s routing engine to let GenericPathRoute class handle ID-less URLs

Hope this explains the issue! Have any other topics that you want explained? Let me know in the comments, or better yet, use the UserVoice feedback widget at the right side to tell me your ideas! 🙂

 

Reference: http://www.pronopcommerce.com/nopcommerce-id-less-url-structure-demystified-how-does-nopcommerce-270-and-280-resolve-urls

Host a Workflow Service with Windows Server App Fabric

Hosting workflow services in App Fabric is similar to hosting under IIS/WAS. The only difference is the tools App Fabric provides for deploying, monitoring, and managing workflow services. This topic uses the workflow service created in Creating a Long-running Workflow Service, which walks you through creating the workflow service; this topic explains how to host it using App Fabric. For more information about Windows Server App Fabric, see Windows Server App Fabric Documentation.

Before completing the steps below, make sure you have Windows Server App Fabric installed. To check, open Internet Information Services Manager (inetmgr.exe), click your server name in the Connections view, click Sites, and click Default Web Site. In the right-hand side of the screen you should see a section called App Fabric (at the top of the right-hand pane); if you don’t see it, App Fabric is not installed. For more information about installing Windows Server App Fabric, see Installing Windows Server App Fabric.

Creating a Simple Workflow Service

  1. Open Visual Studio 2012 and load the OrderProcessing solution you created in the Creating a Long-running Workflow Service topic.

  2. Right click the OrderService project and select Properties and select the Web tab.

  3. In the Start Action section of the property page select Specific Page and type Service1.xamlx in the edit box.

  4. In the Servers section of the property page select Use Local IIS Web Server and type in the following URL: http://localhost/OrderService.

  5. Click the Create Virtual Directory button. This will create a new virtual directory and set up the project to copy the needed files to the virtual directory when the project is built. Alternatively you could manually copy the .xamlx, the web.config, and any needed DLLs to the virtual directory.

Configuring a Workflow Service Hosted in Windows Server App Fabric

  1. Open Internet Information Services Manager (inetmgr.exe).

  2. Navigate to the OrderService virtual directory in the Connections pane.

  3. Right click OrderService and select Manage WCF and WF Services > Configure…. The Configure WCF and WF for Application dialog box is displayed.

  4. Select the General tab to display general information about the application as shown in the following screen shot.

    General tab of the App Fabric Configuration dialog

  5. Select the Monitoring tab. This shows various monitoring settings as shown in the following screen shot.

    App Fabric Configuration Monitoring tab

    For more information about configuring workflow service monitoring in App Fabric see Configuring monitoring with App Fabric.

  6. Select the Workflow Persistence tab. This allows you to configure your application to use App Fabric’s default persistence provider as shown in the following screen shot.

    App Fabric Configuration - Persistence

    For more information about configuring workflow persistence in Windows Server App Fabric see Configuring Workflow Persistence in App Fabric.
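
    The Workflow Persistence tab is surfacing the SQL instance store that ships with .NET 4. As a point of reference, if you were self-hosting the same workflow service in a console application instead of App Fabric, the equivalent wiring would look roughly like the sketch below; the connection string and the LoadWorkflowDefinition helper are placeholders, not part of the tutorial:

    ```csharp
    using System;
    using System.Activities;
    using System.Activities.DurableInstancing;
    using System.ServiceModel.Activities;

    class SelfHostSketch
    {
        static void Main()
        {
            // Placeholder: load the root activity of your service
            // (e.g. the definition behind Service1.xamlx).
            Activity workflow = LoadWorkflowDefinition();

            var host = new WorkflowServiceHost(workflow,
                new Uri("http://localhost/OrderService"));

            // Attach the same SQL persistence store that the App Fabric
            // Workflow Persistence tab configures for IIS-hosted services.
            host.DurableInstancingOptions.InstanceStore = new SqlWorkflowInstanceStore(
                @"Server=.\SQLEXPRESS;Initial Catalog=InstanceStore;Integrated Security=SSPI");

            host.Open();
            Console.WriteLine("Service running. Press ENTER to exit.");
            Console.ReadLine();
            host.Close();
        }

        // Hypothetical helper; a real app would load the .xamlx definition here.
        static Activity LoadWorkflowDefinition()
        {
            throw new NotImplementedException();
        }
    }
    ```

    With App Fabric you never write this code; the dialog writes the corresponding settings into the application's web.config for you.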

  7. Select the Workflow Host Management tab. This allows you to specify when idle workflow service instances should be unloaded and persisted as shown in the following screen shot.

    App Fabric Configuration  Workflow Host Management

    For more information about workflow host management configuration see Configuring Workflow Host Management in App Fabric.

  8. Select the Auto-Start tab. This allows you to specify auto-start settings for the workflow services in the application as shown in the following screen shot.

    App Fabric Auto-start configuration

    For more information about configuring Auto-Start see Configuring Auto-Start with App Fabric.

  9. Select the Throttling tab. This allows you to configure throttling settings for the workflow service as shown in the following screen shot.

    App Fabric configuration throttling

    For more information about configuring throttling see Configuring Throttling with App Fabric.

  10. Select the Security tab. This allows you to configure security settings for the application as shown in the following screen shot.

    App Fabric Security Configuration

    For more information about configuring security with Windows Server App Fabric see Configuring Security with App Fabric.

Using Windows Server App Fabric

  1. Build the solution to copy the necessary files to the virtual directory.

  2. Right click the OrderClient project and select Debug > Start New Instance to launch the client application.

  3. The client will run, and Visual Studio will display an Attach Security Warning dialog box; click the Don’t Attach button. This tells Visual Studio not to attach to the IIS process for debugging.

  4. The client application will immediately call the Workflow service and then wait. The workflow service will go idle and be persisted. You can verify this by starting Internet Information Services (inetmgr.exe), navigating to the OrderService in the Connections pane and selecting it. Next, click the App Fabric Dashboard icon in the right-hand pane. Under Persisted WF Instances you will see there is one persisted workflow service instance as shown in the following screen shot.

    App Fabric Dashboard

    The WF Instance History lists information about the workflow service such as the number of workflow service activations, the number of workflow service instance completions, and the number of workflow instances with failures. Under Active or Idle instances a link will be displayed; clicking the link displays more information about the idle workflow instances as shown in the following screen shot.

    Persisted Workflow Instance Details

    For more information about Windows Server App Fabric features and how to use them, see Windows Server App Fabric Hosting Features.

Reference: http://msdn.microsoft.com/en-us/library/ff729689(v=vs.110).aspx

Why Katana & OWIN – Why Now?

Regardless of whether one is discussing a developer framework or an end-user product, it’s important to understand the underlying motivations for creating the product – and part of that includes knowing who the product was created for. ASP.NET was originally created with two customers in mind.

The first group of customers was classic ASP developers. At the time, ASP was one of the primary technologies for creating dynamic, data-driven Web sites and applications by interweaving markup and server-side script. The ASP runtime supplied server-side script with a set of objects that abstracted core aspects of the underlying HTTP protocol and Web server and provided access to additional services such as session and application state management, caching, etc. While powerful, classic ASP applications became a challenge to manage as they grew in size and complexity. This was largely due to the lack of structure found in scripting environments, coupled with the duplication of code resulting from the interleaving of code and markup. In order to capitalize on the strengths of classic ASP while addressing some of its challenges, ASP.NET took advantage of the code organization provided by the object-oriented languages of the .NET Framework while also preserving the server-side programming model to which classic ASP developers had grown accustomed.

The second group of target customers for ASP.NET was Windows business application developers. Unlike classic ASP developers, who were accustomed to writing HTML markup and the code to generate more HTML markup, WinForms developers (like the VB6 developers before them) were accustomed to a design-time experience that included a canvas and a rich set of user interface controls. The first version of ASP.NET – also known as “Web Forms” – provided a similar design-time experience along with a server-side event model for user interface components and a set of infrastructure features (such as ViewState) to create a seamless developer experience between client and server side programming. Web Forms effectively hid the Web’s stateless nature under a stateful event model that was familiar to WinForms developers.

Challenges Raised by the Historical Model

The net result was a mature, feature-rich runtime and developer programming model. However, with that feature-richness came a couple of notable challenges. Firstly, the framework was monolithic, with logically disparate units of functionality being tightly coupled in the same System.Web.dll assembly (for example, the core HTTP objects with the Web Forms framework). Secondly, ASP.NET was included as a part of the larger .NET Framework, which meant that the time between releases was on the order of years. This made it difficult for ASP.NET to keep pace with all of the changes happening in rapidly evolving Web development. Finally, System.Web.dll itself was coupled in a few different ways to a specific Web hosting option: Internet Information Services (IIS).

Evolutionary steps: ASP.NET MVC and ASP.NET Web API

And lots of change was happening in Web development! Web applications were increasingly being developed as a series of small, focused components rather than large frameworks. The number of components as well as the frequency with which they were released was increasing at an ever faster rate. It was clear that keeping pace with the Web would require frameworks to get smaller, decoupled and more focused rather than larger and more feature-rich. The ASP.NET team therefore took several evolutionary steps to enable ASP.NET as a family of pluggable Web components rather than a single framework.

One of the early drivers was the rise in popularity of the well-known model-view-controller (MVC) design pattern, thanks to Web development frameworks like Ruby on Rails. This style of building Web applications gave the developer greater control over her application’s markup while still preserving the separation of markup and business logic, which was one of the initial selling points for ASP.NET. To meet the demand for this style of Web application development, Microsoft took the opportunity to position itself better for the future by developing ASP.NET MVC out of band (and not including it in the .NET Framework). ASP.NET MVC was released as an independent download. This gave the engineering team the flexibility to deliver updates much more frequently than had been previously possible.

Another major shift in Web application development was the shift from dynamic, server-generated Web pages to static initial markup with dynamic sections of the page generated from client-side script communicating with backend Web APIs through AJAX requests. This architectural shift helped propel the rise of Web APIs, and the development of the ASP.NET Web API framework. As in the case of ASP.NET MVC, the release of ASP.NET Web API provided another opportunity to evolve ASP.NET further as a more modular framework. The engineering team took advantage of the opportunity and built ASP.NET Web API such that it had no dependencies on any of the core framework types found in System.Web.dll. This enabled two things: first, it meant that ASP.NET Web API could evolve in a completely self-contained manner (and it could continue to iterate quickly because it is delivered via NuGet). Second, because there were no external dependencies to System.Web.dll, and therefore, no dependencies to IIS, ASP.NET Web API included the capability to run in a custom host (for example, a console application, Windows service, etc.)

The Future: A Nimble Framework

By decoupling framework components from one another and then releasing them on NuGet, frameworks could now iterate more independently and more quickly. Additionally, the power and flexibility of Web API’s self-hosting capability proved very attractive to developers who wanted a small, lightweight host for their services. It proved so attractive, in fact, that other frameworks also wanted this capability, and this surfaced a new challenge in that each framework ran in its own host process on its own base address and needed to be managed (started, stopped, etc.) independently. A modern Web application generally supports static file serving, dynamic page generation, Web API, and more recently real-time/push notifications. Expecting that each of these services should be run and managed independently was simply not realistic.

What was needed was a single hosting abstraction that would enable a developer to compose an application from a variety of different components and frameworks, and then run that application on a supporting host.

The Open Web Interface for .NET (OWIN)

Inspired by the benefits achieved by Rack in the Ruby community, several members of the .NET community set out to create an abstraction between Web servers and framework components. Two design goals for the OWIN abstraction were that it be simple and that it take the fewest possible dependencies on other framework types. These two goals help ensure:

  • New components could be more easily developed and consumed.
  • Applications could be more easily ported between hosts and potentially entire platforms/operating systems.

The resulting abstraction consists of two core elements. The first is the environment dictionary. This data structure is responsible for storing all of the state necessary for processing an HTTP request and response, as well as any relevant server state. The environment dictionary is defined as follows:

IDictionary<string, object>

An OWIN-compatible Web server is responsible for populating the environment dictionary with data such as the body streams and header collections for an HTTP request and response. It is then the responsibility of the application or framework components to populate or update the dictionary with additional values and write to the response body stream.

In addition to specifying the type for the environment dictionary, the OWIN specification defines a list of core dictionary key value pairs. For example, the following table shows the required dictionary keys for an HTTP request:

Key Name | Value Description
"owin.RequestBody" | A Stream with the request body, if any. Stream.Null MAY be used as a placeholder if there is no request body. See Request Body.
"owin.RequestHeaders" | An IDictionary<string, string[]> of request headers. See Headers.
"owin.RequestMethod" | A string containing the HTTP request method of the request (e.g., "GET", "POST").
"owin.RequestPath" | A string containing the request path. The path MUST be relative to the “root” of the application delegate; see Paths.
"owin.RequestPathBase" | A string containing the portion of the request path corresponding to the “root” of the application delegate; see Paths.
"owin.RequestProtocol" | A string containing the protocol name and version (e.g., "HTTP/1.0" or "HTTP/1.1").
"owin.RequestQueryString" | A string containing the query string component of the HTTP request URI, without the leading “?” (e.g., "foo=bar&baz=quux"). The value may be an empty string.
"owin.RequestScheme" | A string containing the URI scheme used for the request (e.g., "http", "https"); see URI Scheme.

The second key element of OWIN is the application delegate. This is a function signature which serves as the primary interface between all components in an OWIN application. The definition for the application delegate is as follows:

Func<IDictionary<string, object>, Task>

The application delegate, then, is simply an implementation of the Func delegate type, where the function accepts the environment dictionary as input and returns a Task. This design has several implications for developers:

  • There are a very small number of type dependencies required in order to write OWIN components. This greatly increases the accessibility of OWIN to developers.
  • The asynchronous design enables the abstraction to be efficient with its handling of computing resources, particularly in more I/O intensive operations.
  • Because the application delegate is an atomic unit of execution and because the environment dictionary is carried as a parameter on the delegate, OWIN components can be easily chained together to create complex HTTP processing pipelines.
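
To make these two elements concrete, here is a small sketch (mine, not from the OWIN specification): an application delegate that reads request keys from the environment dictionary and writes to the response body stream, plus a logging component chained in front of it. The AppFunc alias is a common community convention for the delegate type:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;

using AppFunc = System.Func<
    System.Collections.Generic.IDictionary<string, object>,
    System.Threading.Tasks.Task>;

public static class OwinSketch
{
    // The application: reads request keys, writes the response body.
    public static Task App(IDictionary<string, object> environment)
    {
        var method = (string)environment["owin.RequestMethod"];
        var path = (string)environment["owin.RequestPath"];
        var responseBody = (Stream)environment["owin.ResponseBody"];

        byte[] bytes = Encoding.UTF8.GetBytes("You sent a " + method + " to " + path);
        return responseBody.WriteAsync(bytes, 0, bytes.Length);
    }

    // A middleware component: takes the next delegate, returns a new delegate.
    public static AppFunc Logger(AppFunc next)
    {
        return async environment =>
        {
            Console.WriteLine("Request: " + environment["owin.RequestPath"]);
            await next(environment);
        };
    }
}

// Chaining: an OWIN server would invoke pipeline(environment) once per request.
// AppFunc pipeline = OwinSketch.Logger(OwinSketch.App);
```

Because each component has the same shape, any number of them can be composed this way into an HTTP processing pipeline.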

From an implementation perspective, OWIN is a specification (http://owin.org/spec/owin-1.0.0.html). Its goal is not to be the next Web framework, but rather a specification for how Web frameworks and Web servers interact.

If you’ve investigated OWIN or Katana, you may also have noticed the Owin NuGet package and Owin.dll. This library contains a single interface, IAppBuilder, which formalizes and codifies the startup sequence described in section 4 of the OWIN specification. While not required in order to build OWIN servers, the IAppBuilder interface provides a concrete reference point, and it is used by the Katana project components.

Project Katana

Whereas both the OWIN specification and Owin.dll are community owned and community run open source efforts, the Katana project represents the set of OWIN components that, while still open source, are built and released by Microsoft. These components include both infrastructure components, such as hosts and servers, as well as functional components, such as authentication components and bindings to frameworks such as SignalR and ASP.NET Web API. The project has the following three high level goals:

  • Portable – Components should be able to be easily substituted for new components as they become available. This includes all types of components, from the framework to the server and host. The implication of this goal is that third party frameworks can seamlessly run on Microsoft servers while Microsoft frameworks can potentially run on third party servers and hosts.
  • Modular/flexible – Unlike many frameworks which include a myriad of features that are turned on by default, Katana project components should be small and focused, giving control over to the application developer in determining which components to use in her application.
  • Lightweight/performant/scalable – By breaking the traditional notion of a framework into a set of small, focused components which are added explicitly by the application developer, a resulting Katana application can consume fewer computing resources, and as a result, handle more load, than with other types of servers and frameworks. As the requirements of the application demand more features from the underlying infrastructure, those can be added to the OWIN pipeline, but that should be an explicit decision on the part of the application developer. Additionally, the substitutability of lower level components means that as they become available, new high performance servers can seamlessly be introduced to improve the performance of OWIN applications without breaking those applications.

Getting Started with Katana Components

When it was first introduced, one aspect of the Node.js framework that immediately drew people’s attention was the simplicity with which one could author and run a Web server. If the Katana project’s goals were framed in light of Node.js, one might summarize them by saying that Katana brings many of the benefits of Node.js (and frameworks like it) without forcing the developer to throw out everything she knows about developing ASP.NET Web applications. For this statement to hold true, getting started with the Katana project should be just as simple as it is with Node.js.

Creating “Hello World!”

One notable difference between JavaScript and .NET development is the presence (or absence) of a compiler. As such, the starting point for a simple Katana server is a Visual Studio project. However, we can start with the most minimal of project types: the Empty ASP.NET Web Application.

Next, we will install the Microsoft.Owin.Host.SystemWeb NuGet package into the project. This package provides an OWIN server that runs in the ASP.NET request pipeline. It can be found on the NuGet gallery and can be installed using either the Visual Studio package manager dialog or the package manager console with the following command:

install-package Microsoft.Owin.Host.SystemWeb

Installing the Microsoft.Owin.Host.SystemWeb package will install a few additional packages as dependencies. One of those dependencies is Microsoft.Owin, a library which provides several helper types and methods for developing OWIN applications. We can use those types to quickly write the following “hello world” server.

using Owin;

public class Startup
{
   public void Configuration(IAppBuilder app)
   {
      // Run registers a terminal handler that is invoked for every request.
      app.Run(context =>
      {
         context.Response.ContentType = "text/plain";
         return context.Response.WriteAsync("Hello World!");
      });
   }
}

This very simple Web server can now be run using Visual Studio’s F5 command and includes full support for debugging.

Switching hosts

By default, the previous “hello world” example runs in the ASP.NET request pipeline, which uses System.Web in the context of IIS. This can by itself add tremendous value, as it enables us to combine the flexibility and composability of an OWIN pipeline with the management capabilities and overall maturity of IIS. However, there may be cases where the benefits provided by IIS are not required and the desire is for a smaller, more lightweight host. What is needed, then, to run our simple Web server outside of IIS and System.Web?

To illustrate the portability goal, moving from a Web-server host to a command line host requires simply adding the new server and host dependencies to the project’s output folder and then starting the host. In this example, we’ll host our Web server in a Katana host called OwinHost.exe and will use the Katana HttpListener-based server. As with the other Katana components, these are acquired from NuGet using the following command:

install-package OwinHost

From the command line, we can then navigate to the project root folder and simply run the OwinHost.exe (which was installed in the tools folder of its respective NuGet package). By default, OwinHost.exe is configured to look for the HttpListener-based server and so no additional configuration is needed. Navigating in a Web browser to http://localhost:5000/ shows the application now running through the console.

Katana Architecture

The Katana component architecture divides an application into four logical layers, as depicted below: host, server, middleware, and application. The component architecture is factored in such a way that implementations of these layers can be easily substituted, in many cases, without requiring recompilation of the application.

Host

The host is responsible for:

  •  Managing the underlying process.
  • Orchestrating the workflow that results in the selection of a server and the construction of an OWIN pipeline through which requests will be handled.

At present, there are 3 primary hosting options for Katana-based applications:

IIS/ASP.NET: Using the standard HttpModule and HttpHandler types, OWIN pipelines can run on IIS as a part of an ASP.NET request flow. ASP.NET hosting support is enabled by installing the Microsoft.Owin.Host.SystemWeb NuGet package into a Web application project. Additionally, because IIS acts as both a host and a server, the OWIN server/host distinction is conflated in this NuGet package, meaning that if using the SystemWeb host, a developer cannot substitute an alternate server implementation.

Custom Host: The Katana component suite gives a developer the ability to host applications in her own custom process, whether that is a console application, Windows service, etc. This capability looks similar to the self-host capability provided by Web API. The following example shows Web API running in its own custom host:

using System;
using System.Web.Http;
using System.Web.Http.SelfHost;

static void Main()
{
    var baseAddress = new Uri("http://localhost:5000");

    var config = new HttpSelfHostConfiguration(baseAddress);
    config.Routes.MapHttpRoute("default", "{controller}");

    using (var svr = new HttpSelfHostServer(config))
    {
        svr.OpenAsync().Wait();
        Console.WriteLine("Press Enter to quit.");
        Console.ReadLine();
    }
}

The self-host setup for a Katana application is similar:

using System;
using Microsoft.Owin.Hosting;

static void Main(string[] args)
{
    const string baseUrl = "http://localhost:5000/";

    using (WebApplication.Start<Startup>(new StartOptions { Url = baseUrl }))
    {
        Console.WriteLine("Press Enter to quit.");
        Console.ReadLine();
    }
}

One notable difference between the Web API and Katana self-host examples is that the Web API configuration code is missing from the Katana self-host example. In order to enable both portability and composability, Katana separates the code that starts the server from the code that configures the request processing pipeline. The code that configures Web API, then, is contained in the class Startup, which is additionally specified as the type parameter in WebApplication.Start.

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var config = new HttpConfiguration();
        config.Routes.MapHttpRoute("default", "{controller}");
        app.UseWebApi(config);
    }
}

The startup class will be discussed in greater detail later in the article. However, the code required to start a Katana self-host process looks strikingly similar to the code that you may be using today in ASP.NET Web API self-host applications.

OwinHost.exe: While some will want to write a custom process to run Katana Web applications, many would prefer to simply launch a pre-built executable that can start a server and run their application. For this scenario, the Katana component suite includes OwinHost.exe. When run from within a project’s root directory, this executable will start a server (it uses the HttpListener server by default) and use conventions to find and run the user’s startup class. For more granular control, the executable provides a number of additional command line parameters.

Server

While the host is responsible for starting and maintaining the process within which the application runs, the responsibility of the server is to open a network socket, listen for requests, and send them through the pipeline of OWIN components specified by the user (as you may have already noticed, this pipeline is specified in the application developer’s Startup class). Currently, the Katana project includes two server implementations:

  • Microsoft.Owin.Host.SystemWeb: As previously mentioned, IIS in concert with the ASP.NET pipeline acts as both a host and a server. Therefore, when choosing this hosting option, IIS both manages host-level concerns such as process activation and listens for HTTP requests. For ASP.NET Web applications, it then sends the requests into the ASP.NET pipeline. The Katana SystemWeb host registers an ASP.NET HttpModule and HttpHandler to intercept requests as they flow through the HTTP pipeline and send them through the user-specified OWIN pipeline.
  • Microsoft.Owin.Host.HttpListener: As its name indicates, this Katana server uses the .NET Framework’s HttpListener class to open a socket and send requests into a developer-specified OWIN pipeline. This is currently the default server selection for both the Katana self-host API and OwinHost.exe.

Middleware/framework

As previously mentioned, when the server accepts a request from a client, it is responsible for passing it through a pipeline of OWIN components, which are specified by the developer’s startup code. These pipeline components are known as middleware.
At a very basic level, an OWIN middleware component simply needs to implement the OWIN application delegate so that it is callable.

Func<IDictionary<string, object>, Task>
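
To make that signature concrete, here is a sketch of a timing middleware written directly against the raw delegate, with no Katana helper types; the environment keys used ("owin.RequestMethod", "owin.RequestPath") are those defined by the OWIN specification:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// The OWIN application delegate, conventionally aliased as "AppFunc".
using AppFunc = System.Func<System.Collections.Generic.IDictionary<string, object>,
                            System.Threading.Tasks.Task>;

public static class RawTimingMiddleware
{
    // Wraps the next AppFunc in the pipeline and logs how long it took.
    public static AppFunc Create(AppFunc next)
    {
        return async environment =>
        {
            var watch = System.Diagnostics.Stopwatch.StartNew();
            await next(environment);
            watch.Stop();
            Console.WriteLine("{0} {1} handled in {2} ms",
                environment["owin.RequestMethod"],
                environment["owin.RequestPath"],
                watch.ElapsedMilliseconds);
        };
    }
}
```

Because any component reducible to this delegate can participate in the pipeline, frameworks and simple wrappers alike compose freely.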

However, in order to simplify the development and composition of middleware components, Katana supports a handful of conventions and helper types for middleware components. The most common of these is the OwinMiddleware class. A custom middleware component built using this class would look similar to the following:

public class LoggerMiddleware : OwinMiddleware
{
    private readonly ILog _logger;

    public LoggerMiddleware(OwinMiddleware next, ILog logger) : base(next)
    {
        _logger = logger;
    }

    public override async Task Invoke(IOwinContext context)
    {
        _logger.LogInfo("Middleware begin");
        await this.Next.Invoke(context);
        _logger.LogInfo("Middleware end");
    }
}

This class derives from OwinMiddleware, implements a constructor that accepts an instance of the next middleware in the pipeline as one of its arguments, and then passes it to the base constructor. Additional arguments used to configure the middleware are also declared as constructor parameters after the next middleware parameter.

At runtime, the middleware is executed via the overridden Invoke method. This method takes a single argument of type OwinContext. This context object is provided by the Microsoft.Owin NuGet package described earlier and provides strongly-typed access to the request, response and environment dictionary, along with a few additional helper types.
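
As a sketch of that strongly-typed access (member names as exposed by the Microsoft.Owin package; verify them against the version installed in your project):

```csharp
using System.Threading.Tasks;
using Microsoft.Owin;

public class AuditMiddleware : OwinMiddleware
{
    public AuditMiddleware(OwinMiddleware next) : base(next) { }

    public override async Task Invoke(IOwinContext context)
    {
        // Typed accessors over the raw OWIN environment dictionary.
        string method = context.Request.Method;
        await Next.Invoke(context);
        int status = context.Response.StatusCode;

        // The underlying dictionary remains reachable when a raw key is needed.
        object owinVersion = context.Environment["owin.Version"];
        System.Diagnostics.Trace.WriteLine(
            method + " -> " + status + " (OWIN " + owinVersion + ")");
    }
}
```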

The middleware class can be easily added to the OWIN pipeline in the application startup code as follows:

public class Startup
{
   public void Configuration(IAppBuilder app)
   {
      app.Use<LoggerMiddleware>(new TraceLogger());

   }
}

Because the Katana infrastructure simply builds up a pipeline of OWIN middleware components, and because the components simply need to support the application delegate to participate in the pipeline, middleware components can range in complexity from simple loggers to entire frameworks like ASP.NET, Web API, or SignalR. For example, adding ASP.NET Web API to the previous OWIN pipeline requires adding the following startup code:

public class Startup
{
   public void Configuration(IAppBuilder app)
   {
      app.Use<LoggerMiddleware>(new TraceLogger());

      var config = new HttpConfiguration();
      // configure Web API 
      app.UseWebApi(config);

      // additional middleware registrations            
   }
}

The Katana infrastructure will build the pipeline of middleware components based on the order in which they were added to the IAppBuilder object in the Configuration method. In our example, then, LoggerMiddleware can handle all requests that flow through the pipeline, regardless of how those requests are ultimately handled. This enables powerful scenarios where a middleware component (e.g. an authentication component) can process requests for a pipeline that includes multiple components and frameworks (e.g. ASP.NET Web API, SignalR, and a static file server).

Applications

As illustrated by the previous examples, OWIN and the Katana project should not be thought of as a new application programming model, but rather as an abstraction to decouple application programming models and frameworks from server and hosting infrastructure. For example, when building Web API applications, the developer framework will continue to use the ASP.NET Web API framework, irrespective of whether or not the application runs in an OWIN pipeline using components from the Katana project. The one place where OWIN-related code will be visible to the application developer will be the application startup code, where the developer composes the OWIN pipeline. In the startup code, the developer will register a series of UseXx statements, generally one for each middleware component that will process incoming requests. This experience will have the same effect as registering HTTP modules in the current System.Web world. Typically, a larger framework middleware, such as ASP.NET Web API or SignalR will be registered at the end of the pipeline. Cross-cutting middleware components, such as those for authentication or caching, are generally registered towards the beginning of the pipeline so that they will process requests for all of the frameworks and components registered later in the pipeline. This separation of the middleware components from each other and from the underlying infrastructure components enables the components to evolve at different velocities while ensuring that the overall system remains stable.

Components – NuGet Packages

Like many current libraries and frameworks, the Katana project components are delivered as a set of NuGet packages. For the upcoming version 2.0, the Katana package dependency graph is described below.

Nearly every package in the Katana project depends, directly or indirectly, on the Owin package. You may remember that this is the package that contains the IAppBuilder interface, which provides a concrete implementation of the application startup sequence described in section 4 of the OWIN specification. Additionally, many of the packages depend on Microsoft.Owin, which provides a set of helper types for working with HTTP requests and responses. The remaining packages can be classified as either hosting infrastructure packages (servers or hosts) or middleware. Package dependencies that are external to the Katana project are displayed in orange in the dependency graph.

The hosting infrastructure for Katana 2.0 includes both the SystemWeb and HttpListener-based servers, the OwinHost package for running OWIN applications using OwinHost.exe, and the Microsoft.Owin.Hosting package for self-hosting OWIN applications in a custom host (e.g. console application, Windows service, etc.).

For Katana 2.0, the middleware components are primarily focused on providing different means of authentication. One additional middleware component for diagnostics is provided, which enables support for a start and error page. As OWIN grows into the de facto hosting abstraction, the ecosystem of middleware components, both those developed by Microsoft and third parties, will also grow in number.

Conclusion

From its beginning, the Katana project’s goal has not been to create and thereby force developers to learn yet another Web framework. Rather, the goal has been to create an abstraction to give .NET Web application developers more choice than has previously been possible. By breaking up the logical layers of a typical Web application stack into a set of replaceable components, the Katana project enables components throughout the stack to improve at whatever rate makes sense for those components. By building all components around the simple OWIN abstraction, Katana enables frameworks and the applications built on top of them to be portable across a variety of different servers and hosts. By putting the developer in control of the stack, Katana ensures that the developer makes the ultimate choice about how lightweight or how feature-rich her Web stack should be.

Reference: http://www.asp.net/aspnet/overview/owin-and-katana/an-overview-of-project-katana

Tutorial: Getting Started with SignalR and MVC 4 (C#)

Overview

This tutorial introduces you to real-time web application development with ASP.NET SignalR and ASP.NET MVC 4. The tutorial uses the same chat application code as the SignalR Getting Started tutorial, but shows how to add it to an MVC 4 application based on the Internet template.

In this topic you will learn the following SignalR development tasks:

  • Adding the SignalR library to an MVC 4 application.
  • Creating a hub class to push content to clients.
  • Using the SignalR jQuery library in a web page to send messages and display updates from the hub.

The following screen shot shows the completed chat application running in a browser.

Chat instances

Set up the Project

Prerequisites:

  • Visual Studio 2010 SP1, Visual Studio 2012, or Visual Studio 2012 Express. If you do not have Visual Studio, see ASP.NET Downloads to get the free Visual Studio 2012 Express Development Tool.
  • For Visual Studio 2010, install ASP.NET MVC 4.

This section shows how to create an ASP.NET MVC 4 application, add the SignalR library, and create the chat application.

    1. In Visual Studio create an ASP.NET MVC 4 application, name it SignalRChat, and click OK.

      Note: In VS 2010, select .NET Framework 4 in the Framework version dropdown control. SignalR code runs on .NET Framework versions 4 and 4.5.

      Create mvc web

    2. Select the Internet Application template, clear the option to Create a unit test project, and click OK.

      Create mvc internet site

    3. Open the Tools | Library Package Manager | Package Manager Console and run the following command. This step adds to the project a set of script files and assembly references that enable SignalR functionality.

      install-package Microsoft.AspNet.SignalR

    4. In Solution Explorer expand the Scripts folder. Note that script libraries for SignalR have been added to the project.

      Library references

    5. In Solution Explorer, right-click the project, select Add | New Folder, and add a new folder named Hubs.
    6. Right-click the Hubs folder, click Add | Class, and create a new C# class named ChatHub.cs. You will use this class as a SignalR server hub that sends messages to all clients.

Note: If you use Visual Studio 2012 and have installed the ASP.NET and Web Tools 2012.2 update, you can use the new SignalR item template to create the hub class. To do that, right-click the Hubs folder, click Add | New Item, select SignalR Hub Class, and name the class ChatHub.cs.

  1. Replace the code in the ChatHub class with the following code.
    using System;
    using System.Web;
    using Microsoft.AspNet.SignalR;
    
    namespace SignalRChat
    {
        public class ChatHub : Hub
        {
            public void Send(string name, string message)
            {
                // Call the addNewMessageToPage method to update clients.
                Clients.All.addNewMessageToPage(name, message);
            }
        }
    }
  2. Open the Global.asax file for the project, and add a call to the method RouteTable.Routes.MapHubs(); as the first line of code in the Application_Start method. This code registers the default route for SignalR hubs and must be called before you register any other routes. The completed Application_Start method looks like the following example.
    protected void Application_Start()
    {
        RouteTable.Routes.MapHubs();
        AreaRegistration.RegisterAllAreas();
    
        WebApiConfig.Register(GlobalConfiguration.Configuration);
        FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
        RouteConfig.RegisterRoutes(RouteTable.Routes);
        BundleConfig.RegisterBundles(BundleTable.Bundles);
        AuthConfig.RegisterAuth();
    }
  3. Edit the HomeController class found in Controllers/HomeController.cs and add the following method to the class. This method returns the Chat view that you will create in a later step.
    public ActionResult Chat()
    {
        return View();
    }
  4. Right-click within the Chat method you just created, and click Add View to create a new view file.
  5. In the Add View dialog, make sure the check box is selected to Use a layout or master page (clear the other check boxes), and then click Add.

    Add a view

  6. Edit the new view file named Chat.cshtml. After the <h2> tag, paste the following <div> section and @section scripts code block into the page. This script enables the page to send chat messages and display messages from the server. The complete code for the chat view appears in the following code block.

    Important: When you add SignalR and other script libraries to your Visual Studio project, the Package Manager might install versions of the scripts that are more recent than the versions shown in this topic. Make sure that the script references in your code match the versions of the script libraries installed in your project.

    @{
        ViewBag.Title = "Chat";
    }
    
    <h2>Chat</h2>
    
    <div class="container">
        <input type="text" id="message" />
        <input type="button" id="sendmessage" value="Send" />
        <input type="hidden" id="displayname" />
        <ul id="discussion">
        </ul>
    </div>
    @section scripts {
        <!--Script references. -->
        <!--The jQuery library is required and is referenced by default in _Layout.cshtml. -->
        <!--Reference the SignalR library. -->
        <script src="~/Scripts/jquery.signalR-1.0.1.js"></script>
        <!--Reference the autogenerated SignalR hub script. -->
        <script src="~/signalr/hubs"></script>
        <!--SignalR script to update the chat page and send messages.--> 
        <script>
            $(function () {
                // Reference the auto-generated proxy for the hub.  
                var chat = $.connection.chatHub;
                // Create a function that the hub can call back to display messages.
                chat.client.addNewMessageToPage = function (name, message) {
                    // Add the message to the page. 
                    $('#discussion').append('<li><strong>' + htmlEncode(name) 
                        + '</strong>: ' + htmlEncode(message) + '</li>');
                };
                // Get the user name and store it to prepend to messages.
                $('#displayname').val(prompt('Enter your name:', ''));
                // Set initial focus to message input box.  
                $('#message').focus();
                // Start the connection.
                $.connection.hub.start().done(function () {
                    $('#sendmessage').click(function () {
                        // Call the Send method on the hub. 
                        chat.server.send($('#displayname').val(), $('#message').val());
                        // Clear text box and reset focus for next comment. 
                        $('#message').val('').focus();
                    });
                });
            });
            // This optional function html-encodes messages for display in the page.
            function htmlEncode(value) {
                var encodedValue = $('<div />').text(value).html();
                return encodedValue;
            }
        </script>
    }
  7. Save All for the project.

Run the Sample

  1. Press F5 to run the project in debug mode.
  2. In the browser address line, append /home/chat to the URL of the default page for the project. The Chat page loads in a browser instance and prompts for a user name.

    Enter user name

  3. Enter a user name.
  4. Copy the URL from the address line of the browser and use it to open two more browser instances. In each browser instance, enter a unique user name.
  5. In each browser instance, add a comment and click Send. The comments should display in all browser instances.

    Note: This simple chat application does not maintain the discussion context on the server. The hub broadcasts comments to all current users. Users who join the chat later will see messages added from the time they join.

  6. The following screen shot shows the chat application running in a browser.

    Chat browsers

  7. In Solution Explorer, inspect the Script Documents node for the running application. This node is visible in debug mode if you are using Internet Explorer as your browser. There is a script file named hubs that the SignalR library dynamically generates at runtime. This file manages the communication between jQuery script and server-side code. If you use a browser other than Internet Explorer, you can also access the dynamic hubs file by browsing to it directly, for example http://mywebsite/signalr/hubs.

    Generated hub script

Examine the Code

The SignalR chat application demonstrates two basic SignalR development tasks: creating a hub as the main coordination object on the server, and using the SignalR jQuery library to send and receive messages.

SignalR Hubs

In the code sample the ChatHub class derives from the Microsoft.AspNet.SignalR.Hub class. Deriving from the Hub class is a useful way to build a SignalR application. You can create public methods on your hub class and then access those methods by calling them from jQuery scripts in a web page.

In the chat code, clients call the ChatHub.Send method to send a new message. The hub in turn sends the message to all clients by calling Clients.All.addNewMessageToPage.

The Send method demonstrates several hub concepts:

  • Declare public methods on a hub so that clients can call them.
  • Use the Microsoft.AspNet.SignalR.Hub.Clients property to access all clients connected to this hub.
  • Call a jQuery function on the client (such as the addNewMessageToPage function) to update clients.
    public class ChatHub : Hub
    {
        public void Send(string name, string message)
        {
            Clients.All.addNewMessageToPage(name, message);
        }
    }

SignalR and jQuery

The Chat.cshtml view file in the code sample shows how to use the SignalR jQuery library to communicate with a SignalR hub. The essential tasks in the code are creating a reference to the auto-generated proxy for the hub, declaring a function that the server can call to push content to clients, and starting a connection to send messages to the hub.

The following code declares a proxy for a hub.

var chat = $.connection.chatHub; 

Note: In jQuery the reference to the server class and its members is in camel case. The code sample references the C# ChatHub class in jQuery as chatHub. If you want to reference the ChatHub class in jQuery with conventional Pascal casing as you would in C#, edit the ChatHub.cs class file. Add a using statement to reference the Microsoft.AspNet.SignalR.Hubs namespace. Then add the HubName attribute to the ChatHub class, for example [HubName("ChatHub")]. Finally, update your jQuery reference to the ChatHub class.
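
Putting those steps together, the edited hub class would look similar to the following (the attribute comes from the SignalR hubs API; the rest mirrors the tutorial's code):

```csharp
using Microsoft.AspNet.SignalR;
using Microsoft.AspNet.SignalR.Hubs;

namespace SignalRChat
{
    // With this attribute the jQuery proxy becomes $.connection.ChatHub
    // instead of the default camel-cased $.connection.chatHub.
    [HubName("ChatHub")]
    public class ChatHub : Hub
    {
        public void Send(string name, string message)
        {
            Clients.All.addNewMessageToPage(name, message);
        }
    }
}
```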

The following code shows how to create a callback function in the script. The hub class on the server calls this function to push content updates to each client. The optional call to the htmlEncode function shows a way to HTML encode the message content before displaying it in the page, as a way to prevent script injection.

chat.client.addNewMessageToPage = function (name, message) {
    // Add the message to the page. 
    $('#discussion').append('<li><strong>' + htmlEncode(name) 
        + '</strong>: ' + htmlEncode(message) + '</li>');
};

The following code shows how to open a connection with the hub. The code starts the connection and then passes it a function to handle the click event on the Send button in the Chat page.

Note: This approach ensures that the connection is established before the event handler executes.

$.connection.hub.start().done(function () {
    $('#sendmessage').click(function () {
        // Call the Send method on the hub. 
        chat.server.send($('#displayname').val(), $('#message').val());
        // Clear text box and reset focus for next comment. 
        $('#message').val('').focus();
    });
});

Next Steps

You learned that SignalR is a framework for building real-time web applications. You also learned several SignalR development tasks: how to add SignalR to an ASP.NET application, how to create a hub class, and how to send and receive messages from the hub.

You can make the sample application in this tutorial or other SignalR applications available over the Internet by deploying them to a hosting provider. Microsoft offers free web hosting for up to 10 web sites in a free Windows Azure trial account. For a walkthrough on how to deploy a simple SignalR application, see Publish the SignalR Getting Started Sample as a Windows Azure Web Site. For detailed information about how to deploy a Visual Studio web project to a Windows Azure Web Site, see Deploying an ASP.NET Application to a Windows Azure Web Site. (Note: The WebSocket transport is not currently supported for Windows Azure Web Sites. When WebSocket transport is not available, SignalR uses the other available transports as described in the Transports section of the Introduction to SignalR topic.)

To learn more advanced SignalR development concepts, visit the following sites for SignalR source code and resources:

Reference: http://www.asp.net/signalr/overview/getting-started/tutorial-getting-started-with-signalr-and-mvc-4

Introduction to SignalR

ASP.NET SignalR is a library for ASP.NET developers that simplifies the process of adding real-time web functionality to applications. Real-time web functionality is the ability to have server code push content to connected clients instantly as it becomes available, rather than having the server wait for a client to request new data.

SignalR can be used to add any sort of “real-time” web functionality to your ASP.NET application. While chat is often used as an example, you can do a whole lot more. Any time a user refreshes a web page to see new data, or the page implements long polling to retrieve new data, it is a candidate for using SignalR. Examples include dashboards and monitoring applications, collaborative applications (such as simultaneous editing of documents), job progress updates, and real-time forms.

SignalR also enables completely new types of web applications that require high frequency updates from the server, for example, real-time gaming. For a great example of this, see the ShootR game.

SignalR provides a simple API for creating server-to-client remote procedure calls (RPC) that call JavaScript functions in client browsers (and other client platforms) from server-side .NET code. SignalR also includes API for connection management (for instance, connect and disconnect events), and grouping connections.

Invoking methods with SignalR

SignalR handles connection management automatically, and lets you broadcast messages to all connected clients simultaneously, like a chat room. You can also send messages to specific clients. The connection between the client and server is persistent, unlike a classic HTTP connection, which is re-established for each communication.
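
As a sketch of those options, the Clients property on a hub exposes selectors for broadcasting, replying to the caller, or targeting a single connection (NotificationHub and its method names below are hypothetical; the Clients selectors are part of the SignalR hub API):

```csharp
using Microsoft.AspNet.SignalR;

public class NotificationHub : Hub
{
    public void BroadcastMessage(string message)
    {
        // To every connected client, like the chat sample.
        Clients.All.addNewMessageToPage("server", message);
    }

    public void ReplyToCaller(string message)
    {
        // Only to the client that invoked this hub method.
        Clients.Caller.addNewMessageToPage("server", message);
    }

    public void SendTo(string connectionId, string message)
    {
        // Only to one specific connection.
        Clients.Client(connectionId).addNewMessageToPage("server", message);
    }
}
```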

SignalR supports “server push” functionality, in which server code can call out to client code in the browser using Remote Procedure Calls (RPC), rather than the request-response model common on the web today.

SignalR applications can scale out to thousands of clients using Service Bus, SQL Server or Redis.
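
As a hedged sketch of one scale-out option, the Redis backplane is wired up in application startup via the Microsoft.AspNet.SignalR.Redis package; the server address and event key below are placeholders:

```csharp
using System.Web;
using System.Web.Routing;
using Microsoft.AspNet.SignalR;

public class Global : HttpApplication
{
    protected void Application_Start()
    {
        // Route all hub messages through a shared Redis backplane so that
        // they reach clients connected to any server in the farm.
        GlobalHost.DependencyResolver.UseRedis("redis.example.com", 6379, "", "SignalRChat");
        RouteTable.Routes.MapHubs();
    }
}
```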

SignalR is open-source, accessible through GitHub.

SignalR and WebSocket

SignalR uses the new WebSocket transport where available, and falls back to older transports where necessary. While you could certainly write your application using WebSocket directly, using SignalR means that a lot of the extra functionality you would need to implement will already have been done for you. Most importantly, this means that you can code your application to take advantage of WebSocket without having to worry about creating a separate code path for older clients. SignalR also shields you from having to worry about updates to WebSocket, since SignalR will continue to be updated to support changes in the underlying transport, providing your application a consistent interface across versions of WebSocket.

While you could certainly create a solution using WebSocket alone, SignalR provides all of the functionality you would need to write yourself, such as fallback to other transports and revising your application for updates to WebSocket implementations.

Transports and fallbacks

SignalR is an abstraction over some of the transports that are required to do real-time work between client and server. A SignalR connection starts as HTTP, and is then promoted to a WebSocket connection if it is available. WebSocket is the ideal transport for SignalR, since it makes the most efficient use of server memory, has the lowest latency, and has the most underlying features (such as full duplex communication between client and server), but it also has the most stringent requirements: WebSocket requires the server to be using Windows Server 2012 or Windows 8, and .NET Framework 4.5. If these requirements are not met, SignalR will attempt to use other transports to make its connections.

HTML 5 transports

These transports depend on support for HTML 5. If the client browser does not support the HTML 5 standard, older transports will be used.

  • WebSocket (if both the server and the browser indicate they can support WebSocket). WebSocket is the only transport that establishes a true persistent, two-way connection between client and server. However, WebSocket also has the most stringent requirements; it is fully supported only in the latest versions of Microsoft Internet Explorer, Google Chrome, and Mozilla Firefox, and only has a partial implementation in other browsers such as Opera and Safari.
  • Server Sent Events, also known as EventSource (if the browser supports Server Sent Events, which includes all major browsers except Internet Explorer).

Comet transports

The following transports are based on the Comet web application model, in which a browser or other client maintains a long-held HTTP request, which the server can use to push data to the client without the client specifically requesting it.

  • Forever Frame (for Internet Explorer only). Forever Frame creates a hidden IFrame which makes a request to an endpoint on the server that does not complete. The server then continually sends script to the client which is immediately executed, providing a one-way real-time connection from server to client. The connection from client to server uses a separate connection from the server-to-client connection, and like a standard HTTP request, a new connection is created for each piece of data that needs to be sent.
  • Ajax long polling. Long polling does not create a persistent connection, but instead polls the server with a request that stays open until the server responds, at which point the connection closes, and a new connection is requested immediately. This may introduce some latency while the connection resets.
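The long-polling cycle can be sketched as follows; `longPollOnce` is a hypothetical stand-in for the held-open HTTP request, with a queue simulating the server so the sketch is self-contained:

```javascript
// Hedged sketch of the Ajax long-polling pattern. A real client would hold
// an HTTP request open until the server responds; here a queue stands in
// for the server so the example can run on its own.
function longPollOnce(serverQueue) {
  // In a real transport this request stays open until data is available.
  return serverQueue.shift();
}

function longPollLoop(serverQueue, onMessage) {
  while (serverQueue.length > 0) {
    const data = longPollOnce(serverQueue); // connection closes on response...
    onMessage(data);                        // ...the message is delivered...
  }                                         // ...and a new request is issued at once
}

const received = [];
longPollLoop(['update1', 'update2'], m => received.push(m));
// received is now ['update1', 'update2']
```

The latency mentioned above comes from the gap between one response closing and the next request opening, which this loop models as the jump back to the top of the `while`.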

For more information on what transports are supported under which configurations, see Supported Platforms.

Monitoring transports

You can determine what transport your application is using by enabling logging on your hub, and opening the console window in your browser.

To enable logging for your hub’s events in a browser, add the following command to your client application:

$.connection.myHub.logging = true;

  • In Internet Explorer, open the developer tools by pressing F12, and click the Console tab.
  • In Chrome, open the console by pressing Ctrl+Shift+J.

With the console open and logging enabled, you’ll be able to see which transport is being used by SignalR.

Console in Internet Explorer showing WebSocket transport

Specifying a transport

Negotiating a transport takes a certain amount of time and client/server resources. If the client capabilities are known, then a transport can be specified when the client connection is started. The following code snippet demonstrates starting a connection using the Ajax Long Polling transport, as would be used if it was known that the client did not support any other protocol:

connection.start({ transport: 'longPolling' });

You can specify a fallback order if you want a client to try specific transports in order. The following code snippet demonstrates trying WebSocket, and failing that, going directly to Long Polling.

connection.start({ transport: ['webSockets','longPolling'] });

The string constants for specifying transports are defined as follows:

  • webSockets
  • foreverFrame
  • serverSentEvents
  • longPolling
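The fallback behavior can be sketched as a first-match search over the transport list; `negotiateTransport` and the availability map are hypothetical stand-ins for the client's actual connection attempts:

```javascript
// Hedged sketch of transport fallback: try each configured transport in
// order and settle on the first one the environment supports.
function negotiateTransport(preferred, availability) {
  for (const transport of preferred) {
    if (availability[transport]) return transport; // connection succeeded
  }
  throw new Error('No supported transport found');
}

// A client whose environment lacks WebSocket falls back to long polling.
negotiateTransport(
  ['webSockets', 'serverSentEvents', 'foreverFrame', 'longPolling'],
  { webSockets: false, serverSentEvents: false, foreverFrame: false, longPolling: true }
);
// → "longPolling"
```

Passing an explicit transport array, as in the `connection.start` snippet above, simply shortens this preference list.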

Connections and Hubs

The SignalR API contains two models for communicating between clients and servers: Persistent Connections and Hubs.

A Connection represents a simple endpoint for sending single-recipient, grouped, or broadcast messages. The Persistent Connection API (represented in .NET code by the PersistentConnection class) gives the developer direct access to the low-level communication protocol that SignalR exposes. Using the Connections communication model will be familiar to developers who have used connection-based APIs such as Windows Communication Foundation.

A Hub is a higher-level pipeline built upon the Connection API that allows your client and server to call methods on each other directly. SignalR handles the dispatching across machine boundaries as if by magic, allowing clients to call methods on the server as easily as local methods, and vice versa. Using the Hubs communication model will be familiar to developers who have used remote invocation APIs such as .NET Remoting. Using a Hub also allows you to pass strongly typed parameters to methods, enabling model binding.

Architecture diagram

The following diagram shows the relationship between Hubs, Persistent Connections, and the underlying technologies used for transports.

SignalR Architecture Diagram showing APIs, transports, and clients

How Hubs work

When server-side code calls a method on the client, a packet is sent across the active transport that contains the name and parameters of the method to be called (when an object is sent as a method parameter, it is serialized using JSON). The client then matches the method name to methods defined in client-side code. If there is a match, the client method will be executed using the deserialized parameter data.

The method call can be monitored using tools like Fiddler. The following image shows a method call sent from a SignalR server to a web browser client in the Logs pane of Fiddler. The method call is being sent from a hub called MoveShapeHub, and the method being invoked is called updateShape.

View of Fiddler log showing SignalR traffic

In this example, the hub name is identified with the H parameter; the method name is identified with the M parameter, and the data being sent to the method is identified with the A parameter. The application that generated this message is created in the High-Frequency Realtime tutorial.
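A minimal sketch of this dispatch, assuming the frame layout shown in the Fiddler log (H/M/A) and using a plain handler table in place of the real client hub proxy:

```javascript
// Hedged sketch of client-side hub dispatch: a raw invocation frame names
// the hub (H), the method (M), and carries JSON-serialized arguments (A).
const frame = '{"H":"MoveShapeHub","M":"updateShape","A":[{"x":100,"y":200}]}';
const message = JSON.parse(frame);

// The client matches the method name against its registered handlers and,
// if a match is found, invokes it with the deserialized arguments.
const clientHandlers = {
  updateShape: shape => `shape moved to (${shape.x}, ${shape.y})`
};
const result = clientHandlers[message.M](...message.A);
// result === "shape moved to (100, 200)"
```

The handler table and return value are illustrative; the real client also resolves the hub name (H) to a specific hub proxy before looking up the method.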

Choosing a communication model

Most applications should use the Hubs API. The Connections API could be used in the following circumstances:

  • The format of the actual message sent needs to be specified.
  • The developer prefers to work with a messaging and dispatching model rather than a remote invocation model.
  • An existing application that uses a messaging model is being ported to use SignalR.

Reference : http://www.asp.net/signalr/overview/getting-started/introduction-to-signalr

ASP.NET Web API

The last few years have seen the rise of Web APIs – services exposed over plain HTTP rather than through a more formal service contract (like SOAP or WS*).  Exposing services this way can make it easier to integrate functionality with a broad variety of device and client platforms, as well as create richer HTML experiences using JavaScript from within the browser.  Most large sites on the web now expose Web APIs (some examples: Facebook, Twitter, LinkedIn, Netflix, etc), and the usage of them is going to accelerate even more in the years ahead as connected devices proliferate and users demand richer user experiences.

Our new ASP.NET Web API support enables you to easily create powerful Web APIs that can be accessed from a broad range of clients (ranging from browsers using JavaScript, to native apps on any mobile/client platform).  It provides the following support:

  • Modern HTTP programming model: Directly access and manipulate HTTP requests and responses in your Web APIs using a clean, strongly typed HTTP object model.  In addition to supporting this HTTP programming model on the server, we also support the same programming model on the client with the new HttpClient API that can be used to call Web APIs from any .NET application.
  • Content negotiation: Web API has built-in support for content negotiation – which enables the client and server to work together to determine the right format for data being returned from an API.  We provide default support for JSON, XML and Form URL-encoded formats, and you can extend this support by adding your own formatters, or even replace the default content negotiation strategy with one of your own.
  • Query composition: Web API enables you to easily support querying via the OData URL conventions.  When you return a type of IQueryable from your Web API, the framework will automatically provide OData query support over it – making it easy to implement paging and sorting.
  • Model binding and validation: Model binders provide an easy way to extract data from various parts of an HTTP request and convert those message parts into .NET objects which can be used by Web API actions.  Web API supports the same model binding and validation infrastructure that ASP.NET MVC supports today.
  • Routes: Web APIs support the full set of routing capabilities supported within ASP.NET MVC and ASP.NET today, including route parameters and constraints. Web API also provides smart conventions by default, enabling you to easily create classes that implement Web APIs without having to apply attributes to your classes or methods.  Web API configuration is accomplished solely through code – leaving your config files clean.
  • Filters: Web API enables you to easily use and create filters (for example, [Authorize]) that enable you to encapsulate and apply cross-cutting behavior.
  • Improved testability: Rather than setting HTTP details in static context objects, Web API actions can now work with instances of HttpRequestMessage and HttpResponseMessage – two new HTTP objects that (among other things) make testing much easier. As an example, you can unit test your Web APIs without having to use a Mocking framework.
  • IoC Support: Web API supports the service locator pattern implemented by ASP.NET MVC, which enables you to resolve dependencies for many different facilities.  You can easily integrate this with an IoC container or dependency injection framework to enable clean resolution of dependencies.
  • Flexible Hosting: Web APIs can be hosted within any type of ASP.NET application (including both ASP.NET MVC and ASP.NET Web Forms based applications).  We’ve also designed the Web API support so that you can also optionally host/expose them within your own process if you don’t want to use ASP.NET/IIS to do so.  This gives you maximum flexibility in how and where you use it.
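The content-negotiation step described above can be sketched as follows (a simplified model that ignores quality factors and wildcards; `negotiate` is a hypothetical helper, not a Web API type):

```javascript
// Hedged sketch of content negotiation: walk the client's Accept header in
// order and pick the first media type the server has a formatter for.
function negotiate(acceptHeader, supportedTypes) {
  const requested = acceptHeader
    .split(',')
    .map(part => part.trim().split(';')[0]); // drop q-values for simplicity
  for (const type of requested) {
    if (supportedTypes.includes(type)) return type;
  }
  return supportedTypes[0]; // fall back to the server's default formatter
}

negotiate('application/xml, application/json', ['application/json', 'application/xml']);
// → "application/xml"
negotiate('text/html', ['application/json', 'application/xml']);
// → "application/json" (the default formatter)
```

The real negotiator also weighs quality factors (`;q=0.8`), wildcards (`*/*`), and the type being serialized, which is why the framework lets you replace the strategy wholesale.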

Reference : http://weblogs.asp.net/scottgu/archive/2012/02/23/asp-net-web-api-part-1.aspx

What's new in ASP.NET MVC 4?

 

Installation Notes

ASP.NET MVC 4 for Visual Studio 2010 can be installed from the ASP.NET MVC 4 home page using the Web Platform Installer.

We recommend uninstalling any previously installed previews of ASP.NET MVC 4 prior to installing ASP.NET MVC 4. You can upgrade the ASP.NET MVC 4 Beta and Release Candidate to ASP.NET MVC 4 without uninstalling.

This release is not compatible with any preview releases of .NET Framework 4.5. You must separately upgrade any installed preview releases of .NET Framework 4.5 to the final version before installing ASP.NET MVC 4.

ASP.NET MVC 4 can be installed and run side-by-side with ASP.NET MVC 3.

Documentation

Documentation for ASP.NET MVC is available on the MSDN website at the following URL:

http://go.microsoft.com/fwlink/?LinkID=243043

Tutorials and other information about ASP.NET MVC are available on the MVC 4 page of the ASP.NET website (http://www.asp.net/mvc/mvc4).

Support

ASP.NET MVC 4 is fully supported. If you have questions about working with this release, you can post them to the ASP.NET MVC forum (http://forums.asp.net/1146.aspx), where members of the ASP.NET community are frequently able to provide informal support.

Software Requirements

The ASP.NET MVC 4 components for Visual Studio require PowerShell 2.0 and either Visual Studio 2010 with Service Pack 1 or Visual Web Developer Express 2010 with Service Pack 1.

New Features in ASP.NET MVC 4

This section describes features that have been introduced in the ASP.NET MVC 4 release.

ASP.NET Web API

ASP.NET MVC 4 includes ASP.NET Web API, a new framework for creating HTTP services that can reach a broad range of clients including browsers and mobile devices. ASP.NET Web API is also an ideal platform for building RESTful services.

ASP.NET Web API includes support for the following features:

  • Modern HTTP programming model: Directly access and manipulate HTTP requests and responses in your Web APIs using a new, strongly typed HTTP object model. The same programming model and HTTP pipeline is symmetrically available on the client through the new HttpClient type.
  • Full support for routes: ASP.NET Web API supports the full set of route capabilities of ASP.NET Routing, including route parameters and constraints. Additionally, use simple conventions to map actions to HTTP methods.
  • Content negotiation: The client and server can work together to determine the right format for data being returned from a web API. ASP.NET Web API provides default support for XML, JSON, and Form URL-encoded formats and you can extend this support by adding your own formatters, or even replace the default content negotiation strategy.
  • Model binding and validation: Model binders provide an easy way to extract data from various parts of an HTTP request and convert those message parts into .NET objects which can be used by the Web API actions. Validation is also performed on action parameters based on data annotations.
  • Filters: ASP.NET Web API supports filters, including well-known filters such as the [Authorize] attribute. You can author and plug in your own filters for actions, authorization and exception handling.
  • Query composition: Use the [Queryable] filter attribute on an action that returns IQueryable to enable support for querying your web API via the OData query conventions.
  • Improved testability: Rather than setting HTTP details in static context objects, web API actions work with instances of HttpRequestMessage and HttpResponseMessage. Create a unit test project along with your Web API project to get started quickly writing unit tests for your Web API functionality.
  • Code-based configuration: ASP.NET Web API configuration is accomplished solely through code, leaving your config files clean. Use the provided service locator pattern to configure extensibility points.
  • Improved support for Inversion of Control (IoC) containers: ASP.NET Web API provides great support for IoC containers through an improved dependency resolver abstraction.
  • Self-host: Web APIs can be hosted in your own process in addition to IIS while still using the full power of routes and other features of Web API.
  • Create custom help and test pages: You now can easily build custom help and test pages for your web APIs by using the new IApiExplorer service to get a complete runtime description of your web APIs.
  • Monitoring and diagnostics: ASP.NET Web API now provides lightweight tracing infrastructure that makes it easy to integrate with existing logging solutions such as System.Diagnostics, ETW and third-party logging frameworks. You can enable tracing by providing an ITraceWriter implementation and adding it to your web API configuration.
  • Link generation: Use the ASP.NET Web API UrlHelper to generate links to related resources in the same application.
  • Web API project template: Select the new Web API project from the New MVC 4 Project wizard to quickly get up and running with ASP.NET Web API.
  • Scaffolding: Use the Add Controller dialog to quickly scaffold a web API controller based on an Entity Framework based model type.

For more details on ASP.NET Web API please visit http://www.asp.net/web-api.
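The [Queryable]-style query composition described above can be pictured with a small sketch; `applyODataQuery` is a hypothetical helper that mimics how $orderby, $skip, and $top compose over a returned result set:

```javascript
// Hedged sketch of OData-style query composition: sort first, then page,
// the way $orderby, $skip and $top are applied to an IQueryable result.
function applyODataQuery(items, { orderby, skip = 0, top = items.length }) {
  const sorted = [...items].sort((a, b) =>
    a[orderby] < b[orderby] ? -1 : a[orderby] > b[orderby] ? 1 : 0);
  return sorted.slice(skip, skip + top);
}

const movies = [{ id: 3 }, { id: 1 }, { id: 2 }];
applyODataQuery(movies, { orderby: 'id', skip: 1, top: 1 });
// → [{ id: 2 }]
```

In the real framework these options arrive as URL query parameters ($orderby=id&$skip=1&$top=1) and are translated into the IQueryable expression tree rather than applied to an in-memory array.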

Enhancements to Default Project Templates

The template that is used to create new ASP.NET MVC 4 projects has been updated to create a more modern-looking website:

In addition to cosmetic improvements, there’s improved functionality in the new template. The template employs a technique called adaptive rendering to look good in both desktop browsers and mobile browsers without any customization.

To see adaptive rendering in action, you can use a mobile emulator or just try resizing the desktop browser window to be smaller. When the browser window gets small enough, the layout of the page will change.

Mobile Project Template

If you’re starting a new project and want to create a site specifically for mobile and tablet browsers, you can use the new Mobile Application project template. This is based on jQuery Mobile, an open-source library for building touch-optimized UI:

This template contains the same application structure as the Internet Application template (and the controller code is virtually identical), but it’s styled using jQuery Mobile to look good and behave well on touch-based mobile devices. To learn more about how to structure and style mobile UI, see the jQuery Mobile project website.

If you already have a desktop-oriented site that you want to add mobile-optimized views to, or if you want to create a single site that serves differently styled views to desktop and mobile browsers, you can use the new Display Modes feature. (See the next section.)

Display Modes

The new Display Modes feature lets an application select views depending on the browser that’s making the request. For example, if a desktop browser requests the Home page, the application might use the Views\Home\Index.cshtml template. If a mobile browser requests the Home page, the application might return the Views\Home\Index.mobile.cshtml template.

Layouts and partials can also be overridden for particular browser types. For example:

  • If your Views\Shared folder contains both the _Layout.cshtml and _Layout.mobile.cshtml templates, by default the application will use _Layout.mobile.cshtml during requests from mobile browsers and _Layout.cshtml during other requests.
  • If a folder contains both _MyPartial.cshtml and _MyPartial.mobile.cshtml, the instruction @Html.Partial("_MyPartial") will render _MyPartial.mobile.cshtml during requests from mobile browsers, and _MyPartial.cshtml during other requests.
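The lookup rule in both bullets can be sketched as a first-match search; `resolveView` and the file list are illustrative, not actual framework APIs:

```javascript
// Hedged sketch of the Display Modes lookup: try each active display-mode
// suffix in order and return the first view file that exists.
function resolveView(viewName, activeSuffixes, existingFiles) {
  for (const suffix of activeSuffixes) {
    const candidate = suffix
      ? `${viewName}.${suffix}.cshtml`
      : `${viewName}.cshtml`;
    if (existingFiles.includes(candidate)) return candidate;
  }
  return null;
}

const files = ['_Layout.cshtml', '_Layout.mobile.cshtml'];
// A mobile browser activates the "mobile" mode before the default mode.
resolveView('_Layout', ['mobile', ''], files); // → "_Layout.mobile.cshtml"
// A desktop browser only has the default mode active.
resolveView('_Layout', [''], files);           // → "_Layout.cshtml"
```

Registering a new display mode, as shown with "iPhone" below, effectively inserts another suffix at the front of this search order.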

If you want to create more specific views, layouts, or partial views for other devices, you can register a new DefaultDisplayMode instance to specify which name to search for when a request satisfies particular conditions. For example, you could add the following code to the Application_Start method in the Global.asax file to register the string “iPhone” as a display mode that applies when the Apple iPhone browser makes a request:

DisplayModeProvider.Instance.Modes.Insert(0, new DefaultDisplayMode("iPhone")
{
    ContextCondition = (context => context.GetOverriddenUserAgent()
        .IndexOf("iPhone", StringComparison.OrdinalIgnoreCase) >= 0)
});

After this code runs, when an Apple iPhone browser makes a request, your application will use the Views\Shared\_Layout.iPhone.cshtml layout (if it exists). For more information on Display Modes, see ASP.NET MVC 4 Mobile Features. Applications using DisplayModeProvider should install the Fixed DisplayModes NuGet package. The ASP.NET Fall 2012 Update includes the Fixed DisplayModes NuGet package in the new project templates. See ASP.NET MVC 4 Mobile Caching Bug Fixed for details on the fix.

jQuery Mobile and Mobile Features

For information on building Mobile applications with ASP.NET MVC 4 using jQuery Mobile, see the tutorial  ASP.NET MVC 4 Mobile Features.

    Task Support for Asynchronous Controllers

    You can now write asynchronous action methods as single methods that return an object of type Task or Task<ActionResult>.

    For more information see Using Asynchronous Methods in ASP.NET MVC 4.

    Azure SDK

    ASP.NET MVC 4 supports the 1.6 and newer releases of the Windows Azure SDK.

    Database Migrations

    ASP.NET MVC 4 projects now include Entity Framework 5. One of the great features in Entity Framework 5 is support for database migrations. This feature enables you to easily evolve your database schema using a code-focused migration while preserving the data in the database. For more information on database migrations, see Adding a New Field to the Movie Model and Table in the Introduction to ASP.NET MVC 4 tutorial.

    Empty Project Template

    The MVC Empty project template is now truly empty so that you can start from a completely clean slate. The earlier version of the Empty project template has been renamed to Basic.

    Add Controller to any project folder

    You can now right click and select Add Controller from any folder in your MVC project. This gives you more flexibility to organize your controllers however you want, including keeping your MVC and Web API controllers in separate folders.

    Bundling and Minification

    The bundling and minification framework enables you to reduce the number of HTTP requests that a Web page needs to make by combining individual files into a single, bundled file for scripts and CSS. It can then reduce the overall size of those requests by minifying the contents of the bundle. Minification can include activities such as eliminating whitespace, shortening variable names, and even collapsing CSS selectors based on their semantics. Bundles are declared and configured in code and are easily referenced in views via helper methods which can generate either a single link to the bundle or, when debugging, multiple links to the individual contents of the bundle. For more information see Bundling and Minification.
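Conceptually, bundling and minification reduce to two steps, sketched below as a toy model (real minifiers parse the code rather than using regular expressions; `bundle` and `minify` are illustrative, not framework APIs):

```javascript
// Hedged sketch: bundling concatenates files into one response body,
// and minification then shrinks it (here only comments and whitespace).
function bundle(files) {
  return files.join('\n');
}

function minify(source) {
  return source
    .replace(/\/\/[^\n]*/g, '') // strip line comments (toy rule)
    .replace(/\s+/g, ' ')       // collapse runs of whitespace
    .trim();
}

const bundled = bundle(['var a = 1;', 'var b = 2; // second file']);
minify(bundled); // → "var a = 1; var b = 2;"
```

The payoff is one HTTP request instead of many, and a smaller body for that one request, which is exactly what the helper methods switch between depending on whether you are debugging.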

    Enabling Logins from Facebook and Other Sites Using OAuth and OpenID

    The ASP.NET MVC 4 Internet Application project template now includes support for OAuth and OpenID login using the DotNetOpenAuth library. For information on configuring an OAuth or OpenID provider, see OAuth/OpenID Support for WebForms, MVC and WebPages and the OAuth and OpenID feature documentation in ASP.NET Web Pages.

    Upgrading an ASP.NET MVC 3 Project to ASP.NET MVC 4

    ASP.NET MVC 4 can be installed side by side with ASP.NET MVC 3 on the same computer, which gives you flexibility in choosing when to upgrade an ASP.NET MVC 3 application to ASP.NET MVC 4.

    The simplest way to upgrade is to create a new ASP.NET MVC 4 project and copy all the views, controllers, code, and content files from the existing MVC 3 project to the new project, and then to update the assembly references in the new project to match any non-MVC template included assemblies you are using. If you have made changes to the Web.config file in the MVC 3 project, you must also merge those changes into the Web.config file in the MVC 4 project.

    To manually upgrade an existing ASP.NET MVC 3 application to version 4, do the following:

      1. In all Web.config files in the project (there is one in the root of the project, one in the Views folder, and one in the Views folder for each area in your project), replace every instance of the following text (note: System.Web.WebPages, Version=1.0.0.0 is not found in projects created with Visual Studio 2012):
    System.Web.Mvc, Version=3.0.0.0
    System.Web.WebPages, Version=1.0.0.0
    System.Web.Helpers, Version=1.0.0.0
    System.Web.WebPages.Razor, Version=1.0.0.0

    with the following corresponding text:

    System.Web.Mvc, Version=4.0.0.0
    System.Web.WebPages, Version=2.0.0.0
    System.Web.Helpers, Version=2.0.0.0
    System.Web.WebPages.Razor, Version=2.0.0.0
      2. In the root Web.config file, update the webPages:Version element to "2.0.0.0" and add a new PreserveLoginUrl key that has the value "true":
    <appSettings>
      <add key="webpages:Version" value="2.0.0.0" />
      <add key="PreserveLoginUrl" value="true" />
    </appSettings>
    3. In Solution Explorer, right-click on References and select Manage NuGet Packages. In the left pane, select Online\NuGet official package source, then update the following:
      • ASP.NET MVC 4
      • (Optional) jQuery, jQuery Validation and jQuery UI
      • (Optional) Entity Framework
      • (Optional) Modernizr
    4. In Solution Explorer, right-click the project name and then select Unload Project. Then right-click the name again and select Edit ProjectName.csproj.
    5. Locate the ProjectTypeGuids element and replace {E53F8FEA-EAE0-44A6-8774-FFD645390401} with {E3E379DF-F4C6-4180-9B81-6769533ABE47}.
    6. Save the changes, close the project (.csproj) file you were editing, right-click the project, and then select Reload Project.
    7. If the project references any third-party libraries that are compiled using previous versions of ASP.NET MVC, open the root Web.config file and add the following three bindingRedirect elements under the configuration section:
      <configuration>
        <!--... elements deleted for clarity ...-->
       
        <runtime>
          <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
            <dependentAssembly>
              <assemblyIdentity name="System.Web.Helpers" 
                   publicKeyToken="31bf3856ad364e35" />
              <bindingRedirect oldVersion="1.0.0.0" newVersion="2.0.0.0"/>
            </dependentAssembly>
            <dependentAssembly>
              <assemblyIdentity name="System.Web.Mvc" 
                   publicKeyToken="31bf3856ad364e35" />
              <bindingRedirect oldVersion="1.0.0.0-3.0.0.0" newVersion="4.0.0.0"/>
            </dependentAssembly>
            <dependentAssembly>
              <assemblyIdentity name="System.Web.WebPages" 
                   publicKeyToken="31bf3856ad364e35" />
              <bindingRedirect oldVersion="1.0.0.0" newVersion="2.0.0.0"/>
            </dependentAssembly>
          </assemblyBinding>
        </runtime>
      </configuration>

    Changes from ASP.NET MVC 4 Release Candidate

    The release notes for ASP.NET MVC 4 Release Candidate can be found here:

    The major changes from ASP.NET MVC 4 Release Candidate in this release are summarized below:

    • Per controller configuration: ASP.NET Web API controllers can be attributed with a custom attribute that implements IControllerConfiguration to set up their own formatters, action selector and parameter binders. The HttpControllerConfigurationAttribute has been removed.
    • Per route message handlers: You can now specify the final message handler in the request chain for a given route. This enables support for ride-along frameworks to use routing to dispatch to their own (non-IHttpController) endpoints.
    • Progress notifications: The ProgressMessageHandler generates progress notification for both request entities being uploaded and response entities being downloaded. Using this handler it is possible to keep track of how far you are uploading a request body or downloading a response body.
    • Push content: The PushStreamContent class enables scenarios where a data producer wants to write directly to the request or response (either synchronously or asynchronously) using a stream. When the PushStreamContent is ready to accept data it calls out to an action delegate with the output stream. The developer can then write to the stream for as long as necessary and close the stream when writing has completed. The PushStreamContent detects the closing of the stream and completes the underlying asynchronous Task for writing out the content.
    • Creating error responses: Use the HttpError type to consistently represent error information, such as validation errors and exceptions, while still honoring the IncludeErrorDetailPolicy. Use the new CreateErrorResponse extension methods to easily create error responses with HttpError as content. The HttpError content is fully content negotiated.
    • MediaRangeMapping removed: Media type ranges are now handled by the default content negotiator.
    • Default parameter binding for simple type parameters is now [FromUri]: In previous releases of ASP.NET Web API the default parameter binding for simple type parameters used model binding. The default parameter binding for simple type parameters is now [FromUri].
    • Action selection honors required parameters: Action selection in ASP.NET Web API will now only select an action if all required parameters that come from the URI are provided. A parameter can be specified as optional by providing a default value for the argument in the action method signature.
    • Customize HTTP parameter bindings: Use the ParameterBindingAttribute to customize the parameter binding for a specific action parameter or use the ParameterBindingRules on theHttpConfiguration to customize parameter bindings more broadly.
    • MediaTypeFormatter improvements: Formatters now have access to the full HttpContent instance.
    • Host buffering policy selection: Implement and configure the IHostBufferPolicySelector service in ASP.NET Web API to enable hosts to determine the policy for when buffering is to be used.
    • Access client certificates in a host agnostic manner: Use the GetClientCertificate extension method to get the supplied client certificate from the request message.
    • Content negotiation extensibility: Customize content negotiation by deriving from the DefaultContentNegotiator and overriding any aspect of content negotiation that you would like.
    • Support for returning 406 Not Acceptable responses: You can now return 406 Not Acceptable responses in ASP.NET Web API when a suitable formatter is not found by creating a DefaultContentNegotiator with the excludeMatchOnTypeOnly parameter set to true.
    • Read form data as NameValueCollection or JToken: You can read form data in the URI query string or in the request body as a NameValueCollection using the ParseQueryString and ReadAsFormDataAsync extension methods respectively. Similarly, you can read form data in the URI query string or in the request body as a JToken using the TryReadQueryAsJson and ReadAsAsync extension methods respectively.
    • Multipart improvements: It is now possible to write a MultipartStreamProvider that is completely tailored to the type of MIME multipart data that it can read and present the result in the optimal way to the user. You can also hook a post processing step on the MultipartStreamProvider that allows the implementation to do whatever post processing it wants on the MIME multipart body parts. For example, the MultipartFormDataStreamProvider implementation reads the HTML form data parts and adds them to a NameValueCollection so they are easy to get at from the caller.
    • Link generation improvements: The UrlHelper no longer depends on HttpControllerContext. You can now access the UrlHelper from any context where the HttpRequestMessage is available.
    • Message handler execution order change: Message handlers are now executed in the order that they are configured instead of in reverse order.
    • Helper for wiring up message handlers: The new HttpClientFactory can wire up DelegatingHandlers and create an HttpClient with the desired pipeline ready to go. It also provides functionality for wiring up with alternative inner handlers (the default is HttpClientHandler) as well as doing the wiring up when using HttpMessageInvoker or another DelegatingHandler instead of HttpClient as the top-invoker.
    •  Support for CDNs in ASP.NET Web Optimization: ASP.NET Web Optimization now provides support for CDN alternate paths enabling you to specify for each bundle an additional URL which points to that same resource on a content delivery network. Supporting CDNs enables you to get your script and style bundles geographically closer to the end consumers of your Web applications.
    • ASP.NET Web API routes and configuration moved to WebApiConfig.Register static method that can be resused in test code. ASP.NET Web API routes previously were added inRouteConfig.RegisterRoutes along with the standard MVC routes. The default ASP.NET Web API routes and configuration are now handled in a separate WebApiConfig.Register method to facilitate testing.

    Known Issues and Breaking Changes

    • The RC and RTM versions of ASP.NET MVC 4 incorrectly returned cached desktop views when mobile views should have been returned.
    • Breaking changes in the Razor View Engine. The following types were removed from System.Web.Mvc.Razor:
      • ModelSpan
      • MvcVBRazorCodeGenerator
      • MvcCSharpRazorCodeGenerator
      • MvcVBRazorCodeParser

      The following methods were also removed:

      • MvcCSharpRazorCodeParser.ParseInheritsStatement(System.Web.Razor.Parser.CodeBlockInfo)
      • MvcWebPageRazorHost.DecorateCodeGenerator(System.Web.Razor.Generator.RazorCodeGenerator)
      • MvcVBRazorCodeParser.ParseInheritsStatement(System.Web.Razor.Parser.CodeBlockInfo)
    • When WebMatrix.WebData.dll is included in the /bin directory of an ASP.NET MVC 4 app, it takes over the URL for forms authentication. Adding the WebMatrix.WebData.dll assembly to your application (for example, by selecting “ASP.NET Web Pages with Razor Syntax” when using the Add Deployable Dependencies dialog) will override the authentication login redirect to /account/logon rather than /account/login as expected by the default ASP.NET MVC Account Controller. To prevent this behavior and use the URL specified already in the authentication section of web.config, you can add an appSetting called PreserveLoginUrl and set it to true:
      <appSettings>
          <add key="PreserveLoginUrl" value="true"/>
      </appSettings>
    • The NuGet package manager fails to install when attempting to install ASP.NET MVC 4 for side by side installations of Visual Studio 2010 and Visual Web Developer 2010. To run Visual Studio 2010 and Visual Web Developer 2010 side by side with ASP.NET MVC 4 you must install ASP.NET MVC 4 after both versions of Visual Studio have already been installed.
    • Uninstalling ASP.NET MVC 4 fails if prerequisites have already been uninstalled. To cleanly uninstall ASP.NET MVC 4 you must uninstall ASP.NET MVC 4 prior to uninstalling Visual Studio.
    • Installing ASP.NET MVC 4 breaks ASP.NET MVC 3 RTM applications. ASP.NET MVC 3 applications that were created with the RTM release (not with the ASP.NET MVC 3 Tools Update release) require the following changes in order to work side-by-side with ASP.NET MVC 4. Building the project without making these updates results in compilation errors. Required updates:
        1. In the root Web.config file, add a new <appSettings> entry with the key webPages:Version and the value 1.0.0.0.
      <appSettings>
          <add key="webpages:Version" value="1.0.0.0"/>
          <add key="ClientValidationEnabled" value="true"/>
          <add key="UnobtrusiveJavaScriptEnabled" value="true"/>
      </appSettings>
      2. In Solution Explorer, right-click the project name and then select Unload Project. Then right-click the name again and select Edit ProjectName.csproj.
      3. Locate the following assembly references:
        <Reference Include="System.Web.WebPages"/> 
        <Reference Include="System.Web.Helpers" />

        Replace them with the following:

        <Reference Include="System.Web.WebPages, Version=1.0.0.0,
        Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL "/> 
        <Reference Include="System.Web.Helpers, Version=1.0.0.0,
        Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL" />
      4. Save the changes, close the project (.csproj) file you were editing, and then right-click the project and select Reload.
    • Changing an ASP.NET MVC 4 project to target 4.0 from 4.5 does not update the EntityFramework assembly reference: If you change an ASP.NET MVC 4 project to target 4.0 after targeting 4.5, the reference to the EntityFramework assembly will still point to the 4.5 version. To fix this issue, uninstall and reinstall the EntityFramework NuGet package.
    • 403 Forbidden when running an ASP.NET MVC 4 application on Azure after changing to target 4.0 from 4.5: If you change an ASP.NET MVC 4 project to target 4.0 after targeting 4.5 and then deploy to Azure, you may see a 403 Forbidden error at runtime. To work around this issue, add the following to your web.config: <modules runAllManagedModulesForAllRequests="true" />
    • Visual Studio 2012 crashes when you type a ‘\’ in a string literal in a Razor file. To work around the issue enter the closing quote of the string literal first.
    • Browsing to “Account/Manage” in the Internet template results in a runtime error for CHS, TRK and CHT languages. To fix the issue, modify the page to separate out @User.Identity.Name by putting it as the only content within the <strong> tag.
    • Google and LinkedIn providers are not supported within Azure Web Sites. Use alternative authentication providers when deploying to Azure Web Sites.
    • When using UriPathExtensionMapping with IIS 8 Express/IIS, you will receive 404 Not Found errors when you try to use the extension. The static file handler interferes with requests to web APIs that use UriPathExtensionMappings. Set runAllManagedModulesForAllRequests=true in web.config to work around the issue.
    • Controller.Execute method is no longer called. All MVC controllers are now always executed asynchronously.

Reference : http://www.asp.net/whitepapers/mvc4-release-notes

Check whether an image URL is valid

public static string CheckImageUrlExist(string imageUrl)
{
  try
  {
    HttpWebRequest imageUrlRequest =
                   (HttpWebRequest)WebRequest.Create(imageUrl);

    // A HEAD request returns only the headers, so we can verify the
    // image exists without downloading its bytes.
    imageUrlRequest.Method = "HEAD";

    // Dispose the response so the underlying connection is released.
    using (imageUrlRequest.GetResponse())
    {
      return imageUrl;
    }
  }
  catch (UriFormatException)
  {
    return NoImagePath;
  }
  catch (WebException)
  {
    return NoImagePath;
  }
}

public static string NoImagePath { get { return "/Content/Images/noimage.png"; } }

Build a RESTful API architecture within an ASP.NET MVC 3 application.

ASP.NET MVC 3, with its glorious URL structures and ease of working with and controlling HTTP request/response data, is primed for building REST-type API services. But how do you accomplish that, and what does the whole RESTful thing really mean?

Building a full blown API (of any type) involves a lot of architecture components, from data validation to security and beyond. This post does not attempt to address all of that. It focuses on the initial structure of a RESTful service within an ASP.NET MVC 3 application that works with JSON data in and out. We will look at how we can use the route engine, the HTTP verb attributes and a lean controller design to provide a starting point for a REST API.

 

Continue reading

When to use ViewBag, ViewData, or TempData in ASP.NET MVC 3 applications

“When should I use a ViewBag vs. ViewData vs. TempData objects?” — a frequent question in online forums, during presentations, and at events. There are enough similarities and differences between these objects that warrant a closer look to see exactly how you can use each of these objects while developing MVC applications.

All three objects are available as properties of both the view and controller. As a rule of thumb, you’ll use the ViewData, ViewBag, and TempData objects for the purposes of transporting small amounts of data from and to specific locations (e.g., controller to view or between views). Both the ViewData and ViewBag objects work well in the following scenarios:

  • Incorporating dropdown lists of lookup data into an entity
  • Components like a shopping cart
  • Widgets like a user profile widget
  • Small amounts of aggregate data

While the TempData object works well in one basic scenario:

  • Passing data between the current and next HTTP requests
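The one-request lifetime that makes TempData different can be sketched with a toy model (purely illustrative; the real TempDataDictionary lives in System.Web.Mvc, is backed by session state by default, and also removes entries as they are read):

```javascript
// Toy model of TempData's core rule: a value written during one request
// becomes readable during the next request, then disappears.
// This is NOT the real API -- just the lifetime rule, made concrete.
class TempDataModel {
  constructor() {
    this.current = {}; // values readable during this request
    this.next = {};    // values staged for the next request
  }
  get(key) { return this.current[key]; }
  set(key, value) { this.next[key] = value; }
  // The framework would call this as each new request begins.
  beginRequest() {
    this.current = this.next;
    this.next = {};
  }
}

const tempData = new TempDataModel();
tempData.set('status', 'Profile saved!'); // e.g. set in a POST action
tempData.beginRequest();                  // redirect: next request starts
console.log(tempData.get('status'));      // -> "Profile saved!"
tempData.beginRequest();                  // one request later...
console.log(tempData.get('status'));      // -> undefined
```

In a real controller you would write TempData["status"] before a RedirectToAction and read it in the action you redirected to; ViewData and ViewBag, by contrast, never outlive the single request that populated them.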

Creating an MVC 3 Application with Razor and Unobtrusive JavaScript

The User List sample web application demonstrates how simple it is to create ASP.NET MVC 3 applications using the Razor view engine. The sample application shows how to use the new Razor view engine with ASP.NET MVC version 3 and Visual Studio 2010 to create a fictional User List website that includes functionality such as creating, displaying, editing, and deleting users.

This tutorial describes the steps that were taken in order to build the User List sample ASP.NET MVC 3 application. A Visual Studio project with C# and VB source code is available to accompany this topic: Download. If you have questions about this tutorial, please post them to the MVC forum.

Simple .NET MVC 3 web application with integrated Facebook OAuth API

Introduction

Before creating a .NET MVC application, we have to register the domain name that will be used for the web site at the Facebook development site: http://developers.facebook.com/setup/. After successful registration, we will have a Facebook APIKey and Facebook Secret.

Now let’s create a simple ASP.NET MVC application in VS:

1_1.png

I will use the Facebook API button in this sample to show an alternative log in option to the user. Let’s change the _LogOnPartial.cshtml file in such a way:

@if(Request.IsAuthenticated) {
    <text>Welcome <strong>@User.Identity.Name</strong>!
    [ @Html.ActionLink("Log Off", "LogOff", "Account") ]</text>
}
else {
    <fb:login-button perms="email,user_checkins" onlogin="afterFacebookConnect();" 
       autologoutlink="false" ></fb:login-button>
    <div id="fb-root" style="display:inline; margin-left:20px;"></div>
    @:[ @Html.ActionLink("Log On", "LogOn", "Account") ]
}
<script language="javascript" type="text/javascript">
    window.fbAsyncInit = function () {
        FB.init({ appId: ' -- YOUR REAL APPLICATION ID SHOULD BE HERE --', 
                  status: true, cookie: false, xfbml: true });
    };
    function afterFacebookConnect() {
        FB.getLoginStatus(function (response) {
            if (response.session) {
                window.location = "../Account/FacebookLogin?token=" + 
                       response.session.access_token;
            } else {
                // user clicked Cancel
            }
        });
    };
    $(document).ready(function () {
        if (document.getElementById('fb-root') != undefined) {
            var e = document.createElement('script');
            e.type = 'text/javascript';
            e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';
            e.async = true;
            document.getElementById('fb-root').appendChild(e);
        }
    });
</script>

The following elements were added to the control:

  • Facebook login button (fb:login-button).
  • Container which will contain all Facebook scripts (div id="fb-root").
  • FB initialization script (FB.fbAsyncInit). You have to replace the sample appId value with the real one received when registering your app on the Facebook development site.
  • afterFacebookConnect – script which will be called after the user closes the Facebook login dialog window (after successful or failed login).
  • Script for loading Facebook JavaScript libraries (e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';).

After successful login, we will have the access_token value, and now we can load the user’s detailed info, store this info (if we need to), and authenticate the user. To do this, we will redirect the user to the Account.FacebookLogin action and pass the access_token value as a parameter to this action. So at this stage, we will implement the “FacebookLogin” action. The created action will look like this:

using System.Web.Mvc;
using System.Web.Security;
using MVCOAuth.Models;
using System.Net;
using Newtonsoft.Json.Linq;
using System;

namespace MVCOAuth.Controllers
{
    public class AccountController : Controller
    {
        [HttpGet]
        public ActionResult FacebookLogin(string token)
        {
            WebClient client = new WebClient();
            string JsonResult = client.DownloadString(string.Concat(
                   "https://graph.facebook.com/me?access_token=", token));
            // Json.Net is really helpful if you have to deal
            // with Json from .Net http://json.codeplex.com/
            JObject jsonUserInfo = JObject.Parse(JsonResult);
            // you can get more user's info here. Please refer to:
            //     http://developers.facebook.com/docs/reference/api/user/
            string username = jsonUserInfo.Value<string>("username");
            string email = jsonUserInfo.Value<string>("email");
            string locale = jsonUserInfo.Value<string>("locale");
            long facebookUserId = jsonUserInfo.Value<long>("id"); // Facebook IDs can exceed Int32.MaxValue

            // store user's information here...
            FormsAuthentication.SetAuthCookie(username, true);
            return RedirectToAction("Index", "Home");
        }
    }
}

And that’s it! We have integrated alternative Facebook authentication on the MVC site. Before login:

R_1.png

After successful Facebook authentication:

R_2.png

Hope this will be helpful for someone!

 

 

Globalization, Internationalization and Localization in ASP.NET MVC 3, JavaScript and jQuery – Part 1

Posted 2011-05-25 07:56 PM in ASP.NET | ASP.NET MVC | Internationalization | Javascript.

There are several books worth of information to be said about Internationalization (i18n) out there, so I can’t solve it all in a blog post. Even 9 pages of blog posts. I like to call it Iñtërnâtiônàlizætiøn, actually.

There’s a couple of basic things to understand though, before you create a multilingual ASP.NET application. Let’s agree on some basic definitions as these terms are often used interchangeably.

  • Internationalization (i18n) – Making your application able to support a range of languages and locales
  • Localization (L10n) – Making your application support a specific language/locale.
  • Globalization – The combination of Internationalization and Localization
  • Language – For example, Spanish generally. ISO code “es”
  • Locale – Mexico. Note that Spanish in Spain is not the same as Spanish in Mexico, e.g. “es-ES” vs. “es-MX”

Culture and UICulture

The User Interface Culture is a CultureInfo instance from the .NET base class library (BCL). It lives on Thread.CurrentThread.CurrentUICulture and if you felt like it, you could set it manually like this:

Thread.CurrentThread.CurrentUICulture = new CultureInfo("es-MX");

The CurrentCulture is used for Dates, Currency, etc.

Thread.CurrentThread.CurrentCulture = new CultureInfo("es-MX");

However, you really ought to avoid doing this kind of stuff unless you know what you’re doing and you really have a good reason.

The user’s browser will report their language preferences in the Accept-Languages HTTP Header like this:

GET http://www.hanselman.com HTTP/1.1
Connection: keep-alive
Cache-Control: max-age=0
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8

See how I prefer en-US and then en? I can get ASP.NET to automatically pass those values along and set up the threads with the correct culture. I need to set my web.config like this:

<system.web>
    <globalization culture="auto" uiCulture="auto" enableClientBasedCulture="true" />
...snip...

That one line will do the work for me. At this point the current thread and current UI thread’s culture will be automatically set by ASP.NET.

The Importance of Pseudointernationalization

Back in 2005 I updated John Robbins’ Pseudoizer (and misspelled it then!) and I’ve just ported it over to .NET 4 and used it for this application. I find this technique for creating localizable sites really convenient because I’m effectively changing all the strings within my app to another language, which allows me to spot strings I missed without the tedium of actually translating them.

You can download the .NET Pseudoizer here.

Here’s an example from that earlier post before I run it through Pseudointernationalization:

<data name="Accounts.Download.Title">
  <value>Transaction Download</value>
</data>
<data name="Accounts.Statements.Action.ViewStatement">
  <value>View Statement</value>
</data>
<data name="Accounts.Statements.Instructions">
  <value>Select an account below to view or download your available online statements.</value>
</data>

I can convert these resources with the pseudoizer like this:

PsuedoizerConsole examplestrings.en.resx examplestrings.xx.resx

and here’s the result:

<data name="Accounts.Download.Title">
  <value>[Ŧřäʼnşäčŧįőʼn Đőŵʼnľőäđ !!! !!!]</value>
</data>
<data name="Accounts.Statements.Action.ViewStatement">
  <value>[Vįęŵ Ŝŧäŧęmęʼnŧ !!! !!!]</value>
</data>
<data name="Accounts.Statements.Instructions">
  <value>[Ŝęľęčŧ äʼn äččőūʼnŧ þęľőŵ ŧő vįęŵ őř đőŵʼnľőäđ yőūř äväįľäþľę őʼnľįʼnę şŧäŧęmęʼnŧş. !!! !!! !!! !!! !!!]</value>
</data>

Cool, eh? If you’re working with RESX files a lot, be sure to familiarize yourself with the resgen.exe command-line tool that is included with Visual Studio and the .NET SDK. You have this on your system already. You can move easily between the RESX XML-based file format and a more human- (and translator-) friendly text name=value format like this:

resgen /compile examplestrings.xx.resx,examplestrings.xx.txt

And now they are a nice name=value format, and as I said, I can move between them.

Accounts.Download.Title=[Ŧřäʼnşäčŧįőʼn Đőŵʼnľőäđ !!! !!!]
Accounts.Statements.Action.ViewStatement=[Vįęŵ Ŝŧäŧęmęʼnŧ !!! !!!]
Accounts.Statements.Instructions=[Ŝęľęčŧ äʼn äččőūʼnŧ þęľőŵ ŧő vįęŵ őř đőŵʼnľőäđ yőūř äväįľäþľę őʼnľįʼnę şŧäŧęmęʼnŧş. !!! !!! !!! !!! !!!]

During development time I like to add this Pseudoizer step to my Continuous Integration build or as a pre-build step, and assign the resources to a random language I’m NOT going to be creating, like Polish (with all due respect to the Poles). So I’d make examplestrings.pl.resx, and then we can test our fake language by changing our browser’s UserLanguages to prefer pl-PL over en-US.

Localization Fallback

Different languages take different amounts of space. God bless the Germans but their strings will take an average of 30% more space than English phrases. Chinese will take 30% less. The Pseudoizer pads strings in order to illustrate these differences and encourage you to take them into consideration in your layouts.
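A toy pseudoizer (hypothetical, and far simpler than the real tool, which also transliterates characters) shows the padding idea: bracket the string and grow it by roughly a third so truncated layouts surface during development:

```javascript
// Bracket the string and pad it ~33% with '!' to simulate the longer
// strings German (or other languages) will produce. Illustrative only.
function pseudoize(value) {
  const pad = Math.max(1, Math.floor(value.length / 3));
  return '[' + value + ' ' + '!'.repeat(pad) + ']';
}

console.log(pseudoize('View Statement')); // -> "[View Statement !!!!]"
```

If the bracketed, padded string overflows a button or clips in a layout, the real German translation probably will too.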

Localization within .NET (not specific to ASP.NET proper or ASP.NET MVC) implements a standard fallback mechanism. That means it will start looking for the most specific string from the required locale, then fall back, continuing to look until it ends at the neutral language (whatever that is). This fallback is handled by convention-based naming. Here is an older, but still excellent, live demo of Resource Fallback at ASPAlliance.

For example, let’s say there are three resources. Resources.resx, Resources.es.resx, and Resources.es-MX.resx.

Resources.resx:
HelloString=Hello, what’s up?
GoodbyeString=See ya!
DudeString=Duuuude!

Resources.es.resx:
HelloString=¿Cómo está?
GoodbyeString=Adiós!

Resources.es-MX.resx:
HelloString=¿Hola, qué tal?

Consider these three files in a fallback scenario. The user shows up with his browser requesting es-MX. If we ask for HelloString, he’ll get the most specific one. If we ask for GoodbyeString, we have no “es-MX” equivalent, so we move up one to just “es.” If we ask for DudeString, we have no es strings at all, so we’ll fall all the way back to the neutral resource.

Using this basic concept of fallback, you can minimize the number of strings you localize and provide users with not only language-specific strings (Spanish) but also locale-specific (Mexican Spanish) strings. And yes, I realize this is a silly example and isn’t really representative of Spanish or Mexican colloquial language.
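The lookup walk above can be sketched in a few lines (an illustrative stand-in for the BCL's ResourceManager; here the fallback chain is derived by trimming the locale suffix off the culture name):

```javascript
// Walk "es-MX" -> "es" -> "" (neutral) until the key is found.
// Hypothetical helper; .NET's ResourceManager does this for you.
function lookup(resources, cultureName, key) {
  let culture = cultureName;
  while (true) {
    const table = resources[culture];
    if (table && key in table) return table[key];
    if (culture === '') return undefined; // not found anywhere
    const dash = culture.lastIndexOf('-');
    culture = dash >= 0 ? culture.slice(0, dash) : ''; // trim to parent
  }
}

const resources = {
  '':      { HelloString: "Hello, what's up?", GoodbyeString: 'See ya!', DudeString: 'Duuuude!' },
  'es':    { HelloString: '¿Cómo está?', GoodbyeString: 'Adiós!' },
  'es-MX': { HelloString: '¿Hola, qué tal?' }
};

console.log(lookup(resources, 'es-MX', 'HelloString'));   // -> "¿Hola, qué tal?"
console.log(lookup(resources, 'es-MX', 'GoodbyeString')); // -> "Adiós!"
console.log(lookup(resources, 'es-MX', 'DudeString'));    // -> "Duuuude!"
```

A culture with no resources at all (say fr-FR) would walk all the way down and land on the neutral strings, exactly as the three-file example describes.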

Views rather than Resources

If you don’t like the idea of resources (though you will still have to deal with some), you could also have different views for different languages and locales. You can structure your ~/Views folders like Brian Reiter and others have. It’s actually pretty obvious once you have bought into the idea of resource fallback as above. Here’s Brian’s example:

/Views
    /Globalization
        /ar
            /Home
                /Index.aspx
            /Shared
                /Site.master
                /Navigation.aspx
        /es
            /Home
                /Index.aspx
            /Shared
                /Navigation.aspx
        /fr
            /Home
                /Index.aspx
            /Shared
    /Home
        /Index.aspx
    /Shared
        /Error.aspx
        /Footer.aspx
        /Navigation.aspx
        /Site.master

Just as you can let ASP.NET change the current UI culture based on UserLanguages or a cookie, you can also control the way that Views are selected by a small override of your favorite ViewEngine. Brian includes a few lines to pick views based on a language cookie on his blog.

He also includes some simple jQuery to allow a user to override their language with a cookie like this:

var mySiteNamespace = {};
mySiteNamespace.switchLanguage = function (lang) {
    $.cookie('language', lang);
    window.location.reload();
}
$(document).ready(function () {
    // attach mySiteNamespace.switchLanguage to click events based on css classes
    $('.lang-english').click(function () { mySiteNamespace.switchLanguage('en'); });
    $('.lang-french').click(function () { mySiteNamespace.switchLanguage('fr'); });
    $('.lang-arabic').click(function () { mySiteNamespace.switchLanguage('ar'); });
    $('.lang-spanish').click(function () { mySiteNamespace.switchLanguage('es'); });
});

I’d probably make this a single client event and use a data-lang HTML5 attribute (brainstorming) like this:

$(document).ready(function () {
        $('.language').click(function (event) {
            $.cookie('language', $(event.target).data('lang'));
        })
});

But you get the idea. You can set override cookies, check those first, then check the UserLanguages header. It depends on the experience you’re looking for, and you need to hook it up between the client and server.
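That cookie-first, header-second resolution order can be sketched as a tiny routine (a hypothetical helper, shown in plain JavaScript for brevity; the real check would live wherever you pick the culture for the request):

```javascript
// Resolution order sketched above: an explicit cookie wins; otherwise
// take the first tag from Accept-Language; otherwise a site default.
function resolveLanguage(cookieValue, acceptLanguageHeader, fallback) {
  if (cookieValue) return cookieValue;
  if (acceptLanguageHeader) {
    const first = acceptLanguageHeader.split(',')[0].split(';')[0].trim();
    if (first) return first;
  }
  return fallback;
}

console.log(resolveLanguage('fr', 'en-US,en;q=0.8', 'en')); // -> "fr"
console.log(resolveLanguage(null, 'en-US,en;q=0.8', 'en')); // -> "en-US"
console.log(resolveLanguage(null, '', 'en'));               // -> "en"
```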

Globalized JavaScript Validation

If you’re doing a lot of client-side work using JavaScript and jQuery, you’ll need to get familiar with the jQuery Global plugin. You may also want the localization files for things like the DatePicker and jQuery UI on NuGet via “install-package jQuery.UI.i18n.”

Turns out the one thing you can’t ask your browser via JavaScript is what languages it prefers. That is sitting inside an HTTP Header called “Accept-Language” and looks like this, as it’s a weighted list.

en-ca,en;q=0.8,en-us;q=0.6,de-de;q=0.4,de;q=0.2
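To make the weighting concrete, here's a small sketch (a hypothetical helper; real Accept-Language parsing has more edge cases, such as wildcards) that orders the tags by their q values, highest preference first:

```javascript
// Sort language tags by q value, highest first. A missing q means 1.0
// per the HTTP spec. Illustrative sketch only.
function preferredLanguages(header) {
  return header
    .split(',')
    .map(part => {
      const [tag, ...params] = part.trim().split(';');
      const qParam = params.find(p => p.trim().startsWith('q='));
      const q = qParam ? parseFloat(qParam.trim().slice(2)) : 1.0;
      return { tag: tag.trim(), q };
    })
    .sort((a, b) => b.q - a.q)
    .map(p => p.tag);
}

console.log(preferredLanguages('en-ca,en;q=0.8,en-us;q=0.6,de-de;q=0.4,de;q=0.2'));
// -> ["en-ca", "en", "en-us", "de-de", "de"]
```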

We want to tell jQuery and friends about this value, so we need access to it from the client side in a different way. I propose this.

This is Cheesy – use Ajax

We could do this, with a simple controller on the server side:

public class LocaleController : Controller {
    public ActionResult CurrentCulture()  {
        return Json(System.Threading.Thread.CurrentThread.CurrentUICulture.ToString(), JsonRequestBehavior.AllowGet);
    }
}

And then call it from the client side. Ask jQuery to figure it out, and be sure you have the client-side globalization libraries you want for the cultures you’ll support. I downloaded all 700 jQuery Globs from GitHub. Then I could make a quick Ajax call and get that info dynamically from the server. I also include the locales I want to support as scripts like /Scripts/globinfo/jquery.glob.fr.js. You could also build a dynamic parser and load these on demand, or load them ALL when they show up on the Google or Microsoft CDNs as a complete blob.

<script>
    $(document).ready(function () {
        //Ask ASP.NET what culture we prefer
        $.getJSON('/locale/currentculture', function (data) {
            //Tell jQuery to figure it out also on the client side.
            $.global.preferCulture(data);
        });
    });
</script>

But that is a little cheesy because I have to make that little JSON call. Perhaps this belongs somewhere else, like a custom META tag.

Slightly Less Cheesy – Meta Tag

Why not put the value of this header in a META tag on the page and access it there? It means no extra AJAX call and I can still use jQuery as before. I’ll create an HTML helper and use it in my main layout page. Here’s the HTML Helper. It uses the current thread, which was automatically set earlier by the setting we added to the web.config.

namespace System.Web.Mvc
{
    public static class LocalizationHelpers
    {
        public static IHtmlString MetaAcceptLanguage<T>(this HtmlHelper<T> html)
        {
            var acceptLanguage = HttpUtility.HtmlAttributeEncode(Threading.Thread.CurrentThread.CurrentUICulture.ToString());
            return new HtmlString(String.Format("<meta name=\"accept-language\" content=\"{0}\" />",acceptLanguage));
        }
    }
}

I use this helper like this on the main layout page:

<html>
<head>
    <meta charset="utf-8" />
    <title>@ViewBag.Title</title>
    <link href="@Url.Content("~/Content/Site.css")" rel="stylesheet" type="text/css" />
    <script src="@Url.Content("~/Scripts/jquery-1.5.1.min.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/globinfo/jquery.glob.fr.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/modernizr-1.7.min.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/jquery.global.js")" type="text/javascript"></script>
    @Html.MetaAcceptLanguage()
</head>
...

And the resulting HTML looks like this. Note that this made-up META tag would be semantically different from the Content-Language or the lang= attributes, as it’s the parsed HTTP Header that ASP.NET decided was our current culture, moved into the client.

<html>
<head>
    <meta charset="utf-8" />
    <title>Home Page</title>
    <link href="/Content/Site.css" rel="stylesheet" type="text/css" />
    <script src="/Scripts/jquery-1.5.1.min.js" type="text/javascript"></script>
    <script src="/Scripts/globinfo/jquery.glob.fr.js" type="text/javascript"></script>
    <script src="/Scripts/modernizr-1.7.min.js" type="text/javascript"></script>
    <script src="/Scripts/jquery.global.js" type="text/javascript"></script>
    <meta name="accept-language" content="en-US" />
</head>

Now I can access it with similar code from the client side. I hope to improve this and support dynamic loading of the JS, however preferCulture isn’t smart and actually NEEDS the resources loaded in order to make a decision. I would like a method that would tell me the preferred culture so that I might load the resources on-demand.

<script>
    $(document).ready(function () {
        //Ask ASP.NET what culture we prefer, because we stuck it in a meta tag
        var data = $("meta[name='accept-language']").attr("content")
        //Tell jQuery to figure it out also on the client side.
        $.global.preferCulture(data);
    });
</script>

So what? Now when I am on the client side, my validation and JavaScript are a little smarter. Once jQuery on the client knows about your current preferred culture, you can start being smart with your jQuery. Make sure you are moving around non-culture-specific data values on the wire, then convert them as they become visible to the user.

var price = $.format(123.789, "c");
jQuery("#price").html(price);

var date = $.format(new Date(1972, 2, 5), "D");
jQuery("#date").html(date);

var units = $.format(12345, "n0");
jQuery("#unitsMoved").html(units);

Now, you can apply these concepts to validation within ASP.NET MVC.

Globalized jQuery Unobtrusive Validation

Adding onto the code above, we can hook the globalization up to validation, so that we’ll better understand how to manage values like 5,50, which is 5.50 for the French, for example. There are a number of validation methods you can hook up; here’s number parsing.

$(document).ready(function () {
    //Ask ASP.NET what culture we prefer, because we stuck it in a meta tag
    var data = $("meta[name='accept-language']").attr("content")
    //Tell jQuery to figure it out also on the client side.
    $.global.preferCulture(data);
    //Tell the validator, for example,
    // that we want numbers parsed a certain way!
    $.validator.methods.number = function (value, element) {
        // Check for NaN explicitly so that "0" still validates
        // as a number.
        return !isNaN($.global.parseFloat(value));
    };
});

If I set my User Languages to prefer French (fr-FR) as in this screenshot:

Language Preference Dialog preferring French

Then my validation realizes that and won’t allow 5.50 as a value, but will allow 5,50, given this model:

public class Example
{
    public int ID { get; set; }
    [Required]
    [StringLength(30)]
    public string First { get; set; }
    [Required]
    [StringLength(30)]
    public string Last { get; set; }
    [Required]
    public DateTime BirthDate { get; set; }
    [Required]
    [Range(0,100)]
    public float HourlyRate { get; set; }
}

I’ll see this validation error, as the client side knows our preference for “,” as a decimal separator.

NOTE: It seems to me that the [Range] attribute that talks to jQuery Validation doesn’t support globalization and isn’t calling into the localized methods so it won’t work with the , and . decimal problem. I was able to fix this problem by overriding the range method in jQuery like this, forcing it to use the global implementation of parseFloat. Thanks to Kostas in the comments on this post for this info.

jQuery.extend(jQuery.validator.methods, {
    range: function (value, element, param) {
        //Use the Globalization plugin to parse the value
        var val = $.global.parseFloat(value);
        return this.optional(element) || (val >= param[0] && val <= param[1]);
    }
});

Here it is working, showing the validation message “The value 4.5 is not valid for Hourly Rate.”

And here it is in a Danish culture working with [range]:

Localized Range

I can also set the Required attribute to use specific resources and names and localize them from an ExampleResources.resx file like this:

public class Example
{
    public int ID { get; set; }
    [Required(ErrorMessageResourceType=typeof(ExampleResources),
              ErrorMessageResourceName="RequiredPropertyValue")]
    [StringLength(30)]
    public string First { get; set; }
...snip...

And see this:

image

NOTE: I’m looking into how to set new defaults for all fields, rather than overriding them individually. I’ve been able to override some with a resource file that has keys called “PropertyValueInvalid” and “PropertyValueRequired” then setting these values in the Global.asax, but something isn’t right.

DefaultModelBinder.ResourceClassKey = "ExampleResources";
ValidationExtensions.ResourceClassKey = "ExampleResources";

I’ll continue to explore this.

Dynamically Localizing the jQuery DatePicker

Since I know what the current jQuery UI culture is, I can use it to dynamically load the resources I need for the DatePicker. I’ve installed the “MvcHtml5Templates” NuGet library from Scott Kirkland so my input type is “datetime”, and I’ve added this little bit of JavaScript that says: do we support dates natively? Are we non-English? If so, go get the right DatePicker script and set its info as the default for our DatePicker by getting the regional settings for the current global culture.

//Setup datepickers if we don't support it natively!
if (!Modernizr.inputtypes.date) {
    if ($.global.culture.name != "en-us" && $.global.culture.name != "en") {
        var datepickerScriptFile = "/Scripts/globdatepicker/jquery.ui.datepicker-" + $.global.culture.name + ".js";
        //Now, load the date picker support for this language
        // and set the defaults for a localized calendar
        $.getScript(datepickerScriptFile, function () {
            $.datepicker.setDefaults($.datepicker.regional[$.global.culture.name]);
        });
    }
    $("input[type='datetime']").datepicker();
}

Then we set up all inputs with type=datetime. You could use a CSS class instead if you like.


Now our jQuery DatePicker is French.

Right to Left (body=rtl)

For languages like Arabic and Hebrew that read Right To Left (RTL) you’ll need to change the dir= attribute of the elements you want flipped. Most often you’ll change the root <HTML> element to <HTML dir="rtl"> or change it with CSS like:

div {
   direction:rtl;
}

The point is to have a general strategy, whether it be a custom layout file for RTL languages or just flipping your shared layout with either CSS or an HTML Helper. Often folks put the direction in the resources and pull out the value ltr or rtl depending on the culture.
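If you’d rather compute the direction than store it per-resource, the lookup can be as simple as mapping the culture’s language to a direction. A tiny sketch; the textDirection helper and its RTL language list are my own illustration and deliberately not exhaustive:

```javascript
// Sketch: pick "rtl" or "ltr" from a culture name like "ar-SA" or "en-US".
// This language list is a hypothetical subset, not a complete one.
var rtlLanguages = ["ar", "he", "fa", "ur"];

function textDirection(cultureName) {
    // "ar-SA" -> "ar"
    var language = cultureName.split("-")[0].toLowerCase();
    return rtlLanguages.indexOf(language) >= 0 ? "rtl" : "ltr";
}

// Then flip the page, e.g. with jQuery:
// $("html").attr("dir", textDirection($.global.culture.name));
console.log(textDirection("ar-SA")); // "rtl"
console.log(textDirection("en-US")); // "ltr"
```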

Conclusion

Globalization is hard and requires actual thought and analysis. The current JavaScript offerings are in flux, and that’s being kind.

A lot of this stuff could be made boilerplate or automatic, but much of it is a moving target. I’m currently exploring either a NuGet package that sets stuff up for you OR a “File | New Project” template with all the best practices already setup and packaged into one super-package. What’s your preference, Dear Reader?

The Complete Script

Here’s my current “complete” working script that could then be moved into its own file. This is a work in progress, to be sure. Please forgive any obvious mistakes as I’m still learning JavaScript.

    <script>
        $(document).ready(function () {
            //Ask ASP.NET what culture we prefer, because we stuck it in a meta tag
            var data = $("meta[name='accept-language']").attr("content");
            //Tell jQuery to figure it out also on the client side.
            $.global.preferCulture(data);
            //Tell the validator, for example,
            // that we want numbers parsed a certain way!
            // (Check for NaN so that 0 and optional empty fields still validate.)
            $.validator.methods.number = function (value, element) {
                return this.optional(element) || !isNaN($.global.parseFloat(value));
            };
            //Fix the range to use globalized methods
            jQuery.extend(jQuery.validator.methods, {
                range: function (value, element, param) {
                    //Use the Globalization plugin to parse the value
                    var val = $.global.parseFloat(value);
                    return this.optional(element) || (val >= param[0] && val <= param[1]);
                }
            });
            //Setup datepickers if we don't support it natively!
            if (!Modernizr.inputtypes.date) {
                if ($.global.culture.name != "en-us" && $.global.culture.name != "en") {
                    var datepickerScriptFile = "/Scripts/globdatepicker/jquery.ui.datepicker-" + $.global.culture.name + ".js";
                    //Now, load the date picker support for this language
                    // and set the defaults for a localized calendar
                    $.getScript(datepickerScriptFile, function () {
                        $.datepicker.setDefaults($.datepicker.regional[$.global.culture.name]);
                    });
                }
                $("input[type='datetime']").datepicker();
            }
        });
    </script>

Better, Faster, Easier SSL testing for ASP.NET MVC & WebForms

In this blog entry I’ll show you how to test SSL on your ASP.NET MVC & WebForms applications using IIS 7. You should make sure you have IIS 7 set up on your Windows 7 computer before proceeding. See my blog post Test your ASP.NET MVC or WebForms Application on IIS 7 in 30 seconds for instructions.

Create a new ASP.NET MVC 3 Application called MvcSSL in the C:\Webs folder. Accept all the defaults.

newMVCproj

WARNING: IIS cannot run an ASP.NET project that has been created in the default Visual Studio project folder (C:\users\<user>\Documents\Visual Studio 2010\Projects). Select a folder where IIS can access the files, such as C:\Webs.

Build and run the application.

Right click the solution and select Properties.

RightClickProject

  1. Select Web in the left pane.
  2. Under Servers, select the Use Local IIS Web server radio button.
  3. Select the Create Virtual Directory button.

UseLocal_IIS_sm

Should you get the message:

Unable to create the virtual directory. To access Web sites on the local IIS Web server, you must run Visual Studio under an Administrator account.

RunAsAdmin

Read it and follow the directions and you’ll be rewarded with a friendlier message.

vid_Success

Now go back to IIS manager, refresh and drill down in the Default Web Site. Select Browse *:80(http) in the right pane.

iisMgr

Adding an SSL Cert the Super Ninja way

  1. Download and unzip Thomas Deml’s awesome SelfSSL7 tool. Read about SelfSSL7 here.
  2. Open an administrative command prompt in the SelfSSL7 directory.
  3. Substitute your machine name for Q3 (which was my machine) and run the following command:
    SelfSSL7.exe /Q /T /I "Default Web Site" /N cn=Q3;cn=localhost /V 1000

That’s it. You now have a self-signed certificate for testing.

Open Global.asax and add the RequireHttps filter so it applies to all controllers and action methods.

public static void RegisterGlobalFilters(GlobalFilterCollection filters) {
    filters.Add(new RequireHttpsAttribute());
    filters.Add(new HandleErrorAttribute());
}

Build the application and refresh the browser. Note IIS Manager now includes the SSL port.

iisMgr443

https-IE

Click on the lock icon to examine the certificate.

CertDetails

Next Steps

Change the URL to use the machine name instead of localhost.

Q1

To access this site from a remote computer, you’ll probably have to open up port 443.

To configure the firewall for HTTPS (port 443)

  1. From the Start menu, enter wf.msc and press Enter.
  2. In the console tree, click Inbound Rules, and then click New Rule.
  3. On the Rule Type page, click Port, and then click Next.
  4. On the Protocols and Ports page, select TCP, and then click Specific local ports. Enter port 443. Then click Next.
  5. On the Action page, click Allow the connection, and then click Next.
  6. On the Profile page, make sure that the Domain, Private, and Public check boxes are selected, and then click Next to accept the default profile.
  7. On the Name page, under Name, type something like World Wide Web Services (Rick’s HTTPS Traffic-In).
  8. You can now test the site from a remote computer, but it won’t be trusted.
  9. Export the certificate and import it on a remote computer. Run the following command (change the computer name from Q1 to your computer):

C:\Users\ricka\Downloads\SelfSSL7>SelfSSL7.exe /Q /T /I "Default Web Site" /N cn=Q1;cn=localhost /V 1000 /K 2048 /X /F q1.pfx /W 5$ecURE!

  • Navigate to the PFX file from a remote computer. Double-click the PFX file to start the import wizard.
  • Select Next.
  • Select Next and enter the password. (I used 5$ecURE! in the example.)
  • Select Place all certificates in the following store, then select Browse.
  • Click OK.
  • Click Finish.
  • You should now be able to browse from the remote computer without a warning.

Testing with Firefox

  1. Using Firefox, browse to the MvcSSL site. Firefox will issue a warning and not display the page.
  2. Select Add Exception.
  3. You can now browse to the site without a warning.