Saturday, 07 October 2017

Agent-based deployment in Release Management

Release Management now supports robust multi-machine deployment out of the box. You can orchestrate deployments across multiple machines and perform rolling updates while ensuring high availability of the application throughout.

The agent-based deployment capability relies on the same build and deployment agents. However, unlike the current approach, where you install the build and deployment agents on a set of proxy servers in an agent pool and drive deployments to remote target servers, you install the agent on each of your target servers directly and drive rolling deployments to those servers.

Preview

The agent-based deployment feature is currently in an early-adopter phase. Submit requests for participation and suggestions here.

Deployment Group

A deployment group is a logical group of targets (machines) with an agent installed on each of them. Deployment groups represent your physical environments, such as a single-box Dev environment, a multi-machine QA environment, or a farm of machines for UAT/Prod. They also specify the security context for your physical environments.

Here are a few screenshots to explain how this experience is shaping up.

  • Create your ‘Deployment Group’ and register your machines with the provided cmdlet.
  • Author the release definition: if you are deploying to a farm of servers, you can use the deployment configuration options, namely one target at a time, half the targets at a time, all targets at once, or a custom value.

  • Deploy: view live logs for each server and download logs for all servers. Track the deployment steps for each server using the release log view.
  • Manage: track deployments down to each machine. Tag the machines in the group so that you can deploy only to targets that have specific tags.

Bootstrapping agents: We have made bootstrapping the agents on the targets simpler. You can just copy and paste the cmdlet appropriate for the OS, and it takes care of downloading, installing, and configuring the agent against the deployment group. It even has an option to generate the cmdlet with a ‘Personal Access Token’ of the right scope, so that you don’t have to create one yourself.

  • If the target is an Azure VM, you can do this on demand using the Team Services extension for the VM, or use Azure PowerShell/CLI to add the extension, which will bootstrap the deployment agent. You can also automate this using a resource extension in the Azure template JSON.
  • We plan to enhance the ‘Azure Resource Group’ task to dynamically bootstrap agents on newly provisioned or pre-existing virtual machines on Azure.

With this, you can use the same proven cross-platform agent and its pull-based execution model to easily drive deployments to a large number of servers, no matter which domain they are on, without having to worry about a myriad of prerequisites.

Saturday, 07 October 2017

Download Source Code of this post from GitHub here

Node.js is one of the greatest platforms for building the backend of an application. It does not matter whether you are creating a mobile app for iOS or a single-page web application using Angular; you will need a backend for the application. Usually you perform database operations, authentication, authorization, logging, etc. in the backend. I am sure you have heard the term MEAN stack. If you have not, MEAN stands for MongoDB, Express JS, Angular, and Node.js.

In the MEAN stack, you create the backend of the application using MongoDB, ExpressJS, and NodeJS. In this article, you will learn how to create a REST API to perform CRUD operations on MongoDB, using ExpressJS and NodeJS.

From the client, you will perform HTTP GET, POST, PUT, and DELETE operations to carry out the CRUD operations. For example, to create a record you will perform an HTTP POST operation and pass the data to be inserted in the body of the HTTP request.

At the end of this post, you will have a working API that performs CRUD operations.

Installation

To follow along with this post, make sure you have the following software installed on your machine.

To configure MongoDB after installation, make sure that the drive where your OS is installed (the C drive in most cases on Windows) contains a folder called data\db. If it does not, create this folder; by default, MongoDB looks for it. Also, make sure the MongoDB server is started by running the mongod.exe command, as shown in the image below, before working through this post.

After successful installation, create a blank folder in which you will create the project. Give this folder whatever name you wish; I am calling it the project folder.

Step 1: Install dependencies

In the project folder, add a file named package.json. We are going to use npm (Node Package Manager) to install all the dependencies; npm reads the package.json file to install the dependencies of the project. Add the code below to package.json.


{
    "name": "product-api",
    "main": "server.js",
    "dependencies": {
        "express": "~4.0.0",
        "body-parser": "~1.0.1",
        "cors": "2.8.1",
        "mongoose": "~3.6.13"
    }
}

Here we specify that the project we are going to create has the following dependencies:

  • Express JS to create the API routes.
  • body-parser to parse JSON data coming from the client.
  • CORS to enable cross-origin resource sharing support in your API.
  • Mongoose to work with MongoDB from NodeJS.

After updating package.json with the above code, open the project folder in the command prompt and run the command npm install. I am assuming here that you already have NodeJS installed on your system. If you haven’t installed it, click here and install it.

Once you have successfully run the npm install command, you will find a node_modules folder created inside your project, containing all the dependencies required to create the REST API in NodeJS and perform CRUD operations on MongoDB.

Step 2: Create Model

When you work with an RDBMS such as SQL Server or MySQL, you create a model to perform CRUD operations on a particular table. MongoDB is a document-based database in which you insert, delete, update, or read documents. MongoDB does not restrict you to a fixed model for database operations; however, to work with Mongoose, you need a model. So, let us create a model of the document on which we will perform CRUD operations.

In the project, add a file called product.js and create a model named Product as shown in the listing below:


var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var ProductSchema = new Schema({
    title: String,
    price: Number,
    instock: Boolean,
    photo: String,
});
module.exports = mongoose.model('Product', ProductSchema);

As you might have noticed, the Product model contains four properties with different data types.

Step 3: Create Server

In this step, we will create the server for the REST API. To do so, add a file called server.js to the project. If you go back and examine the package.json file, you will find that the value of main is set to server.js; therefore, Node will look for server.js to start the server. We are going to put all the startup code in this file.

To start with, add the following require statements at the top of server.js.

var express = require('express');
var bodyParser = require('body-parser');
var cors = require('cors');
var app = express();
var mongoose = require('mongoose');
var product = require('./product');

After importing all the required modules, create the router, assign the port, and use body-parser to parse incoming JSON data. To do this, add the code listed below to server.js after the last require statement.


app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());
var port = process.env.PORT || 8090;
var router = express.Router();

In the above snippet, we assign the port and create the router. After this, at the end of the server.js file, add the code listed below:


app.use(cors());
app.use('/api', router);
app.listen(port);
console.log('REST API is running at ' + port);

In the above snippet, you enable CORS support, configure the port for the API, and specify that the REST API routes are created under baseurl/api/{routename}. At this point, you have configured the server settings and assigned the port. Putting everything together, at the end of this step the server.js file looks like this:

server.js


var express = require('express');
var bodyParser = require('body-parser');
var cors = require('cors');
var app = express();
var mongoose = require('mongoose');
var product = require('./product');

app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());
var port = process.env.PORT || 8090;
var router = express.Router();

// all other code will go here 

app.use(cors());
app.use('/api', router);
app.listen(port);
console.log('REST API is running at ' + port);

Step 4: Connect to the MongoDB Server

In this step, you will connect the application to the MongoDB database server. To do this, just after the router creation code (or in place of the “all other code will go here” comment), add the code below:


mongoose.connect('mongodb://localhost:27017/products');

You are connecting to the products database on the MongoDB server. By default, MongoDB runs on port 27017.

Note:

At this point, make sure that your MongoDB server is running on this port; otherwise you will get an exception when running the API. On Windows, to run the server, navigate to C:\Program Files\MongoDB\Server\3.4\bin in the command prompt and run the command mongod.exe. This command starts the MongoDB server. Here I am assuming that you already have MongoDB installed; if you do not, you can download and install MongoDB from here.
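
If you want the API itself to report connection problems instead of failing later, you can optionally listen to the Mongoose connection events right after the connect call. This is a minimal sketch, not part of the original article:

// Optional: log the outcome of the MongoDB connection attempt
mongoose.connection.on('error', function (err) {
    console.error('MongoDB connection error: ' + err);
});
mongoose.connection.once('open', function () {
    console.log('Connected to MongoDB');
});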

Step 5: Creating the Middleware Route

Let us start by writing a middleware route that will be called before any other route. In this route, you can perform operations such as:

  • Authentication
  • Authorization
  • Logging

In Express JS, you can create this middleware very easily by adding a “use” route, as shown in the listing below:


router.use(function (req, res, next) {
    // do logging 
    // do authentication 
    console.log('Logging of request will be done here');
    next(); // make sure we go to the next routes and don't stop here
});

Make sure to add the above code to server.js, just after the code you created in step 4.
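
As a small optional variation (not in the original article), the logging middleware can record the HTTP method and URL of each incoming request:

router.use(function (req, res, next) {
    // Log the method and URL of every request that reaches the API
    console.log(new Date().toISOString() + ' ' + req.method + ' ' + req.originalUrl);
    next();
});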

Step 6: Creating a Record

To create a record, that is, to insert a new document, you need to create a route for the POST operation. The REST convention is that whenever the client performs a POST operation, a new record is created. So, let us start by creating a route for the POST operation. To do that, just below the “use” route added above, add the code below:


router.route('/products').post(function (req, res) {

   
});

The above route will be called whenever the client performs an HTTP POST operation on baseurl/api/products.

Next, you need to write the code to insert the data into the database. For that, you need to perform two operations:

  1. Create a model object from the request body. If you remember, we created the model in step 2.
  2. Call the save function on the model object to save the record in the database.

To do this, modify the POST route as shown below:


router.route('/products').post(function (req, res) {
    // Build a Product document from the request body
    var p = new product();
    p.title = req.body.title;
    p.price = req.body.price;
    p.instock = req.body.instock;
    p.photo = req.body.photo;
    p.save(function (err) {
        if (err) {
            // Return early so a second response is not sent after the error
            return res.send(err);
        }
        res.send({ message: 'Product Created !' });
    });
});

Run and Test

To test the Create API, start the server by running the command node server.js (or simply node server) from the project folder.

Now your API is running on port 8090. To test the API, open Postman and perform a POST operation on the URL baseurl/api/products.

Besides the above configuration, in Headers, the content-type is set to application/json. When you click Send, if everything works as expected, you will get a response from the API as shown in the image below:
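
Since the screenshots are not reproduced here, a representative request body for this POST (with illustrative values matching the Product model fields) looks like the following:

{
    "title": "Sample Product",
    "price": 275,
    "instock": true,
    "photo": "sample.png"
}

If the save succeeds, the API responds with { "message": "Product Created !" }.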

You might be wondering in which database the product has been created. In the previous steps, you connected to the database “products”, so all operations are performed against the products database. Also, because we did not provide a collection name while creating the model in step 2, the record is created in the products collection (the lower-cased plural of the model name Product).
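
If you would rather not rely on Mongoose’s automatic pluralization, you can pass the collection name explicitly as the third argument to mongoose.model(). A minimal sketch of how product.js could export the model with this change:

// Pin the collection name explicitly instead of relying on pluralization
module.exports = mongoose.model('Product', ProductSchema, 'products');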

Step 7: Fetching Records

To fetch records, you need to perform a GET operation, so add a GET route to the API. The GET route can be added as shown in the listing below:


router.route('/products').get(function (req, res) {
    product.find(function (err, products) {
        if (err) {
            return res.send(err);
        }
        res.send(products);
    });
});

In the above listing, you are using the Mongoose find() function to fetch all the records from the MongoDB collection.
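
As a side note (not part of the original article), find() also accepts a query object for filtering. For example, a sketch of a handler body that returns only the in-stock products:

// Illustrative filter: fetch only the products where instock is true
product.find({ instock: true }, function (err, products) {
    if (err) {
        return res.send(err);
    }
    res.send(products);
});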

This GET route will be called whenever the client performs an HTTP GET operation on baseurl/api/products. To test the API, open Postman and perform a GET operation on the URL baseurl/api/products.

In Postman, you will find that the HTTP GET operation returns all the records. To return a particular record on the basis of product_id, add a new route to the API so that, whenever you perform a GET operation with product_id as a route parameter on the same URL, that particular record is fetched.


router.route('/products/:product_id').get(function (req, res) {

    product.findById(req.params.product_id, function (err, prod) {
        if (err) {
            return res.send(err);
        }
        res.json(prod);
    });
});

To fetch a particular product, you are using the Mongoose findById() function, which takes product_id as its input parameter. To read the route parameter from the HTTP GET request, we use the req.params property.

The above route will be called whenever the client performs an HTTP GET operation on baseurl/api/products/{product_id}. To test the API, open Postman and perform a GET operation on the URL baseurl/api/products/{product_id}.

 

You will find that the particular product with the matching product_id is returned.
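
One optional refinement that is not in the original code: if no document matches the given id, findById() returns null, so you can send a 404 instead of an empty body. A minimal sketch of the adjusted callback:

product.findById(req.params.product_id, function (err, prod) {
    if (err) {
        return res.send(err);
    }
    if (!prod) {
        // No document with this _id exists in the collection
        return res.status(404).json({ message: 'Product not found' });
    }
    res.json(prod);
});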

Step 8: Updating a Record

To update a record, you need to perform an HTTP PUT operation. While performing the PUT operation:

  • Pass the product_id of the record to be updated as a route parameter.
  • Pass the product object with the updated values in the request body.

To handle the PUT operation, create a PUT route as shown in the listing below:


router.route('/products/:product_id').put(function (req, res) {

    product.findById(req.params.product_id, function (err, prod) {
        if (err) {
            return res.send(err);
        }
        prod.title = req.body.title;
        prod.price = req.body.price;
        prod.instock = req.body.instock;
        prod.photo = req.body.photo;
        prod.save(function (err) {
            if (err) {
                return res.send(err);
            }
            res.json({ message: 'Product updated!' });
        });

    });
});

So, what is going on in the above code listing? These are the operations needed to update a record:

  • Fetch the product to be updated from the collection.
  • Update the fetched product’s properties from the request body object.
  • Save the product back to the collection using the save method.

The above route will be called to update a particular product whenever the client performs an HTTP PUT operation on baseurl/api/products/{product_id}. To test the API, open Postman and perform a PUT operation on the URL baseurl/api/products/{product_id}. You need to pass the product object to be updated in the body of the request, as shown in the image below:
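
As an alternative to the fetch-then-save pattern above, Mongoose also provides findByIdAndUpdate(), which applies the update in a single call. A hedged sketch that could replace the PUT handler shown earlier:

// Alternative PUT handler using findByIdAndUpdate (one round trip instead of find + save)
router.route('/products/:product_id').put(function (req, res) {
    product.findByIdAndUpdate(req.params.product_id, {
        title: req.body.title,
        price: req.body.price,
        instock: req.body.instock,
        photo: req.body.photo
    }, function (err, prod) {
        if (err) {
            return res.send(err);
        }
        res.json({ message: 'Product updated!' });
    });
});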

Step 9: Deleting a Record

To delete a particular record, the client needs to perform an HTTP DELETE operation. To handle this, create a route for the DELETE operation.


router.route('/products/:product_id').delete(function (req, res) {

    // Note: use req.params (not req.param) to read the route parameter
    product.remove({ _id: req.params.product_id }, function (err, prod) {
        if (err) {
            return res.send(err);
        }
        res.json({ message: 'Successfully deleted' });
    });

});

The client passes the product_id of the record to be deleted as a route parameter. Note: the model’s remove method is used to remove the matching product from the collection.

The above route will be called to delete a particular product whenever the client performs an HTTP DELETE operation on baseurl/api/products/{product_id}. To test the API, open Postman and perform a DELETE operation on the URL baseurl/api/products/{product_id}, as shown in the image below:
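
Depending on your Mongoose version, you can also delete by id directly with findByIdAndRemove(); a minimal, hedged sketch of the same handler:

// Alternative DELETE handler using findByIdAndRemove
router.route('/products/:product_id').delete(function (req, res) {
    product.findByIdAndRemove(req.params.product_id, function (err, prod) {
        if (err) {
            return res.send(err);
        }
        res.json({ message: 'Successfully deleted' });
    });
});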

You have just created a fully working REST API that performs CRUD operations on MongoDB using NodeJS. In the next post, we will use this API with an Angular application to complete an application built on the MEAN stack. Thanks for reading.

Download Source Code of this post from GitHub here

Saturday, 07 October 2017

In my prior posts, I’ve covered:

This is a great start. Next on my list is authentication: how do I handle both web-based and mobile-based authentication within a single authentication pattern? ASP.NET Core provides a modular authentication system. (I’m sensing a theme here – everything is modular!) So, in this post, I’m going to cover the basics of authentication, then cover how Azure App Service Authentication works (although Chris Gillum does a much better job than I do, and you should read his blog for all the Easy Auth articles), and then introduce an extension to the authentication library that implements Azure App Service Authentication.

Authentication Basics

To implement authentication in ASP.NET Core, you place the following in the ConfigureServices() method of your Startup.cs:

services.AddAuthentication();

Here, services is the IServiceCollection that is passed as a parameter to the ConfigureServices() method. In addition, you need to call a UseXXX() extension method within the Configure() method to bring in the corresponding middleware. Here is an example:

app.UseJwtBearerAuthentication(new JwtBearerOptions
{
    Authority = Configuration["JWT:Authority"],
    Audience = Configuration["JWT:Audience"]
});

Once that is done, your MVC controllers or methods can be decorated with the usual [Authorize] attribute to require authentication. Finally, you need to add the Microsoft.AspNetCore.Authentication NuGet package to your project to bring in the authentication framework.

In my project, I’ve added the services.AddAuthentication() call to ConfigureServices() and added an [Authorize] attribute to my /Home/Configuration controller method. This means that the configuration viewer I used last time now requires authentication:

That 401 HTTP status code for the Configuration load indicates failed authentication; 401 is “Unauthorized”. This is completely expected, because we have not configured an authentication provider yet.
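
For illustration, a minimal version of a controller method protected in this way might look like the following sketch (the method body here is assumed, not taken from the original post):

using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

public class HomeController : Controller
{
    // Requires an authenticated user; with no authentication provider
    // configured yet, requests to this action return 401.
    [Authorize]
    public IActionResult Configuration()
    {
        return View();
    }
}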

How App Service Authentication Works

Working with Azure App Service Authentication is relatively easy. A JWT-based token is submitted either as a cookie or as the X-ZUMO-AUTH header. The information necessary to decode that token is provided in environment variables:

  • WEBSITE_AUTH_ENABLED is True if the Authentication system is loaded
  • WEBSITE_AUTH_SIGNING_KEY is the key used to sign (and validate) the token
  • WEBSITE_AUTH_ALLOWED_AUDIENCES is the list of allowed audiences for the JWT

If WEBSITE_AUTH_ENABLED is set to True, decode the X-ZUMO-AUTH header to see if the user is valid. If the user is valid, then do an HTTP GET of {issuer}/.auth/me with the X-ZUMO-AUTH header passed through to get a JSON blob with the claims. If the token is expired or non-existent, then don’t authenticate the user.

This has an issue in that you have to make another HTTP call to get the claims. This is a small overhead, and the team is working to fix it for “out of process” services. In-process services, such as PHP and ASP.NET, have access to server variables: the JSON blob that would be returned by calling the /.auth/me endpoint is presented as a server variable, so it doesn’t need to be fetched. ASP.NET Core applications are “out of process”, so we can’t use this mechanism.

Configuring the ASP.NET Core Application

In the Configure() method of the Startup.cs file, I need to do something like the following:

            app.UseAzureAppServiceAuthentication(new AzureAppServiceAuthenticationOptions
            {
                SigningKey = Configuration["AzureAppService:Auth:SigningKey"],
                AllowedAudiences = new[] { $"https://{Configuration["AzureAppService:Website:HOST_NAME"]}/" },
                AllowedIssuers = new[] { $"https://{Configuration["AzureAppService:Website:HOST_NAME"]}/" }
            });

This is just pseudo-code right now, because neither the UseAzureAppServiceAuthentication() method nor the AzureAppServiceAuthenticationOptions class exists. Fortunately, there are many templates for a successful implementation of authentication. (Side note: I love open source.) The closest one to mine is the JwtBearer authentication implementation. I’m not going to show the full implementation – you can go check it out yourself. However, the important work is done in the AzureAppServiceAuthenticationHandler file.

The basic premise is this:

  1. If we don’t have an authentication source (a token), then return AuthenticateResult.Skip().
  2. If we have an authentication source, but it’s not valid, return AuthenticateResult.Fail().
  3. If we have a valid authentication source, decode it, create an AuthenticationTicket and then return AuthenticateResult.Success().

Detecting the authentication source means digging into the Request.Headers[] collection to see if there is an appropriate header. The version I have created supports both the X-ZUMO-AUTH and Authorization headers (for future compatibility):

            // Grab the X-ZUMO-AUTH token if it is available
            // If not, then try the Authorization Bearer token
            string token = Request.Headers["X-ZUMO-AUTH"];
            if (string.IsNullOrEmpty(token))
            {
                string authorization = Request.Headers["Authorization"];
                if (string.IsNullOrEmpty(authorization))
                {
                    return AuthenticateResult.Skip();
                }
                if (authorization.StartsWith("Bearer ", StringComparison.OrdinalIgnoreCase))
                {
                    token = authorization.Substring("Bearer ".Length).Trim();
                    if (string.IsNullOrEmpty(token))
                    {
                        return AuthenticateResult.Skip();
                    }
                }
            }
            Logger.LogDebug($"Obtained Authorization Token = {token}");

The next step is to validate the token and decode the result. If the service is running inside Azure App Service, then the validation has already been done for me and I only need to decode the token. If I am running locally, then I should validate the token. The signing key for the JWT is encoded in the WEBSITE_AUTH_SIGNING_KEY environment variable. Theoretically, WEBSITE_AUTH_SIGNING_KEY can be hex encoded or base-64 encoded; it will be hex-encoded the majority of the time. Using the configuration provider from the last post, this appears as the AzureAppService:Auth:SigningKey configuration variable, and I can place it into the options for the authentication provider in the Configure() method of Startup.cs.

So, what’s the code for validating and decoding the token? It looks like this:

            // Convert the signing key we have to something we can use
            var signingKeys = new List<SecurityKey>();
            // If the signingKey is the signature
            signingKeys.Add(new SymmetricSecurityKey(Encoding.UTF8.GetBytes(Options.SigningKey)));
            // If it's base-64 encoded
            try
            {
                signingKeys.Add(new SymmetricSecurityKey(Convert.FromBase64String(Options.SigningKey)));
            } catch (FormatException) { /* The key was not base 64 */ }
            // If it's hex encoded, then decode the hex and add it
            try
            {
                if (Options.SigningKey.Length % 2 == 0)
                {
                    signingKeys.Add(new SymmetricSecurityKey(
                        Enumerable.Range(0, Options.SigningKey.Length)
                                  .Where(x => x % 2 == 0)
                                  .Select(x => Convert.ToByte(Options.SigningKey.Substring(x, 2), 16))
                                  .ToArray()
                    ));
                }
            } catch (Exception) {  /* The key was not hex-encoded */ }

            // validation parameters
            var websiteAuthEnabled = Environment.GetEnvironmentVariable("WEBSITE_AUTH_ENABLED");
            var inAzureAppService = (websiteAuthEnabled != null && websiteAuthEnabled.Equals("True", StringComparison.OrdinalIgnoreCase));
            var tokenValidationParameters = new TokenValidationParameters
            {
                // The signature must have been created by the signing key
                ValidateIssuerSigningKey = !inAzureAppService,
                IssuerSigningKeys = signingKeys,

                // The Issuer (iss) claim must match
                ValidateIssuer = true,
                ValidIssuers = Options.AllowedIssuers,

                // The Audience (aud) claim must match
                ValidateAudience = true,
                ValidAudiences = Options.AllowedAudiences,

                // Validate the token expiry
                ValidateLifetime = true,

                // If you want to allow clock drift, set that here
                ClockSkew = TimeSpan.FromSeconds(60)
            };

            // validate the token we received
            var tokenHandler = new JwtSecurityTokenHandler();
            SecurityToken validatedToken;
            ClaimsPrincipal principal;
            try
            {
                principal = tokenHandler.ValidateToken(token, tokenValidationParameters, out validatedToken);
            }
            catch (Exception ex)
            {
                Logger.LogError(101, ex, "Cannot validate JWT");
                return AuthenticateResult.Fail(ex);
            }

This only gives us a subset of the claims, though. We want to swap out the principal (in this case) with the results of the call to /.auth/me, which gives us the actual claims:

            try
            {
                client.BaseAddress = new Uri(validatedToken.Issuer);
                client.DefaultRequestHeaders.Clear();
                client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
                client.DefaultRequestHeaders.Add("X-ZUMO-AUTH", token);

                HttpResponseMessage response = await client.GetAsync("/.auth/me");
                if (response.IsSuccessStatusCode)
                {
                    var jsonContent = await response.Content.ReadAsStringAsync();
                    var userRecord = JsonConvert.DeserializeObject<List<AzureAppServiceClaims>>(jsonContent).First();

                    // Create a new ClaimsPrincipal based on the results of /.auth/me
                    List<Claim> claims = new List<Claim>();
                    foreach (var claim in userRecord.UserClaims)
                    {
                        claims.Add(new Claim(claim.Type, claim.Value));
                    }
                    claims.Add(new Claim("x-auth-provider-name", userRecord.ProviderName));
                    claims.Add(new Claim("x-auth-provider-token", userRecord.IdToken));
                    claims.Add(new Claim("x-user-id", userRecord.UserId));
                    var identity = new GenericIdentity(principal.Claims.Where(x => x.Type.Equals("stable_sid")).First().Value, Options.AuthenticationScheme);
                    identity.AddClaims(claims);
                    principal = new ClaimsPrincipal(identity);
                }
                else if (response.StatusCode == System.Net.HttpStatusCode.Unauthorized)
                {
                    return AuthenticateResult.Fail("/.auth/me says you are unauthorized");
                }
                else
                {
                    Logger.LogWarning($"/.auth/me returned status = {response.StatusCode} - skipping user claims population");
                }
            }
            catch (Exception ex)
            {
                Logger.LogWarning($"Unable to get /.auth/me user claims - skipping (ex = {ex.GetType().FullName}, msg = {ex.Message})");
            }

I can skip this phase by setting an option if I want to. The result of this code is a new identity that has all the claims and a “name” that is the same as the stable_sid value from the original token. If the /.auth/me endpoint says the token is bad, I return a failed authentication; for any other failure, I skip populating the user claims.

I could use the UserId field from the user record. However, that is not guaranteed to be stable, since users can and do change the email address on their social accounts. I may update this class in the future to use stable_sid by default but allow the developer to change it if they desire.

The final step is to create an AuthenticationTicket and return success:

            // Generate a new authentication ticket and return success
            var ticket = new AuthenticationTicket(
                principal, new AuthenticationProperties(),
                Options.AuthenticationScheme);

            return AuthenticateResult.Success(ticket);

You can check out the code on my GitHub Repository at tag p5.

Wrap Up

There is still a little bit to do. Right now, the authentication filter returns a 401 Unauthorized if you try to hit an unauthorized page. This isn’t useful for a web page, but it is completely suitable for a web API, and it is thus “good enough” for Azure Mobile Apps. If you are using this functionality in an MVC application, then you will likely want to set up some sort of authorization redirect to a login provider.

In the next post, I’m going to start on the work for Azure Mobile Apps.

Saturday, 07 October 2017

As per your request, I am going to build a login page using AngularJS and ASP.NET. I am trying to make this article as simple as I can. I hope you like it. Your valuable feedback is always welcome. Thanks.

Let’s start.

Please follow these steps to build this application.

Step 1

Create an empty MVC project

 

First, open Visual Studio -> File -> New -> Project -> ASP.NET Web Application and click OK. Now, select MVC and click OK. You have now created the basic project structure.

You can go through the screenshots given below for help.

You will have the directory given below.

Step 2 

Install Angular package

Install Angular package in your project, as shown in the screenshot given below.

Step 3

In the Views folder, add the login page and paste the code given below into login.cshtml.

@{
    ViewBag.Title = "Login Using Angular";
}
<h2>Login Using Angular</h2>
<div ng-app="myApp" ng-controller="LoginController">
    <form name="myForm" novalidate ng-submit="LoginForm()">
        <div style="color:green">{{msg}}</div>
        <table ng-show="!IsLoggedIn" class="table table-horizontal">
            <tr>
                <td>Email / UserName:</td>
                <td>
                    <input type="email" ng-model="UserModel.Email" name="UserEmail"
                           ng-class="Submited ? 'ng-dirty' : ''" required autofocus class="form-control" />
                    <span style="color:red"
                          ng-show="(myForm.UserEmail.$dirty || Submited) && myForm.UserEmail.$error.required">Please enter Email</span>
                    <span style="color:red" ng-show="myForm.UserEmail.$error.email">Email is not valid</span>
                </td>
            </tr>
            <tr>
                <td>Password:</td>
                <td>
                    <input type="password" ng-model="UserModel.Password" name="UserPassword"
                           ng-class="Submited ? 'ng-dirty' : ''" required autofocus class="form-control" />
                    <span style="color:red"
                          ng-show="(myForm.UserPassword.$dirty || Submited) && myForm.UserPassword.$error.required">Password Required</span>
                </td>
            </tr>
            <tr>
                <td></td>
                <td><input type="submit" value="submit" class="btn btn-success" /></td>
            </tr>
        </table>
    </form>
</div>
@section scripts {
    <!-- Module.js (created in Step 4) defines the 'myApp' module; ng-app is set on the div above,
         assuming it is not already declared in the layout page -->
    <script src="~/Scripts/Module.js"></script>
    <script src="~/Scripts/LoginController.js"></script>
}

We have to create the JavaScript login controller and add a reference to it in the scripts section, as shown above (~/Scripts/LoginController.js).

Step 4 

Add the JavaScript controller

Create an Angular module in a Module.js file and add the code given below.

(function () {
    var myApp = angular.module("myApp", []);
})();

Now, add another script file for the Angular controller and the factory method.

Right-click on the Scripts folder -> Add -> LoginController.js and write the code given below.

angular.module('myApp').controller('LoginController', function ($scope, LoginService) {
    // Initialize the user data object (the view binds to UserModel via ng-model)
    $scope.UserModel = {
        Email: '',
        Password: ''
    };
    $scope.msg = "";
    $scope.Submited = false;
    $scope.IsLoggedIn = false;
    $scope.IsFormValid = false;
    // Check whether the form is valid or not using $watch
    $scope.$watch("myForm.$valid", function (TrueOrFalse) {
        $scope.IsFormValid = TrueOrFalse; // true if the form is valid
    });
    $scope.LoginForm = function () {
        $scope.Submited = true;
        if ($scope.IsFormValid) {
            LoginService.getUserDetails($scope.UserModel).then(function (d) {
                if (d.data != null && d.data.Email != null) {
                    $scope.IsLoggedIn = true;
                    $scope.msg = "You successfully logged in Mr/Ms " + d.data.FullName;
                } else {
                    alert("Invalid credentials buddy! try again");
                }
            });
        }
    };
}).factory("LoginService", function ($http) {
    // Initialize the factory object
    var fact = {};
    fact.getUserDetails = function (d) {
        return $http({
            url: '/Home/VerifyUser',
            method: 'POST',
            data: JSON.stringify(d),
            headers: {
                'content-type': 'application/json'
            }
        });
    };
    return fact;
});

Step 5

Add a new action method in HomeController to verify the user.

public ActionResult VerifyUser(UserModel obj)
{
    DatabaseEntities db = new DatabaseEntities();
    var user = db.Users.Where(x => x.Email.Equals(obj.Email) && x.Password.Equals(obj.Password)).FirstOrDefault();
    return new JsonResult
    {
        Data = user,
        JsonRequestBehavior = JsonRequestBehavior.AllowGet
    };
}

Step 6 

Add a Model class to the solution

Right-click Models -> Add -> Class and name it UserModel.cs.

You can add the UserModel class as shown below.

namespace LoginUsingAngular.Models
{
    public class UserModel
    {
        public string Email { get; set; }
        public string Password { get; set; }
    }
}

Take a look at what ng-model, ng-show, and ng-submit mean, as used in the login.cshtml view given above.

  • ng-model: ngModel is an Angular directive used for two-way binding of data, from the view to the controller and from the controller to the view.
  • ng-show: ngShow shows or hides an element based on the expression provided to the ngShow attribute.
  • ng-submit: ngSubmit prevents the default action of the form and binds an Angular function to the onsubmit event; it is invoked when the form is submitted.
  • $dirty: this is a built-in Angular property. It is true if the user has interacted with the form, and false otherwise.

There are many other validation properties in Angular, such as $invalid, $submitted, $pristine, $valid, and $error.
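
As a quick illustration (a hedged snippet, not part of the original view), one of these properties can be used to disable the submit button until the form is valid:

<!-- Disable the submit button while the form is invalid -->
<input type="submit" value="submit" class="btn btn-success" ng-disabled="myForm.$invalid" />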
