Tip of the Day: Calculating durations with Moment

Moment has quite a nice fluent interface for some operations; others just need a little more thought.

For example, I wanted the duration of something and I had recorded the start and end time. I thought something like this, finding the difference between two dates and converting it to a duration, would work:

var duration = endTime.diff(startTime).duration().asSeconds();

However, that doesn’t work.

What you have to do is find the difference, then pass that into the duration function, like this:

var duration = moment.duration(endTime.diff(startTime)).asSeconds();

And now I get what I wanted.
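
Putting that together, here is a minimal runnable sketch (the timestamps are made up purely for illustration, and moment is assumed to have been installed with npm install moment):

var moment = require("moment");

// Made-up start and end times for illustration
var startTime = moment("2014-11-16T10:00:00");
var endTime = moment("2014-11-16T10:02:30");

// diff() returns milliseconds, which moment.duration() wraps up
var duration = moment.duration(endTime.diff(startTime)).asSeconds();
console.log(duration); // 150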

DunDDD 2014: Introduction to Node.js–From Hello World to Deploying on Azure

Thank you to those who came to my talk. As promised, here are the slides, code, and links given in the talk.

Slides and Code

The slide deck is available as a PDF file.

Links from the talk

Many slides have a link at the bottom, but if you didn’t catch them, here they are again.

Deploying a Node.js with Express application on Azure

By the end of the last post, there was enough of an application to deploy it, so let’s deploy it.

Prerequisites

  • An FTP client to get at the files on Azure.
  • A GitHub account (or other source control, but this walk through uses GitHub) to deploy to Azure.
  • And an Azure account – this walk through does not require anything beyond the free account.

Setting up Azure

Log in to Azure, then click the "New +" button at the bottom right and Quick Add a website.

When that’s okayed a message will appear at the bottom of the page.

Once the website has been provisioned it can be modified. Source control deployment can be configured from either the starter page for new websites or from the side bar on the website dashboard.

Then select the desired source control. For this example, the deployment is from GitHub.

Then choose the repository and branch for the deployment.

Then press the tick icon to confirm.

Once the Azure website and source control are linked, it will start deploying the site…

Once finished, the message will change to indicate that it is deployed.

At this point the website can be viewed. However, there are issues with it – it isn’t serving some files, as can be seen here.

What went wrong?

It is rather obvious that something is wrong. Images are not being rendered, although other things, such as the CSS, appear to be.

Examining the diagnostic tools in the browser shows that the files are simply not found. Notably, though, those 404 responses have no content.

A few blog posts ago, it was noted that if Node.js didn’t know how to handle a route then it would issue a 404 Not Found, but also it would render some content so that the browser had something to display to the user.

Here is a 404 Not Found that gets as far as Node.js:

In the browser window itself is the message that Node.js renders. It is returning a 404 status code, but it has content. Also, note that there is an X-Powered-By: Express header as well as the X-Powered-By: ASP.NET header seen in the previous example. This immediately suggests that, for the missing files, the 404 is being issued before Node.js gets a chance to deal with the request.

It is for this reason that FTP is required so that some remote administration of the site is possible.

When the site is deployed to Azure, it recognises that it is a Node.js site and will look for an entry point. Normally it looks for a file called server.js; however, it can also work out that app.js is the file it is looking for. So, ideally, the entry point into the application should be called server.js when deploying to Azure.

Azure creates a web.config for the application which has all the settings needed to tell IIS how to deal with the website. However, it is missing some bits. It does not know how to deal with SVG files, so it won’t serve them, even though the Node.js application understands that static content resides in a certain location.

The missing part of the web.config that is needed is:

    <staticContent>
      <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
    </staticContent>

Accessing the files on Azure with FTP

This example uses FileZilla as the FTP Client.

First, the credentials need to be set for accessing the site via FTP. In the Website dashboard, the side bar contains a link to “Reset your deployment credentials”.

When clicked a dialog appears that allows the username and password to be set.

Once this is filled in and the tick clicked, the details for connecting via FTP will be in the side bar on the dashboard.

These details, along with the password previously created, can be used to connect via FTP.

Once connected, navigate to the location of the web.config file, which is in /site/wwwroot, and transfer the file to the source code directory. The file can now be edited along with the rest of the source code, which means that when the application is deployed any relevant updates go out in one action, rather than requiring additional steps over FTP.

The changes to the web.config are to add the following

    <staticContent>
      <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
    </staticContent>

to the <configuration><system.webServer> section of the file.
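
In other words, once the edit is made the relevant part of the file looks something like this (the other elements that Azure generated are left out):

    <configuration>
      <system.webServer>
        ...
        <staticContent>
          <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
        </staticContent>
      </system.webServer>
    </configuration>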

Finally, add the web.config file to the repository, commit the changes to source control and push it to GitHub. It only takes a few moments for Azure to pick it up and then the portal will display a new item in the deployment history.

Refreshing the site in a browser window finally reveals the missing graphics.

The project on GitHub

This was marked as a release on GitHub as part of the Xander.Flashcards project.

Node.js with Express – Come to the dark side. We have cookies!

So far, so good. At this point the application displays a list of languages and will display the language that the user picked. However, for the flashcards to work, that selection will have to be remembered. As there is so little state, it is possible to store it in a cookie.

Express handles the creation of cookies without the need for middleware. However, in order to read back the cookies a parser is needed.
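
The cookie-parser middleware is installed from npm in the same way as the other packages used in this series:

npm install cookie-parser --save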

Setting and Removing the cookie

To set the cookie with the language that is received from the form on the page:

var language = req.body.language;
var cookieAge = 24*60*60*1000; // 1 day
res.cookie("flashcard-language",language,{maxAge:cookieAge, httpOnly:true});

In the above code, the language is set from the form value, as seen in the previous blog post. The cookieAge is set to be one day, after which it will expire. Finally, the cookie is added to the response object. It is named "flashcard-language".
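
Putting that together, a sketch of the whole route handler might look like this (the file name and the response message are illustrative assumptions; the rest comes from the snippets above and the previous post):

// routes/setLanguage.js (hypothetical file name)
module.exports = function(req, res){
    var language = req.body.language;  // value posted from the form
    var cookieAge = 24*60*60*1000;     // 1 day, in milliseconds
    res.cookie("flashcard-language", language, {maxAge:cookieAge, httpOnly:true});
    res.send("set-language to "+language);
};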

When the route is requested the HTTP response header will look something like this:

HTTP/1.1 200 OK
X-Powered-By: Express
Set-Cookie: flashcard-language=ca; Max-Age=86400; Path=/; Expires=Mon, 17 Nov 2014 23:22:12 GMT; HttpOnly
Content-Type: text/html; charset=utf-8
Content-Length: 18
Date: Sun, 16 Nov 2014 23:22:12 GMT
Connection: keep-alive

To clear the cookie, call clearCookie and pass in the name of the cookie to clear.

res.clearCookie("flashcard-language");

The HTTP Response will then contain the request for the browser to clear the cookie:

HTTP/1.1 304 Not Modified
X-Powered-By: Express
Set-Cookie: flashcard-language=; Path=/; Expires=Thu, 01 Jan 1970 00:00:00 GMT
ETag: W/"SuG3Z498eJmmc04TIciYHQ=="
Date: Sun, 16 Nov 2014 23:29:25 GMT
Connection: keep-alive

Reading in the cookie

In order to read in the cookie some middleware is required. The changes to the app.js file are:

// Requirements section
var cookieParser = require("cookie-parser");
...
// set up section
app.use(cookieParser());

And in the route function that responds to the request, the cookie can be read back like this:

var language = req.cookies["flashcard-language"];
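
In a full route handler that might look something like this (the redirect target and the message are illustrative assumptions):

module.exports = function(req, res){
    var language = req.cookies["flashcard-language"];
    if (!language) {
        // No cookie has been set yet, so send the user back to pick a language
        res.redirect("/");
        return;
    }
    res.send("The flashcard language is "+language);
};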

Code for this post

The code for this can be found here on GitHub.

Node.js with Express – Getting form data

Now that view engines are wired up and working on this application, the next area to look at is getting data back from the browser.

By default Express doesn’t do anything with form data in the request and a piece of “middleware” needs to be added to get this to work. The reason for this is that there are many ways to process data from the browser (or perhaps it is data from something that is not a browser), so it is left up to the developer how best to process that data.

The view now has a form element and a submit button. It also has an input which will contain the name of the language the user wants. This information is transmitted to the server when the user presses the submit button.
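
As a rough sketch, the form in the view looks something like this (the markup is illustrative; the important parts are the action, which matches the route set up below, and the input name, which is what the route reads from req.body):

<form action="/set-language" method="post">
    <input type="text" name="language" />
    <button type="submit">Set language</button>
</form>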

In order to read this information a piece of middleware called body-parser is added.

First, it has to be installed into the application:

npm install body-parser --save

Then the application needs to know about it. The following changes are made to the app.js file:

// In the requirements section
var bodyParser = require("body-parser");
...
// In the set up section
app.use(bodyParser.urlencoded());
...
// Set up the route as an HTTP POST request.
app.post("/set-language", setLanguage);

Since body-parser can handle a few different types of encoding the application needs to know which to expect. Browsers return form data as application/x-www-form-urlencoded, so that’s the parser that is used by the application.

There are some limitations to body-parser but it is good enough for this application. For example, it does not handle multi-part form data.

The route function can now read the body property that body-parser has populated.

module.exports = function(req, res){
    var language = req.body.language;
    res.send("set-language to "+language);
};

This will now return a simple message to the user with the language code that was set on the previous page.

Viewing the full source code

Rather than paste the source code at the end of the blog post, I’ve released the project on to GitHub. You can either browse the code there, or get a copy of the repository to examine yourself. There may be changes coming, so it is best to look for the release that corresponds with this blog post.

Express for node.js – View Engines

In the last post, Express Hello World, I talked about getting started with an Express application in Node.js. In it, the output was rendered by writing HTML directly. In most situations that is undesirable and some sort of view or template engine is better suited to rendering the output.

This post follows on directly from that, so any modifications will be based on what the end result was at the end of the last post. The final output from this post will be shown at the bottom.

There are quite a few view engines out there. A popular one for Express is Jade, but personally I find it too far from HTML to be useful. This is especially true if you work with graphic designers whose tools output HTML that the developer has to mould into a view for the application. The less rework there is, the better, in my opinion.

So, I’m going to use EJS in this post. It is a bit like the ASPX view engine in .NET applications, but it’s actually a derivative of Ruby’s ERB.

To install EJS add the "ejs" package to the application.

npm install ejs --save

Configuring the view engine

Express needs to know which view engine is going to be used; it also needs to know where the views are stored. So the following lines are needed in the app.js file from the previous post:

app.set("view engine","ejs");
app.set("views","./views");

Changing the route to render the view in the response

As Express has been told to look in a directory called "views" to get the views, the view is going to go into that directory. At this time all that is going into helloworld.ejs is the HTML that was directly sent from the route in the last post. The helloworld.ejs file now looks like this:

<h1>Hello, World!</h1>

And the route looks like this:

module.exports = function(req, res) {
    res.render("helloworld");
};

This will now give the same output as before.

So far, this has done nothing new for us. The power of views is the ability to pass data to them and for them to render it nicely for the user.

Sending data to the view

This is relatively easy. All that needs to happen is that the res.render call gets an additional parameter which contains the information.

module.exports = function(req, res) {
    res.render("helloworld",{name:"Colin"});
};

In order to render the information the view has to be changed too.

<h1>Hello, <%= name %>!</h1>

Adding a layout

EJS, out of the box, does not support layouts (or master pages, as .NET’s ASPX view engine calls them). However, there is a package that can be added that adds layout support. To install:

npm install express-ejs-layouts --save

And the changes in app.js file

// in the requirements section
var ejsLayouts = require("express-ejs-layouts");
...
// in the set up section
app.use(ejsLayouts);

And that’s it. Your application can now use layouts; by default it uses the file layout.ejs in the views folder, but this can be changed if you prefer.
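
If a different default layout is wanted, express-ejs-layouts picks the name up from a "layout" setting on the app (the file name below is purely illustrative):

// Use views/myLayout.ejs as the default layout instead of views/layout.ejs
app.set("layout", "myLayout");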

To demonstrate this, here is a layout:

<!DOCTYPE HTML>
<html>
    <head>
        <title>Express with EJS</title>
        <link rel="stylesheet" href="/css/bootstrap.css"/>
        <link rel="stylesheet" href="/css/bootstrap-theme.css"/>
    </head>
    <body>
        <div class="container">
            <%- body %>
        </div>
        <script type="application/javascript" src="/js/bootstrap.js"></script>
    </body>
</html>

The <%- body %> indicates where the body of the page should go; that is, the view that is named in the res.render function call.

Running the application at this point shows that the view with its layout is being rendered. But there is a problem: the styles are not showing up correctly.

Accessing static files

By default Express will not serve static files. It needs to be told explicitly where the static files are so that it can serve them. While this may seem a little bit of a pain compared to something like an ASP.NET MVC application, where IIS will serve anything in the folder that it recognises (and it recognises a lot), it is actually somewhat comforting that it won’t serve up files by accident.

To set up a folder to be served add the following line to the app.js file:

app.use(express.static(__dirname + '/public'));

What this does is tell Express that a directory called public contains static content and files should be served directly when requested. The __dirname is the directory where the current file is located. So what this means is that public is a directory located in the same directory as the app.js file.

What is interesting here is that the path to the static resource is a full server path. That means the public files can be anywhere on the server, or addressable by the server. The path in the URL will be translated to the full server path as needed. So that means that even though I have a directory called public in my Express application, the browser doesn’t see that directory. It only sees what is in it.

So, from the browser’s perspective the bootstrap.css file is located at /css/bootstrap.css but my application sees it as …/public/css/bootstrap.css.
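
If a URL prefix is wanted, express.static can also be mounted on a path; the /assets prefix here is just an illustration:

// Serve the contents of ./public under a virtual /assets prefix
app.use("/assets", express.static(__dirname + "/public"));
// The browser would then request /assets/css/bootstrap.css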

Summary

In this post the Hello World application was advanced with the EJS view engine, along with layout support and the ability to serve static files to the browser.

The app.js file now looks like this:

// Requirements
var express = require("express");
var http = require("http");
var ejsLayouts = require("express-ejs-layouts");
var hello = require("./routes/hello.js");

// Set up the application
var app = express();
app.set("port", process.env.PORT || 3000);
app.set("view engine","ejs");
app.use(ejsLayouts);
app.set("views","./views");
app.use(express.static(__dirname+"/public"));
app.get("/", hello);

// Run up the server
http.createServer(app).listen(app.get("port"), function(){
    console.log("Express server listening on port " + app.get("port"));
});

The routes/hello.js file now looks like this:

module.exports = function(req, res) {
    res.render("helloworld",{name:"Colin"});
};

The views/layout.ejs file looks like this:

<!DOCTYPE HTML>
<html>
    <head>
        <title>Express with EJS</title>
        <link rel="stylesheet" href="/css/bootstrap.css"/>
        <link rel="stylesheet" href="/css/bootstrap-theme.css"/>
    </head>
    <body>
        <div class="container">
            <%- body %>
        </div>
        <script type="application/javascript" src="/js/bootstrap.js"></script>
    </body>
</html>

The views/helloworld.ejs file looks like this:

<div class="row">
    <div class="col-md-12">
        <h1>Hello, <%= name %>!</h1>
    </div>
</div>

And twitter bootstrap was installed into the public directory. The project structure now looks like this:

Express for node.js walk through – Hello World

While both Visual Studio and WebStorm have templates for Node.js Express applications, in this post (and the subsequent few posts) I’m going to walk through setting up an Express application so that you can see how it fits together.

Installing Express

To start with I created a folder with just a package.json file in it, then in a command line I ran npm to install Express:

npm install express --save

Express "Hello, World!"

At this stage an app.js file is created. It will bootstrap the Express application. It will configure the environment and start the HTTP listener.

In both the WebStorm and Visual Studio templates there is a line of code that looks like this:

app.set('port', process.env.PORT || 3000);

What this is doing is working out which port to run the application on. It takes the value of the PORT environment variable and, if that is not set, defaults to port 3000. The value is stored as an application setting called "port" rather than written back to the environment, so nothing is persisted outside of the application; it really just serves to set up a default value without you having to check for a value and supply a default whenever it is actually needed.

The full application, at this point is this:

// Requirements
var express = require("express");
var http = require("http");

// Set up the application
var app = express();
app.set("port", process.env.PORT || 3000);

// Run up the server
http.createServer(app).listen(app.get("port"), function(){
    console.log("Express server listening on port " + app.get("port"));
});

The HTTP module is the bit that communicates with the outside world, so we require it for our application to work. We create the server and pass it the app, which is a function that HTTP’s request event can call. createServer then returns a server object. As we want to start listening immediately we can just call listen on the server, passing it the port we want to listen on and a callback function that will be called when the server fires the listening event to indicate that it has started listening for requests.

At this point, browsing to localhost on the defined port will just result in Express responding with a terse error message. But at least it proves that the server is listening and can respond.

It seems to be a convention, not one that is enforced by the framework, that handlers for the routes are put in a folder called routes, so a file is created called hello.js and it simply looks like this:

module.exports = function(req, res) {
    res.send("<h1>Hello, World!</h1>");
};

req is the request, and res is the response. Very simply the response can send some content back to the browser. In this case it is hand crafted HTML (and very simple at that).

Back in the app.js file, a require line is added to bring in the hello.js module, and further down the route is added to the application.

var hello = require("./routes/hello.js");
...
app.get("/", hello);

Now the application will respond to a GET request on the root URL of the application. The function exported by hello.js will be run whenever this URL is requested by the browser.

The whole app.js file now looks like this:

// Requirements
var express = require("express");
var http = require("http");
var hello = require("./routes/hello.js");

// Set up the application
var app = express();
app.set("port", process.env.PORT || 3000);
app.get("/", hello);

// Run up the server
http.createServer(app).listen(app.get("port"), function(){
    console.log("Express server listening on port " + app.get("port"));
});

Node Package Manager – npm

The Node Package Manager is a bit like NuGet for node. It is a way to get additional functionality into node for frameworks that were not bundled with node itself.

Like NuGet it has its own website where you can browse the available packages. It is at https://www.npmjs.org/ 

Installing a package

To install a package, just type the following at the command line while in your application directory:

npm install [package-name]

For example:

npm install express 

which will install Express (a web application framework very similar to Sinatra on Ruby or Nancy on .NET).

The package manager will create a directory called node_modules and put the package(s) it installs there.

Using the package in your application

As with any module, you need a require statement in your application. However, even though npm has installed the package in your application’s directory, you don’t need a path to the module like you would with your own modules within your application. A simple require("[package-name]") will do.
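
In other words, the two flavours of require look like this:

// An installed package: no path needed, npm resolves it from node_modules
var express = require("express");

// One of your own modules: a relative path is required
var myModule = require("./myModule.js");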

Modules and Source Control

Like other package managers you wouldn’t generally expect the packages themselves to be checked into source control. You can put a file called package.json in the root of your application to define how your application is configured. It contains some basic metadata about the application and can also include the packages that the application relies on. Full details can be found here: https://www.npmjs.org/doc/files/package.json.html. If that is a bit much and you want a quick overview, there is also a cheat sheet available.

At its most basic, you want to have a package.json with at least the opening and closing braces. This is enough for npm to update the file with the details of the package it is installing. An empty file will just cause an error.

To ensure that the packages get saved to the package.json file remember the --save switch on the command line. e.g.

npm install express --save
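
After running that, npm adds a dependencies section to the package.json; it ends up looking something like this (the version range shown is just an illustration of what npm writes):

{
  "dependencies": {
    "express": "^4.10.0"
  }
}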

So, now your application can be committed to source control without having to add the packages in too. You can add the node_modules folder to your .gitignore file (or whatever your source control system requires).

When retrieving the application from source control or getting an update, you can run the install command on its own without any other parameters to tell the node package manager to read the package.json file and install all of the dependencies it finds. Like this:

npm install

Working with Modules in node.js

In my last post on node.js I showed how to set up a development environment to work with node. Now, I’m going to introduce some bits and pieces to get going a bit further than a simple hello world.

Require

First off, unless you want to be working with JavaScript files that are thousands of lines long, you’ll want to modularise your code into smaller components that you can bring in to other JavaScript files as required. From the API Documentation: “Node has a simple module loading system. In Node, files and modules are in one-to-one correspondence.”

The require function allows you to do this. You must also remember that require returns an object that represents the module, so you need to assign it to a variable.

Here is an example, showing you how to require the readline module so that you can ask questions at the command line.

var readline = require("readline"),
    rlInterface = readline.createInterface({
        input: process.stdin, 
        output:process.stdout});

rlInterface.question("What is your name? ", function(answer){
    console.log("Hello, "+answer+".");
    rlInterface.close();
});

And if you run the application, it will look something like this:

$ node whatsYourName.js
What is your name? Colin
Hello, Colin.

Exports

The above is great if you want to include a node module. But if you are building your own, then there is a little more work to be done.

In your module you need to add module.exports to indicate what is being exported from the module. Since it is just a bag of properties you can create whatever you need for the module.

e.g. This is the code for myModule.js

module.exports.someValue = "This is some value being exported from the module";
module.exports.someFunction = function(a, b){
    return a+b;
};

And the consumer of that module (moduleConsumer.js):

var myModule = require("./myModule.js");
console.log("This is someValue: "+myModule.someValue);
console.log("This is the result of someFunction: ", myModule.someFunction(2,3));

Which produces the output:

$ node moduleConsumer.js
This is someValue: This is some value being exported from the module
This is the result of someFunction:  5

One thing you might notice that is different from before is that when you require code that is in your project then you need to specify the path to the JavaScript file, even if it is in the same folder. So for files in the same folder you need to prefix the name of the JavaScript file with “./“.

If you have various files that need to include a specific module, then the first call to require will create the module, then each call after that will get a cached version of the module. So, if you put in a log statement at the top of your module to say that it is being loaded, then that log statement will only be run once regardless of the number of times you require that module.
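
A quick way to see this in action is a module whose only job is to log when it loads (the file names are hypothetical):

// cachedModule.js
console.log("cachedModule is being loaded");
module.exports = {};

// consumer.js
require("./cachedModule.js"); // logs "cachedModule is being loaded"
require("./cachedModule.js"); // logs nothing; the cached module is returned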

Requiring a folder

You can require a folder rather than a specific file. In this case, what node does is look for a file in the folder called package.json, then for index.js.

package.json is a file containing a piece of JSON that describes the module, e.g.

{ "name" : "my-library",
  "main" : "./lib/my-library.js" }

The “main” property names the JavaScript file that is the entry point into the module and bootstraps the rest.

If it finds an index.js file then that file is used as the entry point that bootstraps the rest of the folder.
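
So an index.js at the root of the folder might do nothing more than hand off to the real implementation (the paths are hypothetical):

// my-library/index.js
module.exports = require("./lib/my-library.js");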

Node.js – Setting up a basic development environment

Node.js is a platform for executing JavaScript outside of the browser. It runs on the Google V8 engine which effectively means it is cross platform as it can run on Windows, Mac OS X, and Linux.

Since JavaScript written for Node.js runs on a single platform, the issues that plague browser-based JavaScript do not occur on Node.js. There are no “Browser Compatibility” issues. It simply runs ECMAScript 5.

Installing

To install Node.js, download the installer from the node.js homepage. The “Install” button should give you the correct installer for the platform you are on, but if not, or if you are downloading for a different platform, you can get a list of each download available, including source code.

One thing I really like about the Windows installer is that it offers to add Node.js to the PATH – something that more developer tools should do.

Writing JavaScript for Node.js

While you can use a basic text editor, or even a more sophisticated one, IDEs are also available for writing Node.js applications.

IDE – Node.js Tools for Visual Studio

There is a plug-in for Visual Studio, in beta at the time of writing, which allows you to create Node.js projects. This is very useful if you are doing cross-platform development interacting with .NET based applications, or if you are simply used to working with Visual Studio. I like this option because I’ve been working with Visual Studio since version 2.1 (which came out in the early-to-mid-90s).

To get the extension for the Node.js Tools for Visual Studio, go to http://nodejstools.codeplex.com/. If you don’t want the default download (which is for the latest version of Visual Studio), go to the downloads tab and select the appropriate download for your version of Visual Studio.

In the future, I would expect it to be available directly in Visual Studio through the Tools->Extensions and Updates… manager.

Once installed, you can create a new node.js project in the same way you’d create any new project. The Extension will have added some extra options in to the New Project dialog.

You can set break-points, examine variables, set watches and so on. The following is an example of a very simple Hello World application with a break point set and the local variables showing up in the panel at the bottom.

IDE – JetBrains WebStorm

If you don’t already have Visual Studio it may be an expensive option. JetBrains WebStorm is another IDE that allows you to create Node.js applications with a similar set of features to the Visual Studio extension above. Personally, I find WebStorm a little clunky, but it works across Windows, Mac OS X, and Linux. If you don’t already have access to Visual Studio, it is a much less expensive option too.

Text Editors

Besides IDEs you can always use text editors. Some are more advanced than others.

The text editor I hear most about these days is Sublime, which can be more expensive than WebStorm depending on the license needed. Sublime is a very nice looking text editor and it has its own plug-in system so it can be extended, but for roughly the same money you could get a fully featured IDE.

Summary

I’ve not really gone all that much into the text editors available that support developing JavaScript with Node.js because I don’t feel they really add much. A fully featured IDE with refactoring support is much more important to me. Maybe installing ReSharper corrupted me.