Tag Archives: front-end

How to avoid 26 API requests on your page?

The problem

Creating applications relying on web APIs seems to be quite popular these days. There is already an impressive collection of ready-to-use public APIs (check it at http://www.programmableweb.com ) that we can consume to create mashups, or add features to our web sites.

In the meantime, it has never been so easy to create your own REST-like API with node.js, ASP.NET Web API, Ruby or whatever tech you want. It’s also very common to create your own private/restricted API for your SPA, cross-platform mobile apps or your own IoT device. The naïve approach, when building our web API, is to add one API method for every feature; at the end, we get a well-architectured and brilliant web API following the Separation of Concerns principle: one method for each feature. Let’s put it all together in your client… and it’s a drama in terms of web performance for the end user, with never fewer than hundreds of requests per second on the staging environment. Look at your page: there are 26 API calls on your home page!

I talk here about web apps, but it’s pretty much the same for native mobile applications. RTT and latency matter much more than bandwidth. It’s impossible to create a responsive and efficient application with a chatty web API.

The proper approach

At the beginning of December 2014, I attended the third edition of APIdays in Paris. There was an interesting session, among others, on Scenario Driven Design by @ijansch.

The fundamental concept in any RESTful API is the resource. It’s an abstract concept, quite different from a data resource. A resource should not be a raw data model (the result of an SQL query exposed over the Web) but should be defined with client usage in mind. “REST is not an excuse to expose your raw data model.” With this kind of approach, you will create dumb clients and smart APIs with thick business logic.

A common myth is “OK, but we don’t know how our API is consumed, that’s why we expose raw data”. Most of the time, it’s false. Let’s take the example of Twitter timelines. They are lists of tweets or messages displayed in the order in which they were sent, with the most recent on top. This is a very common feature and you can see timelines in every Twitter client. Twitter exposes a timeline API, and API clients just have to call this API to get timelines. In particular, clients don’t have to compute timelines by themselves by requesting the Twitter API many times for friends, tweets of friends, etc.
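To make the contrast concrete, here is a small sketch of both client styles. The HTTP client object `api` and the endpoint paths are invented for illustration; they are not the real Twitter API.

```javascript
// Chatty client: recomposes the timeline itself (1 + N round trips),
// duplicating the business logic in every client.
async function timelineChatty(api, userId) {
  const friends = await api.get('/users/' + userId + '/friends');
  const tweetsPerFriend = await Promise.all(
    friends.map(function (f) { return api.get('/users/' + f.id + '/tweets'); })
  );
  // flatten and sort client-side
  return [].concat.apply([], tweetsPerFriend)
    .sort(function (a, b) { return b.sentAt - a.sentAt; });
}

// Scenario-driven client: one request, the server owns the logic.
function timelineScenario(api, userId) {
  return api.get('/users/' + userId + '/timeline');
}
```

The scenario-driven version is not only one round trip instead of N+1; it also leaves the server free to change how timelines are computed without breaking clients.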

I think this is an important idea to keep in mind when designing our APIs. Generally, we don’t need to be so RESTful (what about HATEOAS?). Think more about API usability and scenarios than about RESTfulness.

The slides of this session are available here.

Another not-so-new approach: Batch requests

Reducing the number of request from a client is a common and well-known Web Performance Optimization technique. Instead of several small images, it’s better to use sprites. Instead of many js library files, it’s better to combine them. Instead of several API calls, we can use batch requests.

Batch requests are not REST-compliant, but we already know that we should sometimes break the rules to get better performance and better scalability.

“If you find yourself in need of a batch operation, then most likely you just haven’t defined enough resources.” – Roy T. Fielding, father of REST

What is a batch request ?

A batch request packs several different API requests into a single POST request. HTTP provides a special content type for this kind of scenario: multipart. On the server side, requests are unpacked and dispatched to the appropriate API methods. All responses are packed together and sent back to the client as a single HTTP response.

Here is an example of a batch request:

Request

POST http://localhost:9000/api/batch HTTP/1.1
Content-Type: multipart/mixed; boundary="1418988512147"
Content-Length: 361

--1418988512147
Content-Type: application/http; msgtype=request

GET /get1 HTTP/1.1
Host: localhost:9000


--1418988512147
Content-Type: application/http; msgtype=request

GET /get2 HTTP/1.1
Host: localhost:9000


--1418988512147
Content-Type: application/http; msgtype=request

GET /get3 HTTP/1.1
Host: localhost:9000


--1418988512147--

Response

HTTP/1.1 200 OK
Content-Length: 561
Content-Type: multipart/mixed; boundary="91b1788f-6aec-44a9-a04f-84a687b9d180"
Server: Microsoft-HTTPAPI/2.0
Date: Fri, 19 Dec 2014 11:28:35 GMT

--91b1788f-6aec-44a9-a04f-84a687b9d180
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

"I am Get1 !"
--91b1788f-6aec-44a9-a04f-84a687b9d180
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

"I am Get2 !"
--91b1788f-6aec-44a9-a04f-84a687b9d180
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

"I am Get3 !"
--91b1788f-6aec-44a9-a04f-84a687b9d180--
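To make the response structure concrete, here is a small sketch (not a real library) of how a client could split such a multipart response back into its individual parts, using the boundary announced in the Content-Type header. Real client libraries also handle quoted boundaries, preambles and status-line parsing; this only shows the principle on the example above.

```javascript
// Split a multipart/mixed body at each "--boundary" marker.
function splitBatchResponse(body, boundary) {
  return body.split('--' + boundary)
    .map(function (part) { return part.trim(); })
    // drop the empty preamble and the final "--" terminator
    .filter(function (part) { return part !== '' && part !== '--'; });
}

// Each part embeds a full HTTP response; the payload comes after the
// last blank line (part headers, then response headers, then body).
function partBody(part) {
  var sections = part.split('\r\n\r\n');
  return sections[sections.length - 1];
}
```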

Batch requests are already supported by many web frameworks and allowed by many API providers: ASP.NET Web API, a node.js module, Google Cloud Platform, Facebook, Stack Overflow, Twitter…

Batch support in ASP.NET Web API

To support batch requests in your ASP.NET Web API, you just have to add a new custom route:

config.Routes.MapHttpBatchRoute(
    routeName: "batch",
    routeTemplate: "api/batch",
    batchHandler: new DefaultHttpBatchHandler(GlobalConfiguration.DefaultServer)
);

Tip: DefaultHttpBatchHandler doesn’t provide a way to limit the number of requests in a batch. To avoid performance issues, you may want to limit it to 100/1000/… concurrent requests. You have to create your own implementation by inheriting from DefaultHttpBatchHandler.

This new endpoint will allow clients to send batch requests, and you have nothing else to do on the server side. On the client side, to send batch requests, you can use jquery.batch, batchjs, the angular-http-batcher module, …
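If you want to see what such libraries do under the hood, here is a minimal hand-rolled sketch. The /api/batch endpoint and the part format are taken from the example above; the fetch call is shown for brevity (an XMLHttpRequest version works the same way) and is not executed here.

```javascript
// Build the multipart/mixed body by hand: one embedded HTTP request per part.
function buildBatchBody(requests, boundary) {
  return requests.map(function (r) {
    return '--' + boundary + '\r\n' +
      'Content-Type: application/http; msgtype=request\r\n\r\n' +
      r.method + ' ' + r.path + ' HTTP/1.1\r\n' +
      'Host: ' + r.host + '\r\n\r\n';
  }).join('') + '--' + boundary + '--';
}

// Send everything as a single POST to the batch endpoint.
function sendBatch(requests) {
  var boundary = Date.now().toString();
  return fetch('http://localhost:9000/api/batch', {
    method: 'POST',
    headers: { 'Content-Type': 'multipart/mixed; boundary="' + boundary + '"' },
    body: buildBatchBody(requests, boundary)
  });
}
```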

I will not explain all the details here, but there is an interesting feature provided by DefaultHttpBatchHandler: the ExecutionOrder property allows you to choose between sequential and non-sequential processing order. Thanks to the TAP programming model, it’s possible to execute API requests in parallel (for truly async API methods).

Here is the result of async/sync batch requests for a pack of three web methods, each taking one second to be processed.

batch

Finally, batch requests are not a must-have feature, but it’s certainly something to keep in mind: it can help a lot in some situations. A very simple demo application is available here. Run this console app or try to browse localhost:9000/index.html. From my point of view, here are some pros/cons of this approach.

Pros:

  • Better client performance (fewer calls)
  • Really easy to implement on the server side
  • Parallel request processing on the server side
  • Allows GET, POST, PUT, DELETE, …

Cons:

  • May increase the complexity of client code
  • Hides real client scenarios; not REST compliant
  • Batch size should be limited at the server level for public APIs
  • Browser cache may not work properly


Front-end development with Visual Studio

Update (28/01/2015): My team has just released a new NuGet package that combines the latest versions of node, nogit and npm. Don’t hesitate to try it.

The emergence of single page applications has introduced a new need for web developers: a front-end build process. JavaScript MV* frameworks now allow web developers to build complex and sophisticated applications with many files (js, css, sass/less, html…). We’re very far from those 3 lines of JavaScript that put “a kind of magic” on your web site.

In traditional back-end development such as ASP.NET MVC, a compiler transforms source code written in a (human-readable) programming language (C#/VB for ASP.NET) into another computer language (MSIL for .NET). Produced files are often called binaries or executables. Compilation may involve several operations: code analysis, preprocessing, parsing, language translation, code generation, code optimization… What about JavaScript?

JavaScript was traditionally implemented as an interpreted language, which can be executed directly inside a browser or a developer console. Most of the examples and sample applications you’ll find on the web have a very basic structure and file organization. Some developers may think that it’s the “natural way” to build & deploy web applications, but it’s not.

Nowadays, we’re building entire web applications in JavaScript. Working with that many files is quite different. A few frameworks, like RequireJS, help us build modular JavaScript applications thanks to Asynchronous Module Definition. Again, this is not exactly what we need here, because it focuses only on scripts.

What do we need to have today to build a web site? Here are common tasks you may need:

  • Validate scripts with JSLint
  • Run tests (unit/integration/e2e) with code coverage
  • Run preprocessors for scripts (coffee, typescript) or styles (LESS, SASS)
  • Follow WPO recommendations (minify, combine, optimize images…)
  • Continuous testing and Continuous deployment
  • Manage front-end components
  • Run X, execute Y

So, what exactly is a front-end build process? An automated way to run one or more of these tasks in YOUR workflow to generate your production package.

vs-webapp-angular

Example front-end build process for an AngularJS application

Please note that ASP.NET Bundling & Minification is a perfect counter-example: it allows you to combine and minify styles and scripts (two important WPO recommendations) without having a build process. The current implementation is very clever, but its scope is limited to these features. By the way, I will still use it in situations where I don’t need “full control”.

I talk mainly about JavaScript here, but it’s pretty much the same for any kind of front-end file. SASS/LESS/TypeScript/… processing is often well integrated into IDEs like Visual Studio, but that’s just another way to avoid using a front-end build process.

About the process

Node.js is pretty cool, but it’s important to understand that this runtime is not only “server-side JavaScript”. In particular, it provides awesome tools and possibilities, limited only by your imagination. You can use it while developing your application without using the node.js runtime for hosting, and even without developing a web site at all. For example, I use redis-commander to manage my redis database. I personally consider that the Web Stack Wars are over (PHP vs Java vs .NET vs Ruby vs …) and I embrace one unique way to develop for the web. It doesn’t really matter which language & framework you use: you develop FOR the web, a world of standards.

Experienced & skilled developers know how important automation is. Most VS extensions, like Web Essentials, provide rich features and tools, but can rarely be used in an automated fashion. For sure, they could help a lot, but I don’t want that for many of my projects: I want something easily configurable, automated and fast.

Bower to manage front-end components

The important thing to note here is that Bower is just a package manager, and nothing else. It doesn’t offer the ability to concatenate or minify code, and it doesn’t support a module system like AMD: its sole purpose is to manage packages for the web. Look at the search page; I’m sure you will recognize most of the available packages. Why Bower and not NuGet for my front-end dependencies?

NuGet packages are commonly used to manage references in a Visual Studio solution. A package generally contains one or more assemblies, but it can also contain any kind of file… For example, the jQuery NuGet package contains JavaScript files. I have great respect for this package manager, but I don’t like to use it for file-based dependencies. Mainly because:

  • Packages are often duplicates of other package managers/official sources
  • Several packages contain too many files and you generally don’t have full control
  • Many packages are not up to date, and most of the time they are maintained by external contributors
  • When they fail, PowerShell scripts may break your project file.

Put simply, this is not how the vast majority of web developers work. Bower is a very popular tool and extremely simple to use.

Gulp/Grunt for the build process

Grunt, the self-proclaimed “JavaScript Task Runner”, is the most popular task runner in the Node.js world. Tasks are configured via a JavaScript configuration object (gruntfile). Unless you want to write your own plugin, you mostly write no code logic. After 2 years, it now provides many plugins (tasks), approx. 3500, and the community is very active. For sure, you will find everything you need here.

Gulp, “the challenger”, is very similar to Grunt (plugins, cross-platform). Gulp is a code-driven build tool, in contrast with Grunt’s declarative approach to task definition, making your task definitions a bit easier to read. Gulp relies heavily on node.js concepts, and non-Noders will have a hard time dealing with streams, pipes, buffers, and asynchronous JavaScript in general (promises, callbacks, whatever).
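To illustrate that stream-based style, here is a minimal gulpfile sketch in the gulp 3.x syntax of that period. The gulp-concat and gulp-uglify plugins are common community packages, named here only as an assumption about what your workflow might use.

```javascript
var gulp = require('gulp');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');

gulp.task('scripts', function () {
  return gulp.src('app/**/*.js')   // read sources as a stream
    .pipe(concat('app.min.js'))    // combine into one file
    .pipe(uglify())                // minify
    .pipe(gulp.dest('dist'));      // write the result
});

gulp.task('default', ['scripts']); // 'default' depends on 'scripts'
```

Each `.pipe()` call chains another transformation onto the file stream, which is exactly the part that feels natural to Noders and foreign to everyone else.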

Finally, Gulp.js or Grunt.js? It doesn’t really matter which tool you use, as long as it allows you to easily compose your own workflows. Here is an interesting post about Gulp vs Grunt.

Microsoft also embraces these new tools with a few extensions (see below) and even a new build process template for TFS released by MsOpenTech a few months ago.

Isn’t it painful to code in JavaScript in Visual Studio? Not at all. Let’s remember that JavaScript is now a first-class language in Visual Studio 2013: this old-fashioned language can be used to build Windows Store, Windows Phone, web, and Multi-Device Hybrid Apps. JavaScript IntelliSense is also pretty cool.

How do I work with that tool chain inside Visual Studio?

Developing with this rich ecosystem often means working with the command line. Here are a few tools, extensions and packages that may help you inside Visual Studio.

SideWaffle Template Pack

A complete pack of Snippets, Project and Item Templates for Visual Studio.

sidewaffle

More info on the official web site http://sidewaffle.com/

Grunt Launcher

Originally a plugin made to launch Grunt tasks from inside Visual Studio, it has now been extended with new functionality (bower, gulp).

gruntlauncher

Download this extension here (Repository)

Chutzpah

Chutzpah is an open source JavaScript test runner which enables you to run unit tests using QUnit, Jasmine, Mocha, CoffeeScript and TypeScript. Tests can be run directly from the command line or from inside Visual Studio (Test Explorer tab).

By using custom build assemblies, it’s even possible to run all your JS tests during the build process (on premises or on Visual Studio Online). All the details are explained in this post on the ALM blog.

Project is hosted here on codeplex.

Package Intellisense

Search for packages directly in package.json/bower.json files. Very useful if you don’t like using the command line to manage your packages.

package_intellisense

More info on the VS gallery

TRX – Task Runner Explorer

This extension lets you execute any Grunt/Gulp task or target inside Visual Studio by adding a new Task Runner Explorer window. This is an early preview of the Grunt/Gulp support coming in Visual Studio “14”. It’s definitely another good step in the right direction.

Scott Hanselman blogged about this extension a few days ago.

I really like the smooth integration into VS, especially Before/After Build. The main disadvantage (in this version) is that the extension requires the node.js runtime and global grunt/gulp packages. It won’t work on a workstation (or a build agent) without installing these prerequisites. Just for information, it’s not so strange to install node.js on a build agent: it’s already done for VSO agents. http://listofsoftwareontfshostedbuildserver.azurewebsites.net/

To conclude

To illustrate all these concepts and tools, I created a repository on GitHub. Warning: this can be considered a dump of my thoughts. This sample is based on the popular todomvc repository. Mixing an Angular application and an ASP.NET Web API in the same application may not be the best choice in terms of architecture, but it’s just to show that it’s possible to combine both in the same solution.

Points of interest

  • Front-end dependencies are managed via Bower (bower.json)
  • Portable NuGet package (js, Npm, Bower, Grunt, …) created by whylee, so I can run gulp/grunt tasks before the build thanks to custom wpp.targets.
  • Works locally as well as on TFS and Visual Studio Online.
  • Allows you to use the command line at the same time (>gulp tdd)

Does it fit your needs? No? That’s not a problem: just adapt it to your context; that’s the real advantage of these new tools. They are very close to developers, and building a project no longer depends on a complex build template or service.