Phaser Starter Project for WebStorm and Chrome: Modular TypeScript, Code Completion, and Live Debugging

Game in Action

DEMO | CODE

Getting started with TypeScript and Phaser on OSX and Ubuntu has been a tricky proposition. My requirements for a development environment were as follows:

  • Use TypeScript as the primary language
  • Break the project into small files – preferably one module or class per file
  • Have all the benefits of TypeScript utilized by my IDE, namely code completion, error checking, and documentation for function signatures
  • File watchers that generate javascript and source maps on file save
  • Live debugging of TypeScript files

Thanks to the latest WebStorm (6.0.2 v129.541 as of this writing), Require.js, and Chrome DevTools, I have a great development environment set up to create Phaser games on OSX and Linux. Here’s how I did it:

First, grab the starter project from my GitHub account (thanks to Jesse Freeman for the inspiration).

git clone git@github.com:ericterpstra/phaser-webstorm-template.git

Then get the following four files from the Phaser ‘build’ folder:

  • phaser.js or phaser.min.js
  • phaser.d.ts
  • phaser-fx.js or phaser-fx.min.js
  • phaser-fx.d.ts

Copy those files to /phaser-webstorm-template/lib/

Then download require.js and put it in the same lib folder.

The contents of the lib folder

Now start WebStorm 6 (get the EAP version here). Close the existing project if it is open, and go to the welcome screen. Choose Create New Project from Existing Files. Then choose Source files are in a local directory, no Web server is yet configured and click Next.

A file selection window should appear. Find the phaser-webstorm-template folder, select it, and click the Project Root button. Then click Finish.

Find the phaser folder and make it the project root.

When the project finishes loading, open game/game.ts. After a few seconds, WebStorm should prompt you to add a ‘TypeScript’ file watcher. Click the Add Watcher link.

Click ‘Add Watcher’ when prompted.

Once the watcher is added, go to the WebStorm menu (File menu in Linux) and open Preferences…. Click File Watchers under the Project Settings section, and double-click the TypeScript entry. When the Edit Watcher window appears, add the following text to the Arguments: field:

 --target es5 --module amd
Edit the File Watcher settings.

Click OK and return to the code editor. Open game/objects/Player.ts and also game/game.ts if it is not open already. Manually save both files to trigger the file watcher and regenerate the javascript and source map for each file. If this was successful, the .js and .js.map files will appear underneath their respective .ts files, and can be viewed by clicking the expand arrow (see screenshot below).

Files
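For reference, the JavaScript the watcher generates from a TypeScript class compiled with --target es5 --module amd is wrapped in an AMD define() call, roughly like this (a simplified sketch of the compiler output, not the exact generated file):

// Rough shape of the AMD JavaScript emitted for an exported TypeScript class (sketch only)
define(["require", "exports"], function (require, exports) {
    var Player = (function () {
        function Player(game) {
            this.game = game; // property assignment shown for illustration
        }
        return Player;
    })();
    exports.Player = Player;
});

This AMD wrapper is what lets Require.js load the compiled modules in the browser.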

You should now be able to use WebStorm and have lots of assistance for TypeScript. Code completion, method signatures, documentation, error reporting, and all the other benefits of WebStorm should now be available.

Code hinting should be working in .ts files.

For live debugging of code, open Google Chrome and point it to http://localhost:63342/phaser-webstorm-template/index.html and open the Chrome DevTools (Cmd-Alt-i on OSX, F12 on Linux). Click the Settings button (the little gear icon in the bottom-right corner of the DevTools window). Check the box next to Enable source maps in the General section.

Enable source maps.

Now, from the Sources tab, you should be able to open the Player.ts and game.ts files and set breakpoints and watch expressions and step through the code one line at a time! Awesome, right?

Chrome Debugging

That’s basically it. WebStorm apparently has TypeScript debugging built in as well, but I’ve had trouble getting it to work reliably. Once I do, I’ll update this post with info on live debugging from within WebStorm as well. Also in the works is a screencast and more in-depth explanation of the actual code within the project itself. Stay tuned.

Special thanks to Photon Storm for the Phaser framework, Jesse Freeman for the original Phaser project template, and Luke Hoban for help debugging my modular TypeScript.

Ghost: Just A Blogging Platform: Of the Future

Starting a blog is not difficult. Any Joe Schmoe can start plopping ideas into the public realm with just a few mouse clicks. There are many, many tools available to get a blog rolling: WordPress has dominated the self-hosted blogging landscape for years, Tumblr is ultra-popular with certain crowds, and for some reason people still use Blogger. The landscape is changing, though. Hosted blogging services like Tumblr and Posterous are traded like horses, with users left in the dust in some cases. Also, the WordPress feature set has vaulted beyond blogging tools, and is quickly becoming a full-blown content management system capable of building just about any type of website.

So what is the average blogger to do? Risk investing time and energy into a blogging service that may wind up as a minor acquisition of AOYahooSoft? Or dig through the complexity of WordPress’s ever expanding administration menus? How about neither? A new kid on the block just showed up with two fistfuls of awesome…

Ghost.

Ghost is a new open source blogging platform dedicated solely to dead-simple publishing. Its goal is to let writers write, and have fun doing it. No crazy hierarchy of menus to wade through, no finicky WYSIWYG editor mangling HTML, nothing to worry about except creating delightful posts. Every feature going into the initial release of Ghost will exist to support the prime directive of presenting a sensible, comfortable, and useful writing environment for the web.

The underlying technology powering Ghost will be an example of simplified elegance. NodeJS will act as the foundation, and the Express framework, JugglingDB ORM, and Handlebars template system will provide the scaffolding. This means that 100% of the Ghost platform will be written in a single language – Javascript.

The benefits of using one language for the full stack are obvious. I, personally, would love to focus my attention on one language when creating an application. But there are naysayers out there who would argue that NodeJS is just not the right tool for building a blogging platform, and Ghost would be better served by established frameworks like Rails, Django, or any number of PHP packages. This attitude is a bit short-sighted, in my opinion. If 37Signals thought like this, Basecamp would have been written in PHP, and Ruby might still be an esoteric language on the fringes of the web development world. It is projects like Ghost that will bring Node from marginal to mainstream.

We’ll have to wait a few months, though, as Ghost is still in its infancy. As of this writing, the Kickstarter campaign is reaching its end and is getting close to the stretch goal (after eclipsing the original goal 7 times over). Backers (including yours truly) will get early access this Summer, and the general public will get their hands on it in November. Not only will the code be open source, but a non-profit organization is being formed as we speak to provide reasonably priced blog hosting a-la WordPress.com. Because the service is formed under a non-profit, the risk of acquisition is minimal.

There is a good amount of promotional material available on the Kickstarter page, and the official website. I highly encourage taking a look, especially for anyone with even a passing interest in web publishing. Personally, I’m betting on Ghost as the next big thing, and in five years this post will be proof that I totally called it. Internet hipsters, eat your hearts out.

A simple require.js setup (with canvas and Hammer.js)

I have been reading a few tutorials on HTML5’s canvas tag, and decided to give it a whirl. Rather than bunch everything together in a huge javascript file, like most of the examples I read, I wanted to split files into sensible chunks of code. The RequireJS project has crossed my path on more than one occasion, so I figured it might be a good thing to try. I figured correctly, as it was super easy to set up, and works swimmingly.

There are plenty of great resources out there to learn all about modular javascript, as well as an interesting debate on the proper method of defining modules, but the RequireJS site itself has enough info to get a basic project off the ground. I had to do a bit of experimenting on my own to clear things up in my head, and thought I would post the results here.

The project basically draws a canvas that takes up the entire browser window (and redraws itself when the window is resized). A white dot is drawn on the screen and floats towards the bottom-right corner until the user swipes a finger (using a mobile device) or the mouse pointer (on a lap/desktop). The dot will switch directions to match the swipe. Try it out! I had intended to get the dot to do more interesting things, but time gets away from me.

Only index.html sits in the project root; the rest of the folder structure looks like this:

Screenshot: the project folder structure

Click here for full source

Notice that the only file in the root is index.html. Also notice that the only javascript file referenced in index.html is src/require.js, which has a data-main attribute stuck in there, as well. That data-main attribute references the entry point to the application. RequireJS will automatically append ‘.js’ to the filename, and will run whatever is in that file.

At the top of catToy.js is the following code:

requirejs.config({
  baseUrl: 'src',
  paths: {
    lib: '../assets/lib'
  },
  shim: {
    'lib/Hammer': {
      exports: 'Hammer'
    }
  }
});

This bit of code tells RequireJS what’s what. The baseUrl is a relative filepath that will act as the root for all other file paths referenced in the config block. The paths object is a list of aliases (shortcuts) to other subdirectories that contain code. The key (lib) is the alias name, and the value (‘../assets/lib’) is the file path relative to the baseUrl.

The shim object allows you to define modules that originally weren’t intended to be used with RequireJS. In this case, the Hammer library is referenced, and will export a ‘Hammer’ module that can be used in the project.

Underneath the config block is a function that acts as the main starting point of the application. The first two lines are the most important, in terms of RequireJS.

requirejs(["modules/Stage","modules/Mover","modules/Vector2D"],
  function( Stage,  Mover, Vector2D) {
    ...// application code goes here
});

This is basically a function call that says, “Hey, RequireJS!, do this! And don’t forget to load these three modules before you do it!”

The three modules in question are Stage, Mover, and Vector2D. The first parameter of the requirejs() function is an array of modules that are needed in the function that is passed in as the second parameter. RequireJS will look up the modules, and pass them into the anonymous function, using whatever names you define.

Each of the modules used here is set up to be specifically compatible with RequireJS using a function called define(). The first line of each module is very similar to that of catToy.js:

define(function(){
  ...// code goes here
});

Yeah, that’s it. Just wrap your code in the define() function and it can be used as a RequireJS module. If the module needs to use code from another module, then pass in an array of module names, just like before.

define( ['lib/Hammer'], function(Hammer) {
  ...// code goes here
});

Live Updates in CodeIgniter with Socket.IO and Redis

CODE | DEMO*

UPDATE: This is the third of a three part series on CodeIgniter, Redis, and Socket.IO integration. Please be sure to read the other posts as well.
Part 1: A Sample CodeIgniter Application with Login And Session
Part 2: Use Redis instead of MySQL for CodeIgniter Session Data
Part 3: You are here.

Well here it is, folks, the moment I’ve been waiting for :) Real-time updates that actually work! Socket.IO has been integrated to work its magic alongside CodeIgniter. Now when a user is logged in, updates will tumble in like an avalanche with absolutely no refreshes or any user intervention at all! To catch up on what this is all about, be sure to check out A Sample CodeIgniter Application with Login And Session and Use Redis instead of MySQL for CodeIgniter Session Data before reading this.

Take a look at the video above to see the live updates in action. There are three different browser windows open (Chrome, Firefox, and IE9), each logged in with a different user. The top left browser is the Admin, which will get updated when any user posts a message. The Admin is also a member of team one, so when the admin posts a message, members of team one (bottom left) will see it. The browser on the right is team two, so she will not see anyone else’s posts, but the Admin will see what she posts.

* The demo may or may not be working depending on the state of my VPS. Most of my sample projects so far have been hosted on my shared hosting account, but due to the Redis and Node requirement I had to deploy this project to a VPS that I normally use for development and testing. If I am doing some development or testing, then Apache or Node or Redis might not be working properly – hence the video. Your best option, really, is to download the code and try it yourself!

The Socket.IO library makes it painfully easy to work with NodeJS (aka, Node). Prior to this project I knew almost nothing about Node other than what I had read in a few short books. I still don’t know much about Node, but I know that I like it and will keep investigating it for future projects. One thing in particular that I think is pretty cool in this project is that all of the Node-specific functionality (the real-time message updates) runs mostly parallel to the PHP application. So if Node decides to blow up, the application will still work, only without live updates.

Anyway, enough jibber jabber. Here’s the rundown on the changes to the application, and the highlights of the Socket.IO and Node code. Again, this is not meant to be a tutorial, but rather a show and tell and perhaps a nice piece of code for others to experiment with. Use at your own risk.

Installing Socket.IO

First things first: Install Socket.IO. I already had Node and NPM installed, but I needed a spot to create the node ‘server’ within my project. I created a folder in the project root called ‘nodejs’. Through the wonders of the Node Package Manager (NPM), I installed the socket.io package, as well as the node_redis package. This was done by simply navigating to the nodejs folder and running:

npm install socket.io
npm install redis

Yeah, that’s it. NPM will create a folder called ‘node_modules’ and download all the dependencies necessary to run the packages. After that I created the cisockServer.js file, and I was off to the races.

To get things rolling right away, I added code similar to the following. The first line instantiates Socket.IO and gets a Node server up and running on port 8080. An event handler is created for the ‘connection’ event, which fires whenever a Socket.IO client connects to the server. When this happens, Socket.IO will emit an event with an identifier of ‘startup’ and attach an object with a message. If the client is listening for the ‘startup’ event, it will receive the message.

var io = require('socket.io').listen(8080);

io.sockets.on('connection', function (socket) {
  // Let everyone know it's working
  socket.emit('startup', { message: 'I Am Working!!' });
});

To actually get the Node server fired up, it’s as easy as running node --debug cisockServer.js. I added the --debug option because I was using node-inspector for debugging and tracing. There is also an interesting project called Forever available from Nodejitsu that manages one or more Node processes. I use it on my VPS. It’s nice.

The Socket.IO Client

A server is all fine-and-dandy, but it won’t do much good without clients. I created a new javascript file in my CodeIgniter assets folder called ‘socket.js’ to hold all of my Socket.IO related client code. I wrapped most of the code in the MY_Socket namespace so it is more easily accessed by the javascript in main.js. The minimum amount of code needed to work with the server code above is what follows. It simply creates a socket connection, then listens for the ‘startup’ event from the socket connection. When the ‘startup’ event occurs, the attached message will be displayed on the console.

$(function(){

  window.MY_Socket = {

    // Instantiate the Socket.IO client and connect to the server
    socket : io.connect('http://localhost:8080'),

    // Set up the initial event handlers for the Socket.IO client
    bindEvents : function() {
      this.socket.on('startup',MY_Socket.startupMessage);
    },

    // This just indicates that a Socket.IO connection has begun.
    startupMessage : function(data) {
      console.log(data.message);
    }
  } // end window.MY_Socket

  // Start it up!
  MY_Socket.bindEvents();
});

To get this working, all that is needed are a couple more lines in header.php:

<script src="<?php echo base_url();?>/nodejs/node_modules/socket.io/node_modules/socket.io-client/dist/socket.io.min.js"></script>
<script src="<?php echo base_url();?>/assets/js/socket.js"></script>

Now upon visiting the login screen (or any screen, really), the words “I Am Working!!” will appear in the console.

Joining A Room

Socket.IO has a concept called rooms, which is a nice way to segment broadcasted messages from a server so only a subset of users receive the message. In this application, users will join a room based on their team numbers. Team 1 will join room 1, and so on. The exception here is admins. Admin users can be on a team, but will receive messages from all users regardless of their team. To handle this, I created another room called ‘admin’.
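In Socket.IO terms, joining and broadcasting to a room boils down to a couple of calls. Here is a generic illustration of the concept (not the app’s code, which appears further down):

// Generic room usage with the Socket.IO server -- concept only, not the app's code
var io = require('socket.io').listen(8080);

io.sockets.on('connection', function (socket) {
  // Add this connection to a room named '1'
  socket.join('1');

  // Send an event to everyone in room '1' except the sender
  socket.broadcast.to('1').emit('someEvent', { message: 'hello, team one' });
});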

The room joining process starts when a user is logged in. I added a bit of jQuery code to check the current page for team badges, and if any are found, run the joinRoom function. Another way to do this would be to just put a call to MY_Socket.joinRoom() at the top of /application/views/main.php so it runs whenever the main view is loaded.

init: function () {
  ...
  if($('.userTeamBadge').children().length > 0){
    MY_Socket.joinRoom();
  }
}

So the joinRoom function does some interesting things. First it grabs the cookie named “ci_session” and reads its value. This value is the session ID set by CodeIgniter. This ID will be used to look up some of the other session information stored in Redis by the application’s MY_Session class. When the session ID is obtained, a ‘joinRoom’ event is emitted with the ID attached. If no session ID is found, then nothing happens. The code below is part of the client code in the socket.js file.

joinRoom : function(){
  // get the CodeIgniter sessionID from the cookie
  var sessionId = readCookie('ci_session');

  if(sessionId) {
    // Send the sessionID to the Node server in an effort to join a 'room'
    MY_Socket.socket.emit('joinRoom',sessionId);
  } else {
    // If no sessionID exists, don't try to join a room.
    console.log('No session id found. Broadcast disabled.');
    //forward to logout url?
  }
}
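The readCookie function used here is just a plain cookie-reading helper; it is not part of Socket.IO or jQuery. A minimal sketch of such a helper (the project’s actual implementation may differ):

// Sketch of a simple cookie reader -- not necessarily the project's exact helper
function readCookie(name) {
  var match = document.cookie.match(new RegExp('(^|;\\s*)' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[2]) : null;
}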

Socket.IO will be listening for the ‘joinRoom’ event on the server. When it hears the event, it will grab the session ID, use it to construct a string that matches the corresponding session key in the Redis database, and get the data associated with that key. The results returned from Redis will contain the rest of the user’s session information, including teamId and isAdmin (indicating whether or not the user is an admin). The result is parsed as JSON, and the teamId and isAdmin values are used to join the appropriate ‘rooms’.

For any of this to work, a Redis client must be set up to execute Redis commands. The following code is in cisockServer.js.

// Start up a Node server with Socket.IO
var io = require('socket.io').listen(8080);

// Let Node know that you want to use Redis
var redis = require('redis');

// Listen for the client connection event
io.sockets.on('connection', function (socket) {
  // Instantiate a Redis client that can issue Redis commands.
  var rClient = redis.createClient();
 
  // Handle a request to join a room from the client
  // sessionId should match the Session ID assigned by CodeIgniter
  socket.on('joinRoom', function(sessionId){
    var parsedRes, team, isAdmin;

    // Use the redis client to get all session data for the user
    rClient.get('sessions:'+sessionId, function(err,res){
      console.log(res);
      parsedRes = JSON.parse(res);
      team = parsedRes.teamId;
      isAdmin = parsedRes.isAdmin;

      // Join a room that matches the user's teamId
      console.log('Joining room ' + team.toString());
      socket.join(team.toString());

      // Join the 'admin' room if user is an admin
      if (isAdmin) {
        console.log('Joining room for Admins');
        socket.join('admin');
      }
    });

  });
});

Excellent.

Send and Receive Messages

When a user posts a new message, the data is sent to the web server using jQuery’s Ajax function. This happens in the App.postMessage function in the main.js file. If the post is successful, a callback function – App.successfulPost – is executed. In order for the post to be successful, it needs to be processed by the CodeIgniter controller method responsible for saving posts to the database. This method – main.post_message – had to be refactored so that it would not only save the message, but also respond to jQuery’s ajax request with the message wrapped up in the HTML template so it can be sent out to other users.

The HTML template responsible for rendering each individual message was separated out into its own view and saved as /application/views/single_post.php. It was basically just cut and pasted from the main view.

<div class="otherPost well">
  <div class="otherAvatar">
    <img src="../../assets/img/avatars/<?php echo $avatar ?>.png"
         alt=""
         data-title="<span class='badge badge-info'><?php echo $teamId ?></span> <?php echo $firstName ?> <?php echo $lastName ?>"
         data-content="<?php echo $tagline ?>">
  </div>
  <div class="otherPostInfo">
    <div class="otherPostBody"><p><?php echo $body ?></p></div>
    <hr/>
    <div class="otherPostDate"><p class="pull-right"><?php echo $createdDate ?></p></div>
  </div>
</div>

In order to populate that template, CodeIgniter’s Loader.view method was used with the third parameter set to ‘true’ so it will return data instead of immediately rendering the view in the browser. The view is then loaded into the response data as a string, along with the user’s teamId value, and the HTML string that will be prepended to the user’s own message list. The following code is from /application/controller/main.php (the Main controller).

function post_message() {
  ... save message to db code ...

  if ( isset($saved) && $saved ) {
    // Gather up data to fill the message template
    $post_data = array();
    $post_data = $this->user_m->fill_session_data($post_data);
    $post_data['body'] = $saved['body'];
    $post_data['createdDate'] = $saved['createdDate'];

    // Create a message html partial from the 'single_post' template and $post_data
    $broadcastMessage = $this->load->view('single_post',$post_data,true);

    // Create an HTML snippet for the user's message table.
    $myMessage = "<tr><td>". $saved['body'] ."</td><td>". $saved['createdDate'] ."</td></tr>";

    // Create some data to return to the client.
    $output = array('myMessage'=>$myMessage,
                    'broadcastMessage'=>$broadcastMessage,
                    'team'=>$post_data['teamId']);

    // Encode the data into JSON
    $this->output->set_content_type('application/json');
    $output = json_encode($output);

    // Send the data back to the client
    $this->output->set_output($output);
  }
}

The response object is sent back to the jQuery callback function, and it begins the process of broadcasting the message out to all the appropriate users. This really only takes one extra line of code in App.successfulPost.

successfulPost : function( result ) {
  ...
  // Send socket.io notification
  MY_Socket.sendNewPost( result.broadcastMessage, result.team );
}

All this does is send two pieces of information to the MY_Socket.sendNewPost function. The MY_Socket.sendNewPost function will simply take the message and teamId value and send it to the Node server by emitting a Socket.IO event.

sendNewPost : function(msg,team) {
  MY_Socket.socket.emit('newPost',msg,team);
}

When the ‘newPost’ event is handled on the server, it will relay the message to the appropriate team room, and also to the ‘admin’ room.

socket.on('newPost', function (post,team,sessionId) {
  console.log('Broadcasting a post to team: ' + team.toString());

  // Broadcast the message to the sender's team
  var broadcastData = {message: post, team: team};
  socket.broadcast.to(team.toString()).emit('broadcastNewPost',broadcastData);
 
  // Broadcast the message to all admins
  broadcastData.team = 'admin';
  socket.broadcast.to('admin').emit('broadcastNewPost',broadcastData);
});

The ‘broadcastNewPost’ event is emitted twice, and will therefore be handled twice by the client. This is not normally a problem, unless there is an admin with the same teamId as the sender. Then the admin will receive the message twice, and duplicate messages will be displayed on the screen. To correct this, a little logic prevents the message from being displayed twice. The message attached to the ‘broadcastData’ object is forwarded to the App.showBroadcastedMessage function.

// on 'broadcastNewPost' update the message list from other users
updateMessages : function(data) {
  // Because the message is broadcasted twice (once for the team, again for the admins)
  // we need to make sure it is only displayed once if the Admin is also on the same
  // team as the sender.
  if( ( !userIsAnAdmin() && data.team != 'admin') ||
      ( userIsAnAdmin() && data.team === 'admin') ){
    // Send the html partial with the new message over to the jQuery function that will display it.
    App.showBroadcastedMessage(data.message);
  }
}
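The userIsAnAdmin() helper referenced above is not shown in the project snippets here. One possible implementation, purely hypothetical, is to check for an element that only appears for admins; the actual app may expose the admin flag some other way:

// Hypothetical helper -- the real app may determine admin status differently
function userIsAnAdmin() {
  // '.adminControls' is an assumed admin-only element, used only for illustration
  return $('.adminControls').length > 0;
}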

When the App.showBroadcastedMessage function receives the message, it appends it to the top of the list of messages from other users using simple jQuery.

showBroadcastedMessage : function(messageData) {
   $(messageData).hide().prependTo(App.$otherMessages).slideDown('slow');
   //App.$otherMessages.prepend(messageData);
   App.setElements();
   App.setupComponents();
 }

Whew. That’s it! The journey is complete.

Use Redis instead of MySQL for CodeIgniter Session Data

UPDATE: This is the second of a three part series on CodeIgniter, Redis, and Socket.IO integration. Please be sure to read the other posts as well.
Part 1: A Sample CodeIgniter Application with Login And Session
Part 2: You are here.
Part 3: Live Updates in CodeIgniter with Socket.IO and Redis

In my effort to add awesome real-time live updates to a plain ol’ CodeIgniter application, I decided to move the session information usually stored in a database table to a key-value store – namely, Redis. Not only does this alleviate some load from the MySQL database, but it also provides an easy way to expose user session data to a NodeJS server. This will be important in the future when adding Socket.IO.

The process for converting my existing application to use Redis rather than MySQL was painfully simple. It was pretty much just a handful of step-by-step instructions via the command line, and voila! PHP and Redis! BFFs 4Ever!

Here were the basic steps I followed:

1. Install Redis

Simply follow the instructions in the Redis quickstart guide. These instructions were written for Linux, and should work on most distributions. They might work on a Mac, but I’m not sure. Windows… you’re on your own. Below is the tl;dr version with just the commands.

wget http://download.redis.io/redis-stable.tar.gz
tar xvzf redis-stable.tar.gz
cd redis-stable
make

cd src
sudo cp redis-server /usr/local/bin/
sudo cp redis-cli /usr/local/bin/

sudo mkdir /etc/redis
sudo mkdir /var/redis

sudo cp utils/redis_init_script /etc/init.d/redis_6379
sudo cp redis.conf /etc/redis/6379.conf
sudo mkdir /var/redis/6379

From the Redis Quickstart Guide:

Edit the configuration file, making sure to perform the following changes:

  • Set daemonize to yes (by default it is set to no).
  • Set the pidfile to /var/run/redis_6379.pid (modify the port if needed).
  • Change the port accordingly. In our example it is not needed as the default port is already 6379.
  • Set your preferred loglevel.
  • Set the logfile to /var/log/redis_6379.log.
  • Set the dir to /var/redis/6379 (very important step!).

sudo update-rc.d redis_6379 defaults
/etc/init.d/redis_6379 start

Boom. Done.

2. Install phpredis

PHP needs an extension to talk to Redis. There are a few options for this listed on the Redis site. I went with phpredis by Nicolas Favre-Felix, mainly because it is required for the MY_Session class used below. It’s also a great project that is still updated frequently.

To install, follow the directions in the README for your system. The default directions worked just fine for me.

git clone https://github.com/nicolasff/phpredis.git
cd phpredis
phpize
./configure [--enable-redis-igbinary]
make && make install

Then add extension=redis.so to /etc/php5/apache2/php.ini and /etc/php5/cli/php.ini (if you are using Apache on Linux). Finally, restart Apache.

Done, son.

3. Configure CodeIgniter to use Redis for session data storage

Finally, we just drop one class into the application’s libraries folder, and add a couple of lines to the config file to get everything working.

Really. That’s it. I’m not joking. It’s that easy.

Grab the MY_Session.php class from Ming Zhou’s gist: https://gist.github.com/zhouming/3672207 and drop it into /application/libraries. CodeIgniter will automatically use it as a subclass of Session. Most of the essential functions are overridden by MY_Session so that Redis stores the session data used to authenticate user sessions while the application is running. The only thing left is to add the following two lines to /application/config/config.php:

$config['redis_host'] = 'localhost';
$config['redis_port'] = '6379';

Obviously you will need to change ‘localhost’ to the IP address of your Redis server if it is on a different machine from your application.

And that’s it! The application is Redis ready!

Note: I’ve added the MY_Session class and some installation notes to the ci_sock project in my GitHub repo, just in case the links become broken or you want an easy reference.

A Sample CodeIgniter Application with Login And Session

Code Igniter App Screenshot

CODE | DEMO

UPDATE: This is the first of a three part series on CodeIgniter, Redis, and Socket.IO integration. Please be sure to read the other posts as well.
Part 1: You are here.
Part 2: Use Redis instead of MySQL for CodeIgniter Session Data
Part 3: Live Updates in CodeIgniter with Socket.IO and Redis

I was assigned to work on a PHP project a few weeks back that utilized the CodeIgniter framework. I’ve used MVC frameworks in the past in other languages, and because CodeIgniter does not deviate too far from common MVC patterns, it was pretty easy to pick up. To get myself up to speed, I put together a sample project based on what I learned from the documentation and a few tutorials. It’s a small microblogging app with very basic user auth and CRUD functionality. The basics of it go like this…

  • User login screen with basic password authentication.
  • Each user is assigned a ‘team’ number. Users can only view posts from their teammates.
  • An admin can view all posts from all users.
  • Each user has a ‘tagline’.
  • The tagline can be edited by clicking on it and typing something new (modern browsers only, as it uses ‘contenteditable’)
  • Hovering over a user’s avatar reveals the username and tagline for that user.
  • Admins can create new user accounts.
  • A user’s own messages appear below their profile. This is currently limited to 5 posts.
  • Careful, there is not much form validation going on at the moment.

Building it gave me a good feel for the basic mechanics of CodeIgniter and allowed me to brush up on some PHP. The real motivation behind this project, however, is to eventually work in a Socket.IO implementation to allow real time updates for the user. I’ve been eyeing NodeJS for quite some time now, but never really had cause to use it. Fortunately, the project I was working on needed a more robust and scalable ‘real time’ framework to replace a rudimentary long-polling system. Socket.IO would have been a pretty good solution, in my opinion. Unfortunately, the project was cancelled before I could get started. But since I’ve already got the ball rolling on this sample application, I figure I might as well finish, and learn a few things in case a situation like this arises again. You never know…

The Setup

The front end of the application uses Twitter Bootstrap for styling and layout, jQuery for client-side interactivity, PHP and CodeIgniter for most of the functionality, and MySQL for data storage. This is a fairly common toolset that runs on most L/W/M/AMP-style environment stacks. The code available on GitHub has everything you need to run the app yourself, provided you have a web server, MySQL, and PHP 5.3 or greater.

You’ll need to create a database (preferably called ‘cisock’), and then import cisock.sql from the root of the ‘part_one’ folder in the repository. You can do this with the following command from the command line (be sure to change ‘yourusername’ and ‘/path/to/’ to match your local setup):

mysql -u yourusername -p -h localhost cisock < /path/to/cisock.sql

Once you’ve gotten the code checked out into your web root and the SQL imported, you’ll need to do some slight configuration before getting started. In ‘part_one/application/config/config.php’ you will need to change line 17 to reflect your local URL. If you simply cloned the project directly into your ‘localhost’ web root, then no changes will likely be needed. A similar fix is necessary in ‘part_one/assets/js/main.js’ line 6. The context root of your application goes here. Again, if you cloned to your web root, the default value should be fine. Ideally, the context root should only have to be configured in one place, but it is what it is for now.

Secondly, the settings in ‘part_one/application/config/database.php’ must be set to reflect your local database configuration. The following entries are specific to your local environment:

$db['default']['hostname'] = 'localhost';
$db['default']['username'] = 'yourdatabaseusernamehere';
$db['default']['password'] = 'yourdatabasepasswordhere';
$db['default']['database'] = 'cisock'; //or use the database name you chose, if it is different

Once that is done, you should be able to navigate your browser to the /part_one/ directory and see the login screen.

To recap:
  1. Clone the repo
  2. Set up the database and import the tables from cisock.sql
  3. Edit database.php and also config.php and main.js if necessary

The Code

I tried to follow suggested CodeIgniter conventions as closely as possible. As such, the file and directory structure is pretty much as it is out of the box. I altered /application/config/routes.php to set the login controller as the entry point for the application like so:

$route['default_controller'] = "login";

The index method in /application/controllers/login.php checks to see if the user is logged in, and if not will redirect to the login screen by calling the show_login function. The code immediately below is login/index – which will run when the application’s URL is loaded in the browser.

function index() {
    if( $this->session->userdata('isLoggedIn') ) {
        redirect('/main/show_main');
    } else {
        $this->show_login(false);
    }
}

Because the ‘isLoggedIn’ session variable starts out as false, the show_login() function is called. The ‘false’ argument indicates that an error message is not to be shown.

function show_login( $show_error = false ) {
    $data['error'] = $show_error;

    $this->load->helper('form');
    $this->load->view('login',$data);
}

That last line there: $this->load->view('login',$data); is what opens up the login view ( /application/views/login.php ). When the user types in credentials and clicks the ‘sign-in’ button, the login_user function is called through a normal form POST as indicated by this line of code in /application/views/login.php: <?php echo form_open('login/login_user') ?>. The form_open function is part of CodeIgniter’s Form Helper, which generates the form’s HTML for you.

The login/login_user function gives us our first taste of a CodeIgniter model. It loads up an instance of a user model and calls the validate_user method on that model, passing it the email and password typed in by the user. Take a look at the code below for the entire sequence.

  function login_user() {
      // Create an instance of the user model
      $this->load->model('user_m');

      // Grab the email and password from the form POST
      $email = $this->input->post('email');
      $pass  = $this->input->post('password');

      //Ensure values exist for email and pass, and validate the user's credentials
      if( $email && $pass && $this->user_m->validate_user($email,$pass)) {
          // If the user is valid, redirect to the main view
          redirect('/main/show_main');
      } else {
          // Otherwise show the login screen with an error message.
          $this->show_login(true);
      }
  }

In the ‘user’ model, a couple of interesting things happen. First, a query is built using CodeIgniter’s ActiveRecord implementation. The username and password entered by the user are compared to the user table in the database to see if the credentials exist. If so, the corresponding record in the database will be retrieved. If that happens, the data retrieved from the database will be used to set session variables using the set_session function of CodeIgniter’s Session class. All the code for this is in /application/model/user_m.php and can be seen below.

var $details;

function validate_user( $email, $password ) {
    // Build a query to retrieve the user's details
    // based on the received username and password
    $this->db->from('user');
    $this->db->where('email',$email );
    $this->db->where( 'password', sha1($password) );
    $login = $this->db->get()->result();

    // The results of the query are stored in $login.
    // If a value exists, then the user account exists and is validated
    if ( is_array($login) && count($login) == 1 ) {
        // Set the users details into the $details property of this class
        $this->details = $login[0];
        // Call set_session to set the user's session vars via CodeIgniter
        $this->set_session();
        return true;
    }

    return false;
}

function set_session() {
    // session->set_userdata is a CodeIgniter function that
    // stores data in a cookie in the user's browser.  Some of the values are built in
    // to CodeIgniter, others are added (like the IP address).  See CodeIgniter's documentation for details.
    $this->session->set_userdata( array(
            'id'=>$this->details->id,
            'name'=> $this->details->firstName . ' ' . $this->details->lastName,
            'email'=>$this->details->email,
            'avatar'=>$this->details->avatar,
            'tagline'=>$this->details->tagline,
            'isAdmin'=>$this->details->isAdmin,
            'teamId'=>$this->details->teamId,
            'isLoggedIn'=>true
        )
    );
}

So now that the user is authenticated and their session info is set in a cookie, CodeIgniter will take an extra step and store the user’s IP address, session ID, user agent string and last activity timestamp in the database. So when the user logs out, closes the browser, or is idle for too long, the session will expire and be cleared from the database. If someone with a cookie containing the same session ID then tries to connect to the application, it will be invalid because it won’t match any of the sessions stored in the database. Check out the Session class documentation for a more thorough explanation.

So now that the user is logged in, the ‘main’ controller can do its thing. The show_main function runs just before loading the main view, and does a number of things to prepare for displaying the view to the user. The user’s details are used to change certain parts of the view, such as which posts to display (team only, or everyone) and the admin controls.

The show_main function grabs some data from the user’s session, retrieves all the user’s posted messages, counts the total number of messages posted by the user, and retrieves the messages from other users. All of this info is placed into the $data object and passed to the ‘main’ view (/application/views/main.php). Much of the heavy lifting is taken care of by ActiveRecord commands in the Post model (/application/model/post_m).

That’s about it, as far as logging in goes. Now that everything is set up using conventional CodeIgniter practices, I can begin the process of converting the server side session data to be stored in Redis, rather than in a MySQL table…

UPDATE:
See Part 2: Use Redis instead of MySQL for CodeIgniter Session Data
and Part 3: Live Updates in CodeIgniter with Socket.IO and Redis

My First jQuery Plugin: VintageTxt

VintageTxt – Default

DEMO | SOURCE

Ever hearken back to the good ol’ days of the Apple II, with its monochrome screen and visible scanlines? No? Me either, really. I don’t even know what hearken means. But I do know that vintage looking stuff is cool, and people love it when all that is old is new again. So here is a jQuery plugin (new) that injects a green monochromatic scanlined fake computer screen (old) right onto any webpage. You can even make it type out a bunch of text, one character at a time – like that stupid computer from WarGames. No annoying voice-over though. And of course there is an input prompt, so you can bang away at your keyboard while pretending to break through 8 levels of top secret encryption to re-route the protocols through the mainframe and unleash your most advanced viruses… all while chanting, “enhance.”

Why make this? Fun, naturally. Plus I’ve been wondering lately what it would be like to make a jQuery plugin. I know I’m a little late to the jQuery party, and all the cool kids have moved on to bigger and better things, but the ability to whip up a jQuery plugin seems like one of those skills a self-respecting web developer should acquire at some point. Two helpful guides stocked the pot, and I threw in a bit of my own spicy sauce as I went along. I’m fairly satisfied with the end result, and the knowledge gained. To play around with a fully functional demo, click the link or image above. But to have some real fun, go grab the source from GitHub and VintageTxt your face off!

Construct 2 HTML5 Mobile Game: Turkey Trot of Doom!!

So it’s a week before Thanksgiving and I get an email from the hostess of the dinner I plan on attending. Included was a list of dishes assigned for guests to bring. The usual suspects were present: stuffing, cranberry sauce, pie, lots of pie…

But when I finally got to my name, a ‘holiday themed video game’ was requested.

“Ha!” I thought, funny joke, can’t be done. But then I thought some more. Perhaps it could be done. I’ve always wanted to try making one of those newfangled mobile HTML5 games that all the kids are talking about, and here was the perfect motivation. To make things even more interesting, I decided to try out Construct 2 – a drag-n-drop game maker that exports to HTML5. I haven’t used anything of the sort since Klik ‘n Play back in 1995. Construct 2 is much, much cooler.

The interface is mostly intuitive, but reading the manual is definitely necessary to do anything meaningful. After a few hours of fiddling with the sample projects and reading bits and pieces from the manual and a few tutorials I was confident enough to at least begin my own project. After I got started, many grand ideas popped into my head, but there just wasn’t time. I had an hour here, and an hour there to work, and Turkey Day was quickly approaching. In the end, I managed to get one major feature from my wishlist working – accelerometer controls. When playing the game on a mobile browser, the player moves by tilting the device. It works pretty well on a modern iPhone (4, 4s, 5) and iPads. Doesn’t work so great on Android, but you can still get the gist of it.

Rather than go into too much detail on how the game was assembled within Construct 2, I’ll just post the game file for download. There’s nothing terribly complicated going on, and the way the event sheets are organized, it reads almost like a book. You can download the file here.

To play the game, simply visit http://turkey.thebogstras.com. If you are using a mobile device, make sure it is in landscape orientation with the home button on the right. It takes a while to get used to the tilt controls, and you will probably die very quickly. If playing on a desktop browser, click the mouse where you want the character to move.

Remember, this game was never intended to be fun. Its sole purpose was to be a game, and have some Thanksgivingy stuff in it. Fun was never a requirement.

Credits for artwork:
Boss Turkey – Puppet Nightmares
Turkey Leg – Kenj
Mini Turkey – MikariStar
Evil Turkeys – DMN666

AngularJS SignIt! – Custom directives and form controls

Note: This is a companion post to Example CRUD App – Starring AngularJS, Backbone, Parse, StackMob and Yeoman. If you haven’t read that yet, please do so, otherwise this might not make much sense.

The most prominent feature, by far, of AngularJS SignIt! is the signature pad. It’s a fixed-size canvas that can be drawn upon with a mouse pointer or finger. The code for this wonderful widget is provided as a jQuery plugin by Thomas Bradley. There are a few more features for the signature pad than I use in this app, and I encourage you to check out the full documentation if you get a chance. It’s pretty great.

In order to implement the signature pad in my form, it must be a required field and have its data included with the other fields when the form is submitted. The sig pad works by creating objects of x/y coordinates that correspond to marks on the canvas. When drawing on the pad, all that data gets set into a hidden form field. The data from that field is what needs to get into the Angular scope, and get validated by Angular’s form validation routine. Oh, and Angular somehow needs to fire the custom validation function built into the signature pad.

Luckily, this is the type of thing directives were built for. Custom directives let you create custom html tags that get parsed and rendered by Angular. So to start out, I created a custom directive called sigpad that is used by placing <sigpad></sigpad> in my HTML. The sigpad tag then gets replaced by the directive’s template HTML. The template for the HTML5 signature pad is just the default snippet taken directly from the signature pad documentation. See below:

<div class="control sigPad">
  <div class="sig sigWrapper">
    <canvas class="pad" width="436" height="120" ng-mouseup="updateModel()"></canvas>
  </div>
</div>

The logic for the directive is defined in a separate Angular module (see directives.js for full code). Take a look at the code below, and be sure to review Angular’s documentation on directives. The Angular docs give a great conceptual overview, but the examples are a bit lacking. I reviewed the docs many, many times, and still needed some help from the mailing list to get this working correctly (thanks P.B. Darwin!).

My biggest stumbling block was not using ngModelController, and instead trying to manipulate scope data directly. When working with a form, Angular provides all sorts of awesome code to work with data and do some validation. Basically, it all went down like this…

Insert the sigpad directive into the form with

<sigpad ng-model='user.signature' clearBtn=".clearButton" name="signature" required></sigpad>

Now Angular knows to replace the sigpad element with the template defined earlier. After this happens, Angular runs the linking function which contains all the logic for the directive. The little snippet shown below is in the linking function. It uses jQuery to select the template element (passed into the linking function as ‘element’) and runs the signaturePad function from the signature pad API. This is what creates the actual, drawable canvas.

var sigPadAPI = $(element).signaturePad({
                             drawOnly:true,
                             lineColour: '#FFF'
                           });

The ng-model=’user.signature’ bit is key. This is how data is shared between the signature pad and Angular. Also, go back up and look at the template. You will see ng-mouseup="updateModel()" as an attribute of the canvas element. This tells Angular to run the updateModel() function when your mouse click ends on the signature pad. The updateModel() function is defined in the linking function of the sigpad directive.

When the updateModel() function is executed, it will wait for a split second so the signature pad can finish writing data to its hidden form field, then it will assign all that data to the Angular model value of the directive. Sounds confusing, and it is. The signature pad is off doing its own thing, completely oblivious to the fact that it is in an AngularJS app. It is Angular’s responsibility to grab data from the sigpad to get that data into its own scope. That is what $setViewValue is for. It hands the signature data over to Angular, so when the form is submitted, Angular has it available in scope.

Below is the entire directive for the drawable signature pad. You can see that it relies heavily on the signature pad API, but only after certain events handled by Angular have occurred.

.directive('sigpad', function($timeout){
  return {
    templateUrl: 'views/sigPad.html',   // Use a template in an external file
    restrict: 'E',                      // Must use <sigpad> element to invoke directive
    scope : true,                       // Create a new scope for the directive
    require: 'ngModel',                 // Require the ngModel controller for the linking function
    link: function (scope,element,attr,ctrl) {

      // Attach the Signature Pad plugin to the template and keep a reference to the signature pad as 'sigPadAPI'
      var sigPadAPI = $(element).signaturePad({
                                  drawOnly:true,
                                  lineColour: '#FFF'
                                });
     
      // Clear the canvas when the 'clear' button is clicked
      $(attr.clearbtn).on('click',function (e) {
        sigPadAPI.clearCanvas();
      });
     
      $(element).find('.pad').on('touchend',function (obj) {
        scope.updateModel();
      });

      // when the mouse is lifted from the canvas, set the signature pad data as the model value
      scope.updateModel = function() {
        $timeout(function() {
          ctrl.$setViewValue(sigPadAPI.getSignature());
        });
      };      
     
      // Render the signature data when the model has data. Otherwise clear the canvas.
      ctrl.$render = function() {
        if ( ctrl.$viewValue ) {
          sigPadAPI.regenerate(ctrl.$viewValue);
        } else {
          // This occurs when signatureData is set to null in the main controller
          sigPadAPI.clearCanvas();
        }
      };
     
      // Validate signature pad.
      // See http://docs.angularjs.org/guide/forms for more detail on how this works.
      ctrl.$parsers.unshift(function(viewValue) {
        if ( sigPadAPI.validateForm() ) {
          ctrl.$setValidity('sigpad', true);
          return viewValue;
        } else {
          ctrl.$setValidity('sigpad', false);
          return undefined;
        }
      });      
    }
  };
})

And what about the tiny signatures that show up in the signatories list? Also a custom directive. This one is smaller, but still tricky. The signature is displayed as an image on-screen, but a canvas element is still required to generate the signature from raw data before it can be converted to an image.

The directive is implemented with <regensigpad sigdata={{signed.get('signature')}}></regensigpad>. The ‘signed’ value is a single signature in the signature collection pulled from the back-end when the user picks a petition. The signature data from ‘signed’ is passed into the directive scope using scope: {sigdata:'@'}.

When a list of signatures is retrieved, each signature record (including first & last name, email, and signature data) goes into a table row using ngRepeat. The regensigpad directive is executed for each row. The linking function will create a canvas element and make a displayOnly signature pad from it. The signature drawing is regenerated from the data, and then the canvas is converted to PNG format.

This PNG data is then used in the pic scope value, which is bound to the ng-src of an img tag. This img tag is the directive’s template, and will be inserted into the page. The full code for this directive is below.

.directive('regensigpad',function() {
  return {
    template: '<img ng-src="{{pic}}" />',
    restrict: 'E',
    scope: {sigdata:'@'},
    link: function (scope,element,attr,ctrl) {
      // When the sigdata attribute changes...
      attr.$observe('sigdata',function (val) {
        // ... create a blank canvas template and attach the signature pad plugin
        var sigPadAPI = $('<div class="sig sigWrapper"><canvas class="pad" width="436" height="120"></canvas></div>').signaturePad({
                          displayOnly: true
                        });
        // regenerate the signature onto the canvas
        sigPadAPI.regenerate(val);
        // convert the canvas to a PNG (Newer versions of Chrome, FF, and Safari only.)
        scope.pic = sigPadAPI.getSignatureImage();
      });
    }
  };
});

But that’s not all! You might have noticed that the select box holding the names of each petition looks kinda fancy, and allows you to type stuff to filter the list. This fancy form control is the select2 widget, which is based on the Chosen library.

I didn’t have to write my own directive for it though. The Angular-UI project has already done the honors. Angular-UI is an open-source companion suite for AngularJS. It provides a whole pile of custom directives, and even a few extra filters. Many of the directives are wrappers for other widgets and open source projects, like CodeMirror, Google Maps, Twitter Bootstrap modal windows, and many more. It’s definitely worth looking into for any AngularJS project.

AngularJS SignIt! – Interchangeable Parse, StackMob and Backbone Services

Note: This is a companion post to Example CRUD App – Starring AngularJS, Backbone, Parse, StackMob and Yeoman. If you haven’t read that yet, please do so, otherwise this might not make much sense.

The AngularJS SignIt! application basically has three different interactions with a web service – fetch petitions, save a signature, and fetch a list of signatures based on the selected petition. That’s it – a get, a save, and a query. Initially, I was only using Parse.com to store data, so it was possible to include Parse specific objects and methods in my controller to save and get data.

But then I remembered I have a StackMob account just sitting around doing nothing, and thought I should put it to good use. So now I have two (slightly) different options to store my signatures. Rather than jumbling up my controller with code specific to StackMob and Parse, I created a module to abstract the Parse and StackMob APIs into their own services. These services could then hide any code specific to Parse or StackMob behind a common interface used by the controller.

With the back-end(s) abstracted, all the controller needs to worry about is calling saveSignature, getPetitions, and getSignatures. Below is a severely truncated version of the Main Controller that shows the three methods in use. Notice there is no mention of Parse or StackMob.

var MainCtrl = ngSignItApp.controller('MainCtrl', function($scope,DataService) {

  // GET A LIST OF SIGNATURES FOR A PETITION
  $scope.getSignatures = function getSignatures (petitionId) {
    DataService.getSignatures(petitionId,function (results) {
      $scope.$apply(function() {
        $scope.signatureList = results;
      });
    });
  };

  // SAVE A SIGNATURE
  $scope.saveSignature = function saveSignature() {  
    DataService.saveSignature($scope.user, function() { //user is an object with firstName, lastName, email and signature attributes.
      $scope.getSignatures($scope.select2); //select2 is the value from the petition dropdown
    });  
  };

  // GET ALL PETITIONS
  DataService.getPetitions(function (results) {
    $scope.$apply(function() {
      $scope.petitionCollection = results;
      $scope.petitions = results.models;
    });
  });

});

If you look closely, you’ll see that each service method is prefixed with DataService. This is the injectable that provides either the StackMob service or the Parse service to the controller. Each of those services has its own implementation of getSignatures, saveSignature, and getPetitions. Take a look:

angular.module('DataServices', [])
// PARSE SERVICE
.factory('ParseService', function(){
    Parse.initialize("<PLEASE USE YOUR OWN APP KEY>", "<PLEASE USE YOUR OWN API KEY>");
    var Signature = Parse.Object.extend("signature");
    var SignatureCollection = Parse.Collection.extend({ model: Signature });
    var Petition = Parse.Object.extend("petition");
    var PetitionCollection = Parse.Collection.extend({ model: Petition });

    var ParseService = {

      // GET ALL PETITIONS
      getPetitions : function getPetitions(callback) {
        var petitions = new PetitionCollection();
        petitions.fetch({
          success: function (results) {
              callback(petitions);
          }
        });
      },

      // SAVE A SIGNATURE
      saveSignature : function saveSignature(data, callback){
        var sig = new Signature();
        sig.save( data, {
                  success: function (obj) {callback(obj);}
        });
      },

      // GET A LIST OF SIGNATURES FOR A PETITION
      getSignatures : function getSignatures(petitionId, callback) {
        var query = new Parse.Query(Signature);
        query.equalTo("petitionId", petitionId);
        query.find({
          success: function (results) {
            callback(results);
          }
        });
      }
   
    };

    return ParseService;
})
// STACKMOB SERVICE
.factory('StackMobService', function(){
    // Init the StackMob API. This information is provided by the StackMob app dashboard
    StackMob.init({
      appName: "ngsignit",
      clientSubdomain: "<PLEASE USE YOUR OWN SUBDOMAIN>",
      publicKey: "<PLEASE USE YOUR OWN PUBLICKEY>",
      apiVersion: 0
    });

    var Signature = StackMob.Model.extend( {schemaName:"signature"} );
    var SignatureCollection = StackMob.Collection.extend( { model: Signature } );
    var Petition = StackMob.Model.extend( {schemaName:"petition"} );
    var PetitionCollection = StackMob.Collection.extend( { model: Petition } );

    var StackMobService = {
     
      getPetitions : function getPetitions(callback) {
        var petitions = new PetitionCollection();
        var q = new StackMob.Collection.Query();
        petitions.query(q, {
          success: function (results) {
              callback(petitions.add(results));
          },
          error: function ( results,error) {
              alert("Collection Error: " + error.message);
          }
        });        
      },    

      saveSignature : function saveSignature(data, callback){
        var sigToSave = new Signature();
        sigToSave.set({
          firstname: data.firstName,
          lastname: data.lastName,
          petitionid: data.petitionId,
          email: data.email,
          signature: JSON.stringify(data.signature) //Also, StackMob does not allow arrays of objects, so we need to stringify the signature data and save it to a 'String' data field.
        });

        // Then save, as usual.
        sigToSave.save({},{
          success: function(result) {
            callback(result);
          },
          error: function(obj, error) {
            alert("Error: " + error.message);
          }
        });
      },

      getSignatures : function getSignatures(petitionId, callback) {
        var signatures = new SignatureCollection();
        var q = new StackMob.Collection.Query();
        var signatureArray = [];

        q.equals('petitionid',petitionId);

        signatures.query(q,{
          success: function(collection) {
            collection.each(function(item) {
              item.set({
                signature: JSON.parse(item.get('signature')),
                firstName: item.get('firstname'),
                lastName: item.get('lastname')
              });
              signatureArray.push(item);
            });
            callback(signatureArray);
          }
        });
      }
   
    };
    // The factory function returns StackMobService, which is injected into controllers.
    return StackMobService;
})

This is an abridged version of the DataServices module. To see the full code, as well as many more comments explaining it, head over to GitHub. The main point to observe here is that each service has slightly different code for getSignatures, getPetitions, and saveSignature. Also, each service has its own initialization code for its respective back-end. The controller couldn’t care less, though, because as long as the service methods accept and provide data in the right format, it’s happy.

But how does the controller know which service to use? Well, if you look back at the controller code, you’ll see that ‘DataService’ is injected, but it hasn’t been defined yet. In the full code, there is a factory defined all the way at the bottom of the file. It looks like this:

.factory('DataService', function (ParseService,StackMobService,BackboneService,$location) {
  var serviceToUse = BackboneService;
  if ( $location.absUrl().indexOf("stackmob") > 0 || $location.absUrl().indexOf("4567") > 0 ) serviceToUse = StackMobService;
  if ( $location.path() === '/parse' ) serviceToUse = ParseService;

  return serviceToUse;
});

All the other services (ParseService, StackMobService, and BackboneService) are injected into this factory. In case you are wondering, BackboneService is yet another back-end service that can be used in place of the others – see the full code for details. The code above simply examines the URL and decides which service gets returned as DataService. If ‘parse’ appears in the URL path (e.g. www.example.com/app/#/parse), then ParseService is returned. StackMob requires that HTML apps be hosted on its servers, so the code just checks the domain name for ‘stackmob’ and returns the StackMob service. If neither of these conditions occurs, then BackboneService is returned, and no data is saved.

In retrospect, I think what I’ve got here is the beginnings of an OO-like interface – a set of functions defined to form a contract with the client, guaranteeing their existence. And the implementations of that interface are a set of adapters (or is it proxies?). If I had to do this over, I would use a proper inheritance pattern with DataService as the abstract parent, and the other services as the implementations or subclasses. One of these days I’ll get around to refactoring. One day…
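For what it’s worth, here is a rough sketch of what that refactor could look like. Everything below is hypothetical (it is not in the ngSignIt source): a base service defines the contract, and each back-end service inherits it and overrides the methods.

// Hypothetical refactor sketch – not part of the actual ngSignIt code.
angular.module('DataServices', [])
.factory('AbstractDataService', function () {
  // The "abstract parent": every method exists, but only to enforce the contract.
  return {
    getPetitions:  function () { throw new Error('getPetitions not implemented'); },
    saveSignature: function () { throw new Error('saveSignature not implemented'); },
    getSignatures: function () { throw new Error('getSignatures not implemented'); }
  };
})
.factory('ParseService', function (AbstractDataService) {
  // Each concrete service inherits the contract and overrides the methods
  // with back-end specific code (Parse shown here; StackMob would look similar).
  var service = Object.create(AbstractDataService);
  service.getPetitions  = function (callback) { /* Parse-specific fetch */ };
  service.saveSignature = function (data, callback) { /* Parse-specific save */ };
  service.getSignatures = function (petitionId, callback) { /* Parse-specific query */ };
  return service;
});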

Full Source: https://github.com/ericterpstra/ngSignIt