Mozilla

Found 448 results for “html5”

  1. HTML5: The difference between an App and a Page.

    HTML5 is only one part of the “Stack”

    HTML5 is really more than one thing. In the strictest sense, HTML5 is the fifth major revision of the W3C specification of the markup language that is used to create web pages.

    But in a practical sense, HTML5 is far more than that. For developers, HTML5 is a wave of technologies that are evolving and gaining browser implementations along roughly the same time-line as the markup language itself. Many of the related technologies are being spearheaded by the Web Hypertext Application Technology Working Group (whatwg.org), but still other technologies in this genre are driven from elsewhere. For example, WebGL, the 3D rendering engine for Web Apps, is driven at Khronos, and Mozilla’s WebAPI team is leading a collection of relevant API work including a comprehensive array of device-specific APIs. And the W3C has a Device APIs Working Group.

    Mozilla is also investing heavily in an initiative to facilitate the use of standards-based Web technologies to build applications that can be installed and used in the same way “native” applications would be.

    There are many reasons that doing so might be desirable, not the least of which is the ability to build much or all of your application for use on many different devices and platforms using a single set of development technologies.

    Developers can already build HTML Applications (with help from the HTML5 Apps documentation) and submit their applications to the Mozilla Marketplace.

    App versus Page?

    When people start talking about building apps with HTML5, one of the early questions that comes to the conversation is “What’s the difference between an app and a page?”

    This is a bit of a difficult question to answer because there is a very wide range of answers, all of which can be true yet completely different.

    While your HTML5 app should be more than a simple HTML5 page, technically speaking it doesn’t have to be. Using the steps outlined in the guide “Getting Started with making (HTML5) apps”, you could simply create an app manifest file and submit your HTML page as an app.

    When an Open Web app runs, the Mozilla Web Runtime installs local copies of the HTML5 markup and all of the other web assets that are specified using an HTML5 Cache Manifest (read here).
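
    As a rough sketch (the file names are hypothetical), the cache manifest is a plain text file listing the assets to be copied locally; the page opts in by pointing to it from its <html manifest="..."> attribute:

    CACHE MANIFEST
    # v1 - hypothetical list of assets to cache for offline use
    index.html
    style.css
    app.js
    images/logo.png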

    If a Web page is dynamically generated, as in an ASP.NET, PHP, etc., application, any cached version represents a snapshot of a single point in time. When submitted as an app, the page actually “becomes” an app, getting a launcher item similar to native apps on whatever the host operating system is. When the app is launched, the local assets are used if there is no Internet connection or if the caching mechanism determines that the assets have not changed since they were last downloaded.

    By doing the above we have not added any additional functionality to our application. All we have done by declaring our page to be an app is to get the run-time to copy the bits locally and to run those local bits if the device is offline or if the bits haven’t changed.

    Does this get us anything?

    Is there value in just this simple step?

    Yes, it gets us three valuable improvements:

    • Users can see our page even if they are not connected because the run-time renders the cached version if there is no Internet connection.
    • Processor cycles and bandwidth consumption from our infrastructure or hosting are greatly reduced because new bits are only fetched when they are needed.
    • Performance improves because resources are loaded locally by default.

    Many organizations are starting with this approach because very little technical effort is necessary and it can be done very quickly. In most cases all you need to do is to create an app manifest file (more info here) and submit the app to the marketplace. Developers can submit apps now at the Mozilla Marketplace Developer Submission site so that they will be available to consumers when the “buy side” of the marketplace opens later this year.
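
    For illustration only, a minimal Open Web App manifest might look something like this (all values are hypothetical placeholders):

    {
      "name": "My App",
      "description": "A short description of what the app does",
      "launch_path": "/index.html",
      "icons": { "128": "/img/icon-128.png" },
      "developer": { "name": "Your Name", "url": "http://example.com" },
      "default_locale": "en"
    }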

    But if that’s all you do with your app, you are missing the opportunity.

    Take a look at http://caniuse.com/ for a snapshot view of technologies that are available to developers using the modern web runtime.

    Whether we are building a real Open Web App from scratch or converting an HTML page to an app and then adding modern features, HTML5-era technologies let us really start to develop client/server applications that utilize the browser and its host resources, thereby reducing our server requirements while improving the end user’s experience.

    Developers can do things like the following (a small sketch follows the list):

    • Read and write data locally and sync with a server when appropriate.
    • Run multiple threads of execution at the same time.
    • Exchange content with other applications, or share logic with them.
    • Interact with the device on which their app is running:
      • Touch user interface
      • GPS
      • Camera
    • Richly integrate multimedia.
    • And much more…
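
    For example, here is a small, hedged sketch (the storage key and the worker file name are hypothetical) of reading the device’s position, caching data locally, and offloading work to a worker thread:

    // Read the device's position and cache it locally (hypothetical key).
    if ("geolocation" in navigator) {
      navigator.geolocation.getCurrentPosition(function (pos) {
        localStorage.setItem("lastPosition", JSON.stringify({
          lat: pos.coords.latitude,
          lon: pos.coords.longitude
        }));
      });
    }

    // Run work on another thread with a Web Worker ("worker.js" is hypothetical).
    var worker = new Worker("worker.js");
    worker.onmessage = function (e) { console.log("result:", e.data); };
    worker.postMessage({ cmd: "start" });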

    Since an app’s source code is deployed to a single URL and is updated on the device on which the app runs only when updates are necessary, versioning is low-cost and friction-free. This makes it easy to start by making a page and then iteratively adding features one after another.

    Let’s do it!

    Since this model makes it so easy to get started, why not give it a try?

    Follow the guidance here: https://developer.mozilla.org/en/Apps/Getting_Started

    And submit your app here: https://marketplace.mozilla.org/

  2. BrowserQuest – a massively multiplayer HTML5 (WebSocket + Canvas) game experiment

    It’s time for some gaming action with a new HTML5 game demo: BrowserQuest, a massively multiplayer adventure game created by Little Workshop (@glecollinet & @whatthefranck) and Mozilla.

    Play the game: browserquest.mozilla.org
    BrowserQuest

    BrowserQuest is a tribute to classic video-games with a multiplayer twist. You play as a young warrior driven by the thrill of adventure. No princess to save here, just a dangerous world filled with treasures to discover. And it’s all done in glorious HTML5 and JavaScript.

    Even better, it’s open-source so be sure to check out the source code on GitHub!

    Watch a screencast:

    A multiplayer experience

    BrowserQuest screenshot

    BrowserQuest can be played by thousands of simultaneous players, distributed across different instances of the in-game world. Click on the population counter at any time to know exactly how many total players are currently online.

    Players can see and interact with each other by using an in-game chat system. They can also team up and fight enemies together.

    BrowserQuest is a game of exploration: the more dangerous the places you go, the better the rewards.

    Powered by WebSockets

    WebSockets are a new technology enabling bi-directional communication between a browser and a server on the web.

    BrowserQuest is a demo of how this technology can be used today to create a real-time multiplayer game in a single webpage. When you start to play, your browser opens up a WebSocket connection to one of several load-balanced game servers. Each server hosts multiple world instances and handles the player synchronization and game logic within all instances. Because the server code is running on Node.js, both the server and client codebases share a small portion of the same JavaScript source code.
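
    As a rough illustration of the client side (this is not BrowserQuest’s actual code; the server URL and message format are made up), opening such a connection looks like this:

    var socket = new WebSocket("ws://game.example.com/"); // hypothetical server
    socket.onopen = function () {
      socket.send(JSON.stringify({ type: "hello", name: "player1" }));
    };
    socket.onmessage = function (event) {
      var message = JSON.parse(event.data);
      // update the local game state based on what the server sent
    };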

    Server code is available on GitHub.

    BrowserQuest screenshot

    Built on the Web platform

    BrowserQuest makes extensive use of different web technologies, such as:

    • HTML5 Canvas, which powers the 2D tile-based graphics engine.
    • Web Workers, which allow the large world map to be initialized without slowing down the page UI.
    • localStorage, in which the progress of your character is continually saved (see the sketch after this list).
    • CSS3 Media Queries, so that the game can resize itself and adapt to many devices.
    • HTML5 audio, so you can hear that rat or skeleton die!
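
    For instance, saving progress with localStorage can be as simple as the following sketch (the key and the shape of the data are hypothetical, not BrowserQuest’s actual format):

    // Save the character's progress (hypothetical data shape).
    localStorage.setItem("character", JSON.stringify({ name: "Hero", level: 3 }));

    // Restore it on the next visit, if present.
    var saved = localStorage.getItem("character");
    var character = saved ? JSON.parse(saved) : null;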

    Available everywhere

    Since BrowserQuest is written in HTML5/JavaScript, it is available across a lot of different browsers and platforms. The game can be played in Firefox, Chrome and Safari. With WebSockets enabled, it’s also playable in Opera. Moreover, it’s compatible with iOS devices, as well as tablets and phones running Firefox for Android.

    BrowserQuest screenshot

    The mobile versions are more experimental than the desktop experience, which offers richer features and better performance, but they give an early glimpse of the kind of games that will be coming to the mobile Web in the future. Give it a try with your favorite mobile device!

    Join the adventure

    Want to be part of BrowserQuest? Create your own character and venture into the world. Fight enemies by yourself or with friends to get your hands on new equipment and items. You might even stumble upon a couple of surprises along the way…

  3. Making the Dino roar – syncing audio and CSS transitions

    It started with Brian King setting up our Google+ page using this round MDN logo by John Slater. I thought it looked cool and reminded me of the famous MGM intro, so I wondered if I could turn it into an intro for our video tutorials (not sure if we will do that though). Some Photoshop and sound work later, and with a sprinkle of HTML5 audio and CSS transitions, here we are (source on GitHub):

    I started with the sound. If you need Creative Commons licensed sounds, Freesound is a good resource. So I took Chinese Fanfare by Nick-Nack and Roar by CGEffex and put them together in Audacity.

    Saving them as OGG and MP3 gave me an audio element that I could tie into. All I needed was to listen to the timeupdate event and compare the currentTime to trigger the animations. The animations (rotation of the dino and opening and closing of the jaw) are CSS transitions triggered by classes on the parent element. The main trick was to store both the dino and the jaw inside a div and transition them separately. The jaw animation also needed a change in transformation origin as we don’t rotate the image around its center.
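
    In outline, the technique looks roughly like this (a hedged sketch, not the demo’s exact code; the element, class name and timestamp are hypothetical):

    var audio = document.querySelector("audio");
    var dino = document.querySelector("#dino"); // hypothetical parent element

    audio.addEventListener("timeupdate", function () {
      // toggle a class at a chosen point in the sound track;
      // the class triggers the CSS transition on the jaw
      if (audio.currentTime > 2.5) {
        dino.classList.add("roar");
      }
    }, false);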

    If you have seven minutes to spare, here is a blow-by-blow screencast explaining what is going on:

  4. Announcing the February Dev Derby Winners

    Touch events help you make websites and applications more engaging by responding appropriately when users interact with touch screens. A user touching a screen is very different from a user clicking a mouse button, so special care must be taken to ensure that touch-enabled Web applications respond to touch screen interactions in ways that users expect.
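
    As a minimal, hedged example (the element id is hypothetical), reacting to a touch looks like this:

    var box = document.getElementById("box"); // hypothetical element
    box.addEventListener("touchstart", function (event) {
      event.preventDefault(); // keep the browser from also firing a simulated click
      var touch = event.touches[0];
      console.log("touched at", touch.pageX, touch.pageY);
    }, false);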

    Last month, creative developers from around the web competed in the Touch Events Dev Derby to show off just how powerful touch events are. We had some really great entries (many so much fun I could barely stop playing with them). Our expert judges—Remy Sharp, Chris Coyier, and Chris Heilmann—looked over the entries and are proud to announce three winners and a runner-up. Please join us in congratulating these outstanding contributors!

    1st: Pinch That Frog! by Danny.D
    2nd: Kite Flying by ltch
    3rd: The Face Builder by boblemarin

    Runner up: Twistron by seva

    But let’s not forget about everyone else who submitted to this Derby. Each and every one of these people is pushing the web forward and deserves a huge amount of praise for doing so.

    Want to see your name here next month? We are now accepting demos related to CSS 3D Transforms (March), HTML5 audio (April), and Websockets (May). Get an early start by submitting today!

  5. Video, Mobile, and the Open Web

    [Also posted at brendaneich.com.]

    I wrote The Open Web and Its Adversaries just over five years ago, based on the first SXSW Browser Wars panel (we just had our fifth, it was great — thanks to all who came).

    Some history

    The little slideshow I presented is in part quaint. WPF/E and Adobe Apollo, remember those? (Either the code names, or the extant renamed products?) The Web has come a long way since 2007.

    But other parts of my slideshow are still relevant, in particular the part where Mozilla and Opera committed to an unencumbered <video> element for HTML5:

    • Working with Opera via WHATWG on <video>
      • Unencumbered Ogg Theora decoder in all browsers
      • Ogg Vorbis for <audio>
      • Other formats possible
      • DHTML player controls

    We did what we said we would. We fought against the odds. We carried the unencumbered HTML5 <video> torch even when it burned our hands.

    We were called naive (no) idealists (yes). We were told that we were rolling a large stone up a tall hill (and how!). We were told that we could never overcome the momentum behind H.264 (possibly true, but Mozilla was not about to give up and pay off the patent rentiers).

    Then in 2009 Google announced that it would acquire On2 (completed in 2010), and Opera and Mozilla had a White Knight.

    At Google I/O in May 2010, Adobe announced that it would include VP8 (but not all of WebM?) support in an upcoming Flash release.

    On January 11, 2011, Mike Jazayeri of Google blogged:

    … we are changing Chrome’s HTML5 <video> support to make it consistent with the codecs already supported by the open Chromium project. Specifically, we are supporting the WebM (VP8) and Theora video codecs, and will consider adding support for other high-quality open codecs in the future. Though H.264 plays an important role in video, as our goal is to enable open innovation, support for the codec will be removed and our resources directed towards completely open codec technologies.

    These changes will occur in the next couple months….

    A followup post three days later confirmed that Chrome would rely on Flash fallback to play H.264 video.

    Where we are today

    It is now March 2012 and the changes promised by Google and Adobe have not been made.

    What’s more, any such changes are irrelevant if made only on desktop Chrome — not on Google’s mobile browsers for Android — because authors typically do not encode twice (once in H.264, once in WebM), they instead write Flash fallback in an <object> tag nested inside the <video> tag. Here’s an example adapted from an Opera developer document:

    <video controls poster="video.jpg" width="854" height="480">
     <source src="video.mp4" type="video/mp4">
     <object type="application/x-shockwave-flash" data="player.swf"
             width="854" height="504">
      <param name="allowfullscreen" value="true">
      <param name="allowscriptaccess" value="always">
      <param name="flashvars" value="file=video.mp4">
      <!--[if IE]><param name="movie" value="player.swf"><![endif]-->
      <img src="video.jpg" width="854" height="480" alt="Video">
      <p>Your browser can't play HTML5 video.
     </object>
    </video>
    

    The Opera doc nicely carried the unencumbered video torch by including

     <source src="video.webm" type="video/webm">
    

    after the first <source> child in the <video> container (after the first, because of an iOS WebKit bug, the Opera doc said), but most authors do not encode twice and host two versions of their video (yes, you who do are to be commended; please don’t spam my blog with comments, you’re not typical — and YouTube is neither typical nor yet completely transcoded [1]).
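
    Put together, the markup with both encodings looks roughly like this (a sketch based on the example above, with the WebM source placed second as the Opera doc suggests):

    <video controls poster="video.jpg" width="854" height="480">
     <source src="video.mp4" type="video/mp4">
     <source src="video.webm" type="video/webm">
     <!-- Flash <object> fallback and plain <img>/<p> fallback as shown above -->
    </video>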

    Of course the ultimate fallback content could be a link to a video to download and view in a helper app, but that’s not “HTML5 video” and it is user-hostile (profoundly so on mobile). Flash fallback does manage to blend in with HTML5, modulo the loss of expressiveness afforded by DHTML playback controls.

    Now, consider carefully where we are today.

    Firefox supports only unencumbered formats from Gecko’s <video> implementation. We rely on Flash fallback that authors invariably write, as shown above. Let that sink in: we, Mozilla, rely on Flash to implement H.264 for Firefox users.

    Adobe has announced that it will not develop Flash on mobile devices.

    In spite of the early 2011 Google blog post, desktop Chrome still supports H.264 from <video>. Even if it were to drop that support, desktop Chrome has a custom patched Flash embedding, so the fallback shown above will work well for almost all users.

    Mobile matters most

    Android stock browsers (all Android versions), and Chrome on Android 4, all support H.264 from <video>. Given the devices that Android has targeted over its existence, where H.264 hardware decoding is by far the most power-efficient way to decode, how could this be otherwise? Google has to compete with Apple on mobile.

    Steve Jobs may have dealt the death-blow to Flash on mobile, but he also championed and invested in H.264, and asserted that “[a]ll video codecs are covered by patents”. Apple sells a lot of H.264-supporting hardware. That hardware in general, and specifically in video playback quality, is the gold standard.

    Google is in my opinion not going to ship mobile browsers this year or next that fail to play H.264 content that Apple plays perfectly. Whatever happens in the very long run, Mozilla can’t wait for such an event. Don’t ask Google why they bought On2 but failed to push WebM to the exclusion of H.264 on Android. The question answers itself.

    So even if desktop Chrome drops H.264 support, Chrome users almost to a person won’t notice, thanks to Flash fallback. And Apple and Google, along with Microsoft and whomever else might try to gain mobile market share, will continue to ship H.264 support on all their mobile OSes and devices — hardware-implemented H.264, because that uses far less battery than alternative decoders.

    Here is a chart of H.264 video in HTML5 content on the Web from MeFeedia:

    MeFeedia.com, December 2011

    And here are some charts showing the rise of mobile over desktop from The Economist:

    The Economist, October 2011

    These charts show Bell’s Law of Computer Classes in action. Bell’s Law predicts that the new class of computing devices will replace older ones.

    In the face of this shift, Mozilla must advance its mission to serve users above all other agendas, and to keep the Web — including the “Mobile Web” — open, interoperable, and evolving.

    What Mozilla is doing

    We have successfully launched Boot to Gecko (B2G) and we’re preparing to release a new and improved Firefox for Android, to carry our mission to mobile users.

    What should we do about H.264?

    Andreas Gal proposes to use OS- and hardware-based H.264 decoding capabilities on Android and B2G. That thread has run to over 240 messages, and spawned some online media coverage.

    Some say we should hold out longer for someone (Google? Adobe?) to change something to advance WebM over H.264.

    MozillaMemes.tumblr.com/post/19415247873

    Remember, dropping H.264 from <video> only on desktop and not on mobile doesn’t matter, because of Flash fallback.

    Others say we should hold out indefinitely and by ourselves, rather than integrate OS decoders for encumbered video.

    I’ve heard people blame software patents. I hate software patents too, but software isn’t even the issue on mobile. Fairly dedicated DSP hardware takes in bits and puts out pixels. H.264 decoding lives completely in hardware now.

    Yes, some hardware also supports WebM decoding, or will soon. Too little, too late for HTML5 <video> as deployed and consumed this year or (for shipping devices) next.

    As I wrote in the newsgroup thread, Mozilla has never ignored users or market share. We do not care only about market share, but ignoring usability and market share can easily lead to extinction. Without users our mission is meaningless and our ability to affect the evolution of open standards goes to zero.

    Clearly we have principles that prohibit us from abusing users for any end (e.g., by putting ads in Firefox’s user interface to make money to sustain ourselves). But we have never rejected encumbered formats handled by plugins, and OS-dependent H.264 decoding is not different in kind from Flash-dependent H.264 decoding in my view.

    We will not require anyone to pay for Firefox. We will not burden our downstream source redistributors with royalty fees. We may have to continue to fall back on Flash on some desktop OSes. I’ll write more when I know more about desktop H.264, specifically on Windows XP.

    What I do know for certain is this: H.264 is absolutely required right now to compete on mobile. I do not believe that we can reject H.264 content in Firefox on Android or in B2G and survive the shift to mobile.

    Losing a battle is a bitter experience. I won’t sugar-coat this pill. But we must swallow it if we are to succeed in our mobile initiatives. Failure on mobile is too likely to consign Mozilla to decline and irrelevance. So I am fully in favor of Andreas’s proposal.

    Our mission continues

    Our mission, to promote openness, innovation, and opportunity on the Web, matters more than ever. As I said at SXSW in 2007, it obligates us to develop and promote unencumbered video. We lost one battle, but the war goes on. We will always push for open, unencumbered standards first and foremost.

    In particular we must fight to keep WebRTC unencumbered. Mozilla and Opera also lost the earlier skirmish to mandate an unencumbered default format for HTML5 <video>, but WebRTC is a new front in the long war for an open and unencumbered Web.

    We are researching downloadable JS decoders via Broadway.js, but fully utilizing parallel and dedicated hardware from JS for battery-friendly decoding is a ways off.

    Can we win the long war? I don’t know if we’ll see a final victory, but we must fight on. Patents expire (remember the LZW patent?). They can be invalidated. (Netscape paid to do this to certain obnoxious patents, based on prior art.) They can be worked around. And patent law can be reformed.

    Mozilla is here for the long haul. We will never give up, never surrender.

    /be

    [1] Some points about WebM on YouTube vs. H.264:

    • Google has at best transcoded only about half the videos into WebM. E.g., this YouTube search for “cat” gives ~1.8M results, while the same one for WebM videos gives 704K results.
    • WebM on YouTube is presented only for videos that lack ads, which is a shrinking number on YouTube. Anything monetizable (i.e., popular) has ads and therefore is served as H.264.
    • All this is moot when you consider mobile, since there is no Flash on mobile, and as of yet no WebM hardware, and Apple’s market-leading position.

  6. Announcing the January Dev Derby Winners

    HTML5 device orientation allows web developers to read the motion and orientation data of devices to create more engaging and more interactive web experiences.
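
    As a minimal sketch, listening for orientation changes looks roughly like this:

    window.addEventListener("deviceorientation", function (event) {
      // alpha, beta and gamma are rotation angles, in degrees
      console.log(event.alpha, event.beta, event.gamma);
    }, false);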

    Recently, creative developers from around the world demonstrated just how powerful orientation can be in the January Dev Derby. After careful consideration, our three new judges—Remy Sharp, Chris Coyier, and Chris Heilmann—are proud to announce that they have decided on three winners and two runners-up. Please join us in congratulating these contributors!

    First Place: Help UFO by michal.b
    Second Place: Pinball Maze by CarsonMcDonald
    Third Place: Orientation Tetris by nestoralvaro

    Runners Up: Exploration by seva and Street Orientation by nestoralvaro

    And let’s not forget about our other exceptional contributors. Each and every one of these participants is pushing the web forward and deserves a great deal of praise as a result.

    Want to see your name here next month? We are now accepting demos related to CSS 3D Transforms (March), HTML5 audio (April), and Websockets (May). Get an early start by submitting today!

  7. Developing a simple HTML5 space shooter

    Experimenting with modern web technologies is always fun. For my latest project, I came up with the following requirements:

    • Not a complex game, rather a proof-of-concept
    • Space shooter theme
    • Canvas-based rendering, but no WebGL
    • Re-use of existing sprites (I am no artist)
    • Rudimentary audio support
    • AI/Bots
    • Working network multiplayer

    I worked on this project only in my spare time; after approx. two weeks, the work was done: Just Spaceships! This article describes some decisions and approaches I took during the development. If you are interested in the code, you can check the relevant Google Code project.

    Animation & timing

    There are two basic ways to maintain an animation: requestAnimationFrame and setTimeout/setInterval. What are their strengths and weaknesses?

    • requestAnimationFrame instructs the browser to execute a callback function when it is a good time (for animation). In most browsers this means 60fps, but other values are used as well (30fps on certain tablets). Deciding the correct timing is up to the browser; the developer has no control over it. When the page is in the background (the user switches to another tab), the animation can be automatically slowed down or even completely stopped. This approach is well suited for fluent animation tasks.
    • setTimeout instructs the browser to execute your next (animation) step after a given time has passed; the browser will try to fulfill this request as precisely as possible. No slowdowns are performed when the page is in the background, which means that this approach is well suited for physics simulation.

    To solve animation-related stuff, I created a HTML5 Animation Framework (HAF), which combines both approaches together. Two independent loops are used: one for simulation, the second one for rendering. Our objects (actors in HAF terminology) must implement two basic methods:

    /* simulation: this is called in a setTimeout-based loop */
    Ship.prototype.tick = function(dt) {
    	var oldPosition = this._position;
    	this._position += dt * this._velocity;
    	return (oldPosition != this._position);
    }
     
    /* animation: this is called in a requestAnimationFrame loop */
    Ship.prototype.draw = function(context) {
    	context.drawImage(this._image, this._position);
    }

    Note that the tick method returns a boolean value: it corresponds to the fact that a (visual) position of an object might (or might not) change during the simulation. HAF takes care of this – when it comes to rendering, it redraws only those actors that returned true in their tick method.
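
    The two driving loops themselves are not shown here; a rough, hypothetical sketch of how such a pair of loops could be wired up (not HAF’s actual implementation; the canvas context and the actor list are assumed to exist, and vendor prefixes are omitted) might look like this:

    var actors = [new Ship()]; /* assumed list of actors */
    var lastTick = Date.now();

    /* simulation loop: fixed-step, setTimeout-based */
    function simulationLoop() {
    	var now = Date.now();
    	var dt = now - lastTick;
    	lastTick = now;
    	actors.forEach(function(actor) { actor.dirty = actor.tick(dt); });
    	setTimeout(simulationLoop, 1000/60);
    }

    /* rendering loop: requestAnimationFrame-based, redraws only changed actors */
    function renderLoop() {
    	actors.forEach(function(actor) {
    		if (actor.dirty) { actor.draw(context); } /* context is assumed */
    	});
    	requestAnimationFrame(renderLoop);
    }

    simulationLoop();
    renderLoop();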

    Rendering

    Just Spaceships makes extensive use of sprites; pre-rendered images which are drawn to canvas using its drawImage method. I created a benchmarking page which you can try online; it turns out that using sprites is way faster than drawing stuff via beginPath or putImageData.

    Most sprites are animated: their source images contain all animation frames (such as in CSS sprites) and only one frame (rectangular part of the full source image) is drawn at a time. Therefore, the core drawing part for an animated sprite looks like this:

    Ship.prototype.draw = function(context) {
    	var fps = 10;          /* frames per second */
    	var totalFrames = 100; /* how many frames the whole animation has? */
    	var currentFrame = Math.floor(this._currentTime * fps) % totalFrames;
     
    	var size = 16; /* size of one sprite frame */
     
    	context.drawImage(
    		/* HTML <img> or another canvas */
    		this._image,
     
    		/* position and size within source sprite */
    		0, currentFrame * size, size, size,
     
    		/* position and size within target canvas */
    		this._x, this._y, size, size
    	);
    }

    By the way: the sprites for Just Spaceships! have been taken from Space Rangers 2, my favourite game.

    The golden rule of rendering is intuitive: redraw only those parts of the screen which actually changed. While it sounds pretty simple, following it might turn out to be rather challenging. In Just Spaceships, I took two different approaches for re-drawing sprites:

    • Ships, lasers and explosions use a technique known as “dirty rectangles”; when an object changes (moves, animates, …), we redraw (clearRect + drawImage) only the area covered by its bounding box. Some deeper bounding box analysis must be performed, because bounding boxes of multiple objects can overlap; in this case, all overlapping objects must be redrawn. (A small sketch of the idea follows this list.)
    • Starfield background has its own layer (canvas), positioned below other objects. The full background image is first pre-rendered into a large (3000×3000px) hidden canvas; when the viewport changes, a subset of this large canvas is drawn into background’s layer.
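
    A hypothetical sketch of the dirty-rectangles idea (not the game’s actual code; getBoundingBox is an assumed helper returning {x, y, w, h}):

    function redrawActor(context, actor) {
    	var box = actor.getBoundingBox(); /* assumed helper */
    	context.clearRect(box.x, box.y, box.w, box.h); /* erase the old pixels */
    	actor.draw(context);                           /* draw the new state */
    }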

    Sound

    Using HTML5 audio is a piece of cake – at least for desktop browsers. There were only two issues that needed to be taken care of:

    1. File format – the format war is far from being over; the most universal approach is to offer both MP3 and OGG versions.
    2. Performance issues on certain slower configurations when playing multiple sounds simultaneously. More particularly, my Linux box exhibited some slowdowns; it would be best to offer an option to turn the whole audio off (not implemented yet).

    At the end of the day, the audio code looks basically like this:

    /* detection */
    var supported = !!window.Audio && !audio_disabled_for_performance_reasons;
     
    /* format */
    var format = new Audio().canPlayType("audio/ogg") ? "ogg" : "mp3";
     
    /* playback */
    if (supported) { new Audio(file + "." + format).play(); }

    Multiplayer & networking model

    Choosing a proper networking model is crucial for user experience. For a realtime game, I decided to go with a client-server architecture. Every client maintains a WebSocket connection to central server, which controls the game flow.

    In an ideal world, clients would send just keystrokes and the server would repeat them to other clients. Unfortunately, this is not possible (due to latency); to maintain consistency and synchronicity between players, it is necessary to run the full simulation on the server and periodically notify clients about various physical attributes of ships and other entities. This approach is known as an authoritative server.

    Finally, clients cannot just wait for server packets to update their state; players need the game to work even between periodic server messages. This means that browsers run their own version of the simulation as well – and correct their internal state with data sent by the server. This is known as client-side prediction. A sample implementation of these principles looks like this:

    /* physical simulation step - client-side prediction */
    Ship.prototype.tick = function(dt) {
    	/*
    		assume only these physical properties:
    			acceleration, velocity and position
    	*/
    	this._position += dt * this._velocity;
    	this._velocity += dt * this._acceleration;
    }
     
    /* "onmessage" event handler for a WebSocket data connection;
    	used to override our physical attributes by server-sent values */
    Ship.prototype.onMessage = function(e) {
    	var data = JSON.parse(e.data);
     
    	this._position = data.position;
    	this._velocity = data.velocity;
    	this._acceleration = data.acceleration;
    }

    You can find very useful reading about this at Glenn Fiedler’s site.

    Multiplayer & server

    Choosing a server-side solution was very easy: I decided to go with v8cgi, a multi-purpose server-side JavaScript environment based on V8. Not only is it older than Node, but (most importantly) it was created and maintained by myself ;-).

    The advantage of using server-side JavaScript is obvious: the game server runs the identical code that is executed in the browser. Even HAF works server-side; I just turned off its rendering loop and the simulation works as expected. This is a cool demonstration of client-server code sharing; something we will probably see more and more of in the upcoming years.

    Modulus

    In order to make the game more interesting and challenging, I decided that the whole playing area should wrap around – think of the game universe as of a large toroidal surface. When a spaceship flies far to the left, it will appear from the right; the same holds for other directions as well. How do we implement this? Let’s have a look at a typical simulation time step:

    /* Variant #1 - no wrapping */
    Ship.prototype.tick = function(dt) {
    	this._position += dt * this._velocity;
    }

    To create an illusion of a wrapped surface, we need the ship to remain in a fixed area of a given size. A modulus operator comes to help:

    /* Variant #2 - wrapping using modulus operator */
    Ship.prototype.tick = function(dt) {
    	var universe_size = 3000; // some large constant number
    	this._position = (this._position + dt * this._velocity) % universe_size;
    }

    However, there is a glitch. To see it, we have to solve this (elementary school) formula:

    (-7) % 5 = ?

    JavaScript’s answer is -2; Python says 3. Who is the winner? The correctness of the result depends on the definition, but for my purpose, the positive value is certainly more useful. A simple trick was necessary to correct JavaScript’s behavior:

    /* Returned value is always >= 0 && < n */
    Number.prototype.mod = function(n) {
    	return ((this % n) + n) % n;
    }
     
    /* Variant #3 - wrapping using custom modulus method */
    Ship.prototype.tick = function(dt) {
    	var universe_size = 3000; // some large constant number
    	this._position = (this._position + dt * this._velocity).mod(universe_size);
    }

    Lessons learned

    Here is a short summary of general tips & hints I gathered when developing JS games in general, but mostly during Just Spaceships development:

    • Know the language! It is very difficult to create anything without properly understanding the programming language.
    • Lack of art (sprites, music, sfx, …) should not stop you from developing. There are tons of resources for getting these assets; the need for original art is relevant only in later stages of the project.
    • Use the paper, Luke! You know what my favorite development tools are? A pen and a sheet of squared paper.
    • If you are not 100% sure about the whole game architecture, start with smaller (working) parts. Refactoring them later to form a larger project is natural, useful and easy.
    • Collect feedback as soon as possible – at the end of the day, it is the users’ opinion that matters the most.

    TODO

    As stated before, Just Spaceships is not a complete project. There is definitely room for improvements, most notably:

    • Optimize the network protocol by reducing the amount of data sent.
    • Offer more options to optimize canvas performance (decrease simulation FPS, turn off background, turn off audio, …).
    • Improve the AI by implementing different behavior models.

    Even with these unfinished issues, I think the game reached a playable state. We had quite a lot of fun testing its multiplayer component; I hope you will enjoy playing it!