Featured Demos

  1. Firefox 4 Demos: 3 new demos!

    Firefox 4 RC is here!

    And as promised last week, more demos are coming on our Web O’ Wonder website! Here is another round with 3 new demos. Take a look!

    360° Video

    Navigate into a panoramic video.

    Try it yourself

    Source code

    Motivational Poster Editor

    Create your own motivational poster.

    Try it yourself

    Source code

    Letter Heads

    By @simurai

    Human silhouette built with CSS3 shadows.

    Try it yourself

    Source code

  2. Firefox 4 Web Demos: announcing Web O' Wonder

    Firefox 4 is almost here. Check out some awesome Web demos on our brand new demo web site: Web O’ Wonder. Screencast here.

    Update: 3 more demos:

    Web O’ Wonder

    Firefox 4 is almost here, and comes with a huge list of awesome features for web developers.

    In order to illustrate all these new technical features, we put together several Web demos. You’ll see a couple of demos released every week until the final version of Firefox 4. You can see the first 3 demos online now on our brand new demo web site, Web O’ Wonder:

    Web O’ Wonder

    Note: Some of these demos use WebGL, and will work only on compatible hardware. Make sure you are using the latest drivers for your graphics card.

    Open Demos

    These demos haven’t been designed exclusively for Firefox 4. We believe that the Web is not the property of any single browser, which is why we made sure that these demos work on other modern browsers, like Chrome, when possible.

    The other big step forward for HTML5 demos is that these are open source. Help us improve them! The source code of every single demo is available on GitHub, and we encourage you to fork them and play.

    New technologies and new horizons for 400 million users

    In these demos, we are experimenting with new technologies, trying to imagine what could be the future of the Web. HTML5, CSS3, WebGL, SVG, blazing fast JavaScript and new APIs. Sounds exciting, but what does it really mean for the user?

    It means new possibilities, new ways to interact with web applications. It means richer and more beautiful applications. It means a smoother and faster experience and this is what we want to demonstrate with these demos.

    We hope to inspire web developers to use these new technologies. This is just the beginning of a new generation of websites. Web developers now have the tools to change the face of the internet and here you have a few blueprints to start from.

    Thank you!

    A lot of people were involved in the development of these demos. The Mozilla community is full of talented people who did a lot of incredible work to make these demos happen. I just want to say a big thank you to everyone involved (and make sure to check out the credits on the demos!).

    Oh, look! A screencast:

  3. Firefox 4 Beta: Latest Update is Here, with WebGL

    The new Firefox 4 Beta is here, and comes with WebGL activated by default. You can download this new beta here.

    Flight of the Navigator is a WebGL + Audio API demo developed by a team of Mozilla volunteers.

    You can see the demo online here (you need a WebGL compatible browser).

    More information about this demo can be found on David Humphrey’s blog (he is one of the developers of this demo), and more about WebGL in Firefox Beta 8 on Vlad’s blog.

    The screencast:

  4. Firefox 4: hardware acceleration

    The latest Firefox 4 beta has just been released. It comes with Direct2D acceleration activated by default.

    What is hardware acceleration?

    “Hardware acceleration” basically means using the GPU instead of the CPU, when possible, to make page-drawing operations faster.

    There are two different levels of acceleration going on:

    • Content acceleration speeds up rendering the actual page content, such as the text, images, CSS borders, etc. Content acceleration is also used to speed up the 2D HTML canvas. On Windows Vista/7, we use Direct2D for this and it has been activated in this new beta.
    • Compositing acceleration speeds up putting together chunks of already-rendered content (layers) for final display in the window, potentially applying effects such as transforms along the way. For example, if you had an HTML5 video playing, with CSS effects applied to it that made it rotate and fade in and out, compositing acceleration would be used to make that fast. (This feature is not activated by default yet.)

    >> Run the test yourself: Hardware Acceleration Stress Test. <<
    Credits for the photos: Paul (dex).

    Hardware Acceleration by operating system:

    These optimizations are available only if you have compatible hardware and the associated drivers.

    Operation     Linux     Windows XP   Windows Vista/7   Mac OS X
    Content       XRender   None         Direct2D          Quartz [1]
    Compositing   OpenGL    Direct3D     Direct3D          OpenGL

    [1]: Quartz is basically CPU-only. QuartzGL (GPU acceleration for the Quartz 2D API) is not activated in Firefox for now (nor in other browsers).

    Important note: Don’t confuse hardware acceleration with WebGL. WebGL is an OpenGL-like API for JavaScript that draws 3D objects into a <canvas> element. Obviously, WebGL is itself hardware accelerated, since it uses OpenGL (or Direct3D through ANGLE on Windows if no OpenGL drivers are present).
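As a quick illustration of how a page typically obtains a WebGL context (a hedged sketch; around the Firefox 4 era the context name was still "experimental-webgl", so it is common to fall back through both names):

```javascript
// Try the standard context name first, then the experimental name used by
// early WebGL implementations. Returns null when WebGL is unavailable.
function getGLContext(canvas) {
  var names = ["webgl", "experimental-webgl"];
  for (var i = 0; i < names.length; i++) {
    try {
      var gl = canvas.getContext(names[i]);
      if (gl) {
        return gl;
      }
    } catch (e) {
      // some browsers throw instead of returning null
    }
  }
  return null;
}
```

In a page you would pass in an actual <canvas> element; a null result means you should fall back to a non-WebGL rendering path.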

    We need help!

    Help us to improve hardware acceleration in Firefox: Install the Grafx Bot extension (details here and add-on here).

    Firefox’s hardware acceleration interacts with a machine’s graphics hardware via DirectX or OpenGL, depending on platform. These interactions tend to be very sensitive to the graphics environment on the system (e.g., the specific video card(s) on the system, how much VRAM is available, the version of the video driver, the OS version, etc). In fact, there are so many permutations of the relevant factors that we can’t test them all internally.

    Grafx Bot runs a suite of automatic tests on your machine that exercises interesting aspects of hardware acceleration (for about 5 to 20 minutes). At the end of the tests, you can send your results to Mozilla (with anonymous video configuration information), where the data will be collected and analyzed, and hopefully lead to bug fixes and more reliable code for hardware acceleration than we’d otherwise have.

    We need help from the community, so we can get exposure on as many unique hardware environments as possible.

  5. Firefox 4 Beta 2 is here – Welcome CSS3 transitions

    As we have explained before, Mozilla is now making more frequent updates to our beta program. So here it is: Firefox 4 Beta 2 has just been released, three weeks after Beta 1.

    Firefox 4 Beta 1 already brought a large amount of new features (see the Beta 1 feature list). So what’s new for web developers in this beta?

    Performance & CSS3 Transitions

    The two major features for web developers in this release are performance improvements and CSS3 Transitions on CSS3 Transforms.

    This video is hosted by YouTube and uses the HTML5 video tag if you have enabled it (see here). YouTube video here.

    Performance: In this new Beta, Firefox comes with a new page building mechanism: Retained Layers. This mechanism provides noticeably faster rendering for web pages with dynamic content, and scrolling is much smoother. Also, we’re still experimenting with hardware acceleration: using the GPU to render and build some parts of the web page.

    CSS3 Transitions on transforms: The major change for web developers is probably CSS3 Transitions on CSS3 Transforms.

    CSS3 Transitions provide a way to animate changes to CSS properties, instead of having the changes take effect instantly. See the documentation for details.

    This feature was available in Firefox 4 Beta 1, but in this new Beta you can use Transitions on Transforms.

    A CSS3 Transform lets you define a transformation (scale, translate, skew) on any HTML element, and you can animate this transformation with transitions.

    See this box? Move your mouse over it, and its transform will animate smoothly from transform: rotate(10deg); to transform: rotate(350deg) scale(1.4) rotate(-30deg);.
    #victim {
      background-color: yellow;
      color: black;
      transition-duration: 1s;
      transform: rotate(10deg);
      /* Prefixes */
      -moz-transition-duration: 1s;
      -moz-transform: rotate(10deg);
      -webkit-transition-duration: 1s;
      -webkit-transform: rotate(10deg);
      -o-transition-duration: 1s;
      -o-transform: rotate(10deg);
    }
    #victim:hover {
      background-color: red;
      color: white;
      transform: rotate(350deg) scale(1.4) rotate(-30deg);
      /* Prefixes */
      -moz-transform: rotate(350deg) scale(1.4) rotate(-30deg);
      -webkit-transform: rotate(350deg) scale(1.4) rotate(-30deg);
      -o-transform: rotate(350deg) scale(1.4) rotate(-30deg);
    }

    CSS3 Transitions are supported by WebKit-based browsers (Safari and Chrome), Opera, and now Firefox as well. If transitions aren’t supported, degradation is graceful: there is no animation, but the style is still applied. Therefore, you can start using them right away.
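Since support still varies across browsers, you may want to detect transitions before relying on the animation. A minimal sketch (the function name is ours; it checks a style object for the unprefixed and vendor-prefixed properties):

```javascript
// Returns true if the given style object exposes a transition property,
// prefixed or not. In a browser, pass in an element's style object.
function supportsTransitions(style) {
  var props = ["transition", "MozTransition", "WebkitTransition", "OTransition"];
  for (var i = 0; i < props.length; i++) {
    if (props[i] in style) {
      return true;
    }
  }
  return false;
}
```

In a page you would call supportsTransitions(document.createElement("div").style); remember that even without detection, degradation is graceful.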


    I’ve written a couple of demos to show both CSS3 Transitions on Transforms and hardware acceleration (See the video above for screencasts).

    This demo shows 5 videos. The videos are black-and-white in the thumbnails (using an SVG filter) and in color at full size (click on them). The “whirly” effect is done with CSS Transitions. Move your mouse over a video and you’ll see a question mark (?) button. Click on it for details about the video and to see another SVG filter applied (feGaussianBlur).
    This page shows 2 videos. The top left video is a round video (thanks to SVG clip-path) with SVG controls. The main video is clickable (magnifies the video). The text on top of the video is clickable as well, to send it to the background using CSS Transitions.
    This page is a simple list of images, video and canvas elements. Clicking on an element applies a CSS Transform to the page itself, with a transition. White elements are clickable (they play a video or bring up a WebGL object). I encourage you to use a hardware-accelerated, WebGL-capable browser. For Firefox on Windows, you should enable Direct2D.


    Creative Commons videos:

    The multicolor cloud effect (MIT License)

  6. Beyond HTML5: experiments with interactive audio

    This is a re-post of an important post from David Humphrey, who has been doing a lot of experiments on top of Mozilla’s extensible platform with multi-touch, sound, video, WebGL and all sorts of other goodies. It’s worth going through all of the demos below. You’ll find some stuff that will amaze and inspire you.

    David’s work is important because it’s showing where the web is going, and where Mozilla is helping to take it. It’s not enough that we’re working on HTML5, which we’re about finished with, but we’re trying to figure out what’s next. Mozilla’s platform, Gecko, is a huge part of why we’re able to experiment and learn as fast as we can. And that’s reflected with what’s possible here. It’s a web you can see, touch and interact with in new ways.

    David’s post follows:

    I’m working with an ever growing group of web, audio, and Mozilla developers on a project to expose audio spectrum data to JavaScript from Firefox’s audio and video elements. Today we show what we did at www2010.

    I’m in Raleigh, North Carolina, with Al MacDonald for the www2010 conference. We’re here to present our work on exposing audio data in the browser. Over the past month Corban, Charles, and a bunch of other friends have been working with us to refine the API and get new types of demos ready. We ended up with 11 demos, some of which I’ve shown here before. Here are the others.

    The first was done by Jacob Seidelin, and shows many cool 2D visualizations of audio using our API. You can see the live version on his site, or check out this video:

    The second and third demos were done by Charles Cliffe, and show 3D visualizations using WebGL and his CubicVR engine. These also show off his JavaScript beat detection code. Is JavaScript fast enough to do real-time analysis of audio and synchronized 3D graphics? Yes, yes it is. The live versions are here and here, and here are some videos:

    The fourth demo was done by Corban Brook and shows how audio data can be mixed live using script. Here he mutes the main audio, plays it, passes the data through a low pass filter written in JavaScript, then dumps the modified frames into a second audio element to be played. It’s a technique we need to apply more widely, as it holds a lot of potential. Here’s the live version, and here’s a video (check out his updated JavaScript synthesizer, which we also presented):
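To give an idea of what “a low pass filter written in JavaScript” can look like, here is a minimal one-pole filter over an array of samples (an illustrative sketch, not Corban’s actual code):

```javascript
// One-pole low-pass filter: each output sample moves toward the input by a
// fraction alpha (0 < alpha <= 1). Smaller alpha cuts more high frequencies.
function lowPass(samples, alpha) {
  var out = new Array(samples.length);
  var prev = 0;
  for (var i = 0; i < samples.length; i++) {
    prev = prev + alpha * (samples[i] - prev);
    out[i] = prev;
  }
  return out;
}
```

With the audio data API described here, the samples would come from the audio framebuffer events, and the filtered frames would be written into a second audio element for playback, as in the demo.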

    The fifth and sixth demos were done by Al (with the help of many). When I was last in Boston, for the Processing.js meetup at Bocoup, we met with Doug Schepers from the W3C. He loved our stuff, and was talking to us about ideas that would be cool to build. He pulled out his iPhone and showed us Brian Eno’s Bloom audio app. “It would be cool to do this in the browser.” Yeah, it is cool, and here it is, written in a few hundred lines of JavaScript and Processing.js (video 1, video 2):

    This demo also showcases the awesome work of Felipe Gomes, who has a patch to add multi-touch DOM events to Firefox. The method we’ve used here can be taken a lot further. Imagine being able to connect multiple browsers together for collaborative music creation, layering other audio underneath, mixing fragments vs. just oscillators, etc. We built this one in a week, and the web is capable of a lot more.

    One of the main points of our talk was to emphasize that what we’re talking about here isn’t just a concept, and it isn’t some far away future. This is real code, running in a real browser, and it’s all being done in HTML5 and JavaScript. The web is fast enough to do real-time audio processing now, powerful enough and expressive enough to create music. And the community of digital music and audio hackers, visualizers, etc. are hungry for it. So hungry that they are seeking us out, downloading our hacked builds and creating gorgeous web audio applications.

    We want to keep going, and we need help. We need help from those within Mozilla, the W3C, and other browsers to get this stuff into shipping browsers. We need the audio, digital music, accessibility, and web communities to come together in order to help us build js audio libraries and more sample applications. Yesterday Joe Hewitt was talking on twitter about how web browser vendors need to experiment more with non-standard APIs. I couldn’t agree more, and here’s a chance for people to put their money where their mouth is. Let’s make audio a scriptable part of the open web.

    I’m currently creating new builds of our updated patch for Firefox, and will post links to them here when I’m done. You can read more about the technical details of our work here, and get involved in the bug here. You can talk more with me on irc in the processing.js channel (I’m humph on moznet), or talk to me on twitter (@humphd) or by email. One way or another, get in touch so you can help us push this forward.

  7. Real-time server visualization with canvas and processing.js

    This is a guest blog post by Logan Welliver, Chief Creative at Cloudkick. He is a graphic designer by training and a web designer in practice.

    Cloud management company Cloudkick has released a real-time server monitoring visualization based on canvas and processing.js, co-developed with Alastair McDonald of processing.js fame. The product is designed to let users keep a finger on the pulse of their infrastructure, quickly identify problem nodes, and visualize aggregate performance with an easy-to-digest interface.

    The tool uses canvas and processing.js to plot servers as stylized circles in 3 dimensions, with axes mapped to one of three performance metrics: CPU usage, memory usage, and ping latency. Each server’s radius is determined by its relative prowess (i.e. an EC2 extra large is bigger than a 256 MB Slice), and colors are customizable via the Cloudkick dashboard. Each server sparkles when the monitoring system returns data, and servers with problems identify themselves by flashing an angry red.
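As an illustration of the radius mapping described above (a sketch with made-up numbers, not Cloudkick’s actual code), one could scale a server’s radius linearly by its memory, clamped to a pixel range:

```javascript
// Map a server's memory (in MB) onto a circle radius between minR and maxR
// pixels, clamping values outside the fleet's [minMb, maxMb] range.
function radiusFor(memoryMb, minMb, maxMb, minR, maxR) {
  var t = (memoryMb - minMb) / (maxMb - minMb);
  if (t < 0) t = 0;
  if (t > 1) t = 1;
  return minR + t * (maxR - minR);
}
```

The resulting radius would then be fed straight into the processing.js ellipse call that draws the circle.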

    Canvas and processing.js take care of all the presentation, powered by a slew of back-end services that do everything from monitoring servers to pushing data in real-time back to the user.

    Here’s a brief overview of the back-end architecture: instances of the Cloudkick Agent (running on individual servers) report metrics to an endpoint, which talks to Reconnoiter, which then publishes messages to RabbitMQ. An internal service called Livedata consumes these messages, finds the ones applicable to an account, and publishes messages back to RabbitMQ. Orbited consumes these messages and sends them to the browser. From agent to browser, the round-trip time is less than a second.

    Cloudkick has partnered with Mozilla to provide the visualization for their servers. You can see how they’re behaving in a live demo of the visualization here: infrastructure in Cloudkick Viz.

    Get the visualization for your own servers. Cloudkick is offering 20% off for the first 100 Mozilla Hacks readers, using promo code “mozhacks01”.

  8. an HTML5 offline image editor and uploader application

    Many web applications use image uploaders: image hosting websites, blog publishing applications, social networks, among many others. Such uploaders have limitations: you can’t upload more than one file at a time and you can’t edit the image before sending it. A plugin is the usual workaround for uploading more than one image, and image modifications are usually done on the server side, which can make the editing process more cumbersome.

    Firefox 3.6 offers many new Open Web features to web developers, including even more HTML5 support. This post describes how to create a sophisticated image editor and uploader built using Open Web technologies.

    See below for a video of the demo with some background.

    Hosted on hacks, publishes to twitpic

    Our web application uploads pictures to Twitpic, an image hosting service for Twitter.

    Note that the code for this application is actually hosted on the hacks web server but can still upload to Twitpic. Uploading is possible because Twitpic opened up its API to cross-domain XMLHttpRequests for applications like this. (Thank you, Twitpic!)

    Web Features

    The demo uses the following features from HTML5 included in Firefox 3.6:

    • HTML5 Drag and Drop: You can drag and drop items inside of the web page and drag images from your desktop directly into the browser.
    • HTML5 localStorage: to store the image data across browser restarts
    • HTML5 Application Cache: This allows you to create applications that can be used when the browser isn’t connected to the Internet. It stores the entire app in the browser and also gives you access to online and offline events so you know when you need to re-sync data with a server.
    • HTML5 Canvas: The HTML5 Canvas element is used throughout this demo to edit and render images.
    • Cross-Origin Resource Sharing to host an application at one site and publish data to another.

    What’s in the demo?

    See the live demo to try the image uploader for yourself. You will need Firefox 3.6 and a Twitter account.

    Here’s a full list of the things that this application can do:

    • You can drag images from your desktop or the web into the application.
    • You can see a preview of each image you want to upload.
    • You can drag previews to the trash to delete an image.
    • Images are automatically made smaller if they are bigger than 600px wide.
    • You can edit any of the images before uploading. This includes being able to rotate, flip, crop or turn an image black and white.
    • If you edit an image, it’s saved locally so you can still edit while you’re offline. If you close the tab, or restart Firefox or your computer, the images will be there when you load the page again, so you can upload when you’re re-connected.
    • It will upload several files at once and provide feedback as the upload progresses.
    • The HTML5 Offline Application Cache makes the application load very quickly since it’s all stored offline.

    Under the Hood

    Let’s quickly go over all of the technology that we’re using in this application.


    Twitpic was nice enough to open their API to allow XMLHttpRequests from any other domain. This means that you can now use their API from your own website and offer your own image uploader.

    If you’re running a web site with an API you want people to use from other web sites, you can do that with cross-site HTTP requests. To support this, you need to add an HTTP header to responses from your web server saying which domains you allow. As an example, here’s how Twitpic allows access from all domains:

    Access-Control-Allow-Origin: *

    It’s important to note that opening your API does have security implications, so you should be careful to understand those issues before blindly opening an API. For more details, see the MDC documentation on CORS.
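On the server side, the decision usually boils down to computing the value of that header. A sketch of the logic (function and parameter names are ours), assuming the server keeps a whitelist of allowed origins:

```javascript
// Decide which Access-Control-Allow-Origin value to send back, given the
// request's Origin header and a whitelist. Returns null when access is denied.
function allowOriginHeader(origin, whitelist) {
  if (whitelist.indexOf("*") !== -1) {
    return "*"; // open to everyone, as Twitpic does
  }
  if (whitelist.indexOf(origin) !== -1) {
    return origin; // echo back the allowed origin
  }
  return null; // omit the header and the browser will block the response
}
```

A null result means the server simply doesn’t send the header, and the requesting page never sees the response body.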

    Drag and Drop

    Drag and Drop is a mechanism with two important features:

    • Dragging files from your Desktop to your web page.
    • Native Drag and Drop inside your web page (not just changing the coordinates of your elements).

    The image uploader uses Drag and Drop to allow the user to add files from the Desktop, to remove files (by dragging them to the trash) and to insert a new image into a current image.

    For more on Drag and Drop, see previous hacks articles, in particular how to use Drag and Drop in your application.
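When a drop event fires, the dropped files live on event.dataTransfer.files. A small hedged sketch (our own helper, not the demo’s exact code) that keeps only the image files from a drop:

```javascript
// Collect the image files from a drop event's dataTransfer object,
// skipping anything whose MIME type is not image/*.
function imageFilesFrom(dataTransfer) {
  var images = [];
  for (var i = 0; i < dataTransfer.files.length; i++) {
    var file = dataTransfer.files[i];
    if (/^image\//.test(file.type)) {
      images.push(file);
    }
  }
  return images;
}
```

In the actual drop handler you would call e.preventDefault() first, then pass e.dataTransfer to a helper like this before reading each file.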

    Canvas to Edit Images

    Once images have been dragged and dropped into the web page, the image uploader lets you edit them before uploading. This is possible because images are actually copied to canvas elements via the File API.

    In this case, the editing process is really basic: rotate, flip, add text, black and white, crop. However, you can imagine offering many other features in your version of the editor (see Pixastic for example, or this inlay feature here).

    Using canvas and the File API also lets you resize the image before sending it. Here, every image is converted to a new image (canvas) that is no more than 600px wide.
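The resizing step amounts to computing target dimensions that preserve the aspect ratio, then drawing the image into a canvas of that size. A sketch of the dimension computation (the helper name is ours; 600 matches the limit used by the demo):

```javascript
// Compute the dimensions of the resized image: cap the width at maxWidth
// while preserving the aspect ratio. Images already small enough pass through.
function fitWidth(width, height, maxWidth) {
  if (width <= maxWidth) {
    return { width: width, height: height };
  }
  var ratio = maxWidth / width;
  return { width: maxWidth, height: Math.round(height * ratio) };
}
```

In the browser you would then create a canvas with these dimensions and draw the image into it with ctx.drawImage(img, 0, 0, size.width, size.height).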

    localStorage: Save Local Data

    It’s possible to store local data persistently in a web page using localStorage, up to 5 MB of data per domain.

    In the image uploader, localStorage is used to store images and credentials. Since images are actually canvas, you can store them as data URLs:

    var url = canvas.toDataURL("image/png");
    localStorage.setItem("image1", url);

    LocalStorage support means that you can edit an image, close Firefox, switch off your computer, and the edited image will still be there when you restart Firefox.


    If you add a manifest file listing all remote files needed to display your web application, it will work even when you aren’t connected to the Internet. A nice side effect is that it will also make your application load much faster.

    Here, the html element refers to a manifest file:

    <html manifest="offline.manifest">

    And the manifest file starts with the mandatory CACHE MANIFEST line, followed by the list of files to cache. A version comment is commonly included, because changing any byte of the manifest (even a comment) is what triggers the browser to update its cache:

    CACHE MANIFEST
    # v2.4

    You can also catch offline and online events to know if the connection status changes.
    For more information see our last article about offline.


    Firefox 3.6 allows millions of people to take advantage of modern standards, including HTML5.
    The image uploader described here shows how a web page can really be considered an application, since it interacts with your Desktop and works offline.

    Here are a few tips for writing your next application using Open Web technologies:

    Allow cross-domain XMLHttpRequest:
    If it makes sense for your service, allow people to access your API from a different domain; you’ll be amazed at the apps people will come up with.

    Allow multiple input:
    Let people drag files to your application and use <input type="file" multiple> so they can select several files at once. In this demo, the multiple input is visible only in the mobile version, but for accessibility, don’t forget to offer it as an alternative to Drag and Drop.

    Use native Drag and Drop:
    Drag and Drop is usually simulated by updating an element’s coordinates on the mousemove event. When you can, use the native mechanism.

    Use the File API:
    To pre-process a file before even talking to a server.

    Support offline:
    Store data and use a manifest to keep your application and its data available while offline.

    Use Canvas:
    Canvas is the most widely implemented HTML5 element. It works everywhere (even if it has to be simulated), so use it!

    Think “Client Side”: HTML5, CSS3 and the new powerful JavaScript engines let you create amazing applications; take advantage of them!

    We look forward to seeing the great new applications you’ll come up with using Open Web technologies!

  9. offline web applications

    The network is a key component of any web application, whether it is used to download JavaScript, CSS, and HTML source files and accompanying resources (images, videos, …) or to reach web services (XMLHttpRequest and form submissions).

    Yet having offline support for web applications can be very useful to users. Imagine, for example, a webmail application that allows users to read emails already in their inbox and write new messages even when they are not connected.

    The mechanism used to support offline web applications can also be used to improve an application’s performance by storing data in the cache or to make data persistent between user sessions and when reloading and restoring pages.

    Demo: a To Do List Manager

    To see an offline web application in action, watch Vivien Nicolas’ demo (OGV, MP4), which shows a to do list manager working online and offline on an N900 running Firefox.

    You can also check out the live demo of the application.

    Creating your Own Offline Application

    For a web application to work offline, you need to consider three things:

    • storing data locally so it persists between sessions (DOM storage);
    • detecting whether the user is online or offline;
    • making the needed files available offline (the cache manifest).

    Let’s see how to use each of these components.

    Storage: Persistent Data

    DOM storage lets you store data between browser sessions, share data between tabs and prevent data loss (for example from page reloads or browser restarts). The data are stored as strings (for example a JSONified JavaScript object) in a Storage object.

    There are two kinds of storage global objects: sessionStorage and localStorage.

    • sessionStorage maintains a storage area that’s available for the duration of the page session. A page session lasts for as long as the browser is open and survives over page reloads and restores. Opening a page in a new tab or window causes a new session to be initiated.
    • localStorage maintains a storage area that can be used to hold data over a long period of time (e.g. over multiple pages and browser sessions). It’s not destroyed when the user closes the browser or switches off the computer.

    Both localStorage and sessionStorage use the following API:

    window.localStorage and window.sessionStorage {
      long length;                            // Number of items stored
      string key(long index);                 // Name of the key at index
      string getItem(string key);             // Get value of the key
      void setItem(string key, string data);  // Add a new key with value data
      void removeItem(string key);            // Remove the item key
      void clear();                           // Clear the storage
    }

    Here is an example showing how to store and how to read a string:

    // save the string
    function saveStatusLocally(txt) {
      window.localStorage.setItem("status", txt);
    }

    // read the string
    function readStatus() {
      return window.localStorage.getItem("status");
    }

    Note that the storage properties are limited to an HTML5 origin (scheme + hostname + non-standard port). This means that the window.localStorage of one origin is a different instance from the window.localStorage of another; for example, a page served over http cannot access the localStorage of the same site served over https.
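The origin rule can be sketched as follows (illustrative code, not the browser’s implementation; default ports are filled in so that an explicit :80 matches a plain http URL):

```javascript
// Extract the scheme://host:port origin of a URL, or null if it doesn't parse.
function originOf(url) {
  var m = /^(https?):\/\/([^\/:]+)(?::(\d+))?/.exec(url);
  if (!m) {
    return null;
  }
  var port = m[3] || (m[1] === "https" ? "443" : "80");
  return m[1] + "://" + m[2] + ":" + port;
}

// Two pages share a localStorage instance only when their origins are equal.
function sharesStorage(a, b) {
  return originOf(a) !== null && originOf(a) === originOf(b);
}
```

Changing the scheme, the hostname, or the port produces a different origin, and therefore a separate storage area.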

    Are We Offline?

    Before storing data, you may want to know if the user is online or not. This can be useful, for example, to decide whether to store a value locally (client side) or to send it to the server.

    Check whether the user is online with the navigator.onLine property.
    In addition, you can be notified of connectivity changes by listening to the online and offline events on the window object.

    Here is a very simple piece of JavaScript code that sends your status to a server (à la Twitter).

    • If you set your status and you’re online, it sends the status.
    • If you set your status and you’re offline, it stores your status.
    • If you go online and have a stored status, it sends the stored status.
    • If you load the page, are online, and have a stored status, it sends the stored status.
    function whatIsYourCurrentStatus() {
      var status = window.prompt("What is your current status?");
      if (!status) return;
      if (navigator.onLine) {
        sendToServer(status); // sendToServer() is the app's own upload function (not shown here)
      } else {
        saveStatusLocally(status);
      }
    }

    function sendLocalStatus() {
      var status = readStatus();
      if (status) {
        sendToServer(status);
        window.localStorage.removeItem("status");
      }
    }

    window.addEventListener("load", function() {
      if (navigator.onLine) {
        sendLocalStatus();
      }
    }, true);

    window.addEventListener("online", function() {
      sendLocalStatus();
    }, true);

    window.addEventListener("offline", function() {
      alert("You're now offline. If you update your status, it will be sent when you go back online");
    }, true);

    Offline Resources: the Cache Manifest

    When offline, a user’s browser can’t reach the server to get any files that might be needed. You can’t always count on the browser’s cache to include the needed resources because the user may have cleared the cache, for example. This is why you need to define explicitly which files must be stored so that all needed files and resources are available when the user goes offline: HTML, CSS, JavaScript files, and other resources like images and video.

    The manifest file is specified in the HTML and contains the explicit list of files that should be cached for offline use by the application.

    <html manifest="offline.manifest">

    Here is an example of the contents of a manifest file (the file names here are illustrative):

    CACHE MANIFEST
    # v1.0
    index.html
    style.css
    app.js
    images/logo.png

    The MIME type of the manifest file must be: text/cache-manifest.

    See the documentation for more details on the manifest file format and cache behavior.


    The key components you should remember to think about when making your application work offline are to store the user inputs in localStorage, create a cache manifest file, and monitor connection changes.

    Visit the Mozilla Developer Center for the complete documentation.

  10. interactive file uploads with Drag and Drop, FileAPI and XMLHttpRequest

    In previous posts, we showed how to access a file through the input tag or through the Drag and Drop mechanism. In both cases, you can use XMLHttpRequest to upload the files and follow the upload progress.


    If you’re running the latest beta of Firefox 3.6, check out our file upload demo.


    XMLHttpRequest will send a given file to the server as a binary array, so you first need to read the content of the file as a binary string, using the File API. Because both Drag and Drop and the input tag allow you to handle multiple files at once, you’ll need to create as many requests as there are files.

    var reader = new FileReader();
    reader.onload = function(e) {
      var bin = e.target.result;
      // bin is the binaryString
    };
    reader.readAsBinaryString(file); // "file" comes from the input tag or the drop event

    Once the file is read, send it to the server with XMLHttpRequest:

    var xhr = new XMLHttpRequest();
    xhr.open("POST", "upload.php");
    xhr.overrideMimeType('text/plain; charset=x-user-defined-binary');
    xhr.sendAsBinary(bin); // sendAsBinary() is a Firefox-specific method for sending a binary string

    You can choose to be notified when specific events, such as error, success, or abort, occur during the request (see the MDC documentation for more details).

    Following the Upload Progress

    The progress event provides the size of the uploaded portion of the binary content. This allows you to easily compute how much of the file has been uploaded.

    Here’s an example showing the percentage of the file that has been uploaded so far:

    xhr.upload.addEventListener("progress", function(e) {
      if (e.lengthComputable) {
        var percentage = Math.round((e.loaded * 100) / e.total);
        // do something with percentage
      }
    }, false);