The Particle server

In my previous blog post, I wrote about the concept of my project using Particle. Now I will explain what I had to do to increase the data transfer rate of my modules (remember, my goal is to get data with a transfer interval as close as possible to 1 [ms]).

First, I installed the local API server (github.com/spark/spark-server).

Then I had to register each photon's public key on my server, and the server's public key on each photon.

For the latter, I used this command:

particle keys server local_server_key.pub.pem IP_ADDRESS

Then I launched the server to check that my photons were responding, with output like this:

Connection from: 192.168.1.159, connId: 1
on ready { coreID: '48ff6a065067555008342387',
 ip: '192.168.1.159',
 product_id: 65535,
 firmware_version: 65535,
 cache_key: undefined }
Core online!

So far everything was working fine, but I also needed the JavaScript library (particle-api-js) to authenticate via OAuth and get data. The problem is that it takes a lot of configuration to make it work against a custom server, and that was not the goal of this project; I had to test as quickly as possible. So I did what you usually should not do with a library installed via npm: I edited it in place.

In the file “node_modules/particle-api-js/lib/Defaults.js”, I replaced:

'use strict';

Object.defineProperty(exports, "__esModule", {
    value: true
});

exports.default = {
    baseUrl: 'https://api.particle.io',
    clientSecret: 'particle-api',
    clientId: 'particle-api',
    tokenDuration: 7776000 
};

module.exports = exports['default'];
//# sourceMappingURL=Defaults.js.map

with:

'use strict';


Object.defineProperty(exports, "__esModule", {
    value: true
});

exports.default = {
    baseUrl: 'https://localhost:8080',
    clientSecret: 'particle',
    clientId: 'particle',
    tokenDuration: 7776000 
};

module.exports = exports['default'];
//# sourceMappingURL=Defaults.js.map

And with that, you have a server where you can create OAuth user accounts and use them from a local app.

The Spark firmware

The second part concerns the photon's firmware. In the spark-protocol library, I had to remove a few lines of code that enforce a rate limit on published events.

So I removed these lines:

if (now - recent_event_ticks[evt_tick_idx] < 1000) {
   // exceeded allowable burst of 4 events per second
   return false;
}
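Those lines sit in the firmware's event-publishing path. To make clear what was removed, here is the gist of that burst limiter rewritten as a standalone JavaScript sketch (the real code is C++ inside spark-protocol; the buffer size of 4 and the initial timestamps here are assumptions for the demo):

```javascript
// Sketch of the burst limiter the firmware enforces: keep the
// timestamps of the last 4 published events in a ring buffer and
// reject a publish if the slot about to be reused is < 1000 ms old.
const NUM_RECENT = 4;

function makeLimiter() {
  // Seeded far in the past so the first publishes are allowed.
  const recentTicks = new Array(NUM_RECENT).fill(-1000);
  let idx = 0;
  return function allowPublish(now) {
    if (now - recentTicks[idx] < 1000) {
      // exceeded allowable burst of 4 events per second
      return false;
    }
    recentTicks[idx] = now;
    idx = (idx + 1) % NUM_RECENT;
    return true;
  };
}

const allow = makeLimiter();
console.log([0, 10, 20, 30, 40].map((t) => allow(t)));
// → [ true, true, true, true, false ]
```

Removing the check lets the photon publish as fast as the connection allows, which is exactly what I needed for a 1 [ms] target.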

And finally, the longest part of the whole thing was building a complete, clean firmware and uploading it to the photon without breaking it (uploading a bad firmware to the device can burn its electronic components).

So you have to install dfu-util, put your photon in DFU mode, and follow these steps:

  • Within the “firmware/main” folder, type 
make clean all PLATFORM=photon program-dfu

It will generate the new firmware and upload it to the photon.

  • Restart your local server
  • Test the code you want to use and see a very big difference (about 20 to 30 [ms] to send, receive, and process data, versus 70 to 80 [ms] before)

From here you can build what you want: take the idea in your head and turn your “local” projects into “wireless” ones ;)

In conclusion, it's a bit complicated to configure the environment to use the API on a local network, but it lets you strip out all the unnecessary processing and get the best performance.

The web API that particle.io offers is really great, and really simple to use! Another alternative would be to use raw TCP/UDP and run a server listening on a defined port (about 50 to 60 [ms] to send, receive, and process data, but with some lag).

Another goal of this post was to record a little demo video. Unfortunately, my app is not finished yet, but I'll release the video on my YouTube channel once it works! So stay tuned ;)