technology from back to front

Posts Tagged ‘JavaScript’

JS/CORBA Adapter

What is it?

The JS/CORBA Adapter provides a mechanism for arbitrary Javascript objects to interact with each other transparently in a distributed Javascript system using CORBA. The keywords here are:

  • distributed – multiple Javascript engines execute on different machines
  • transparently – interaction with remote Javascript objects is no different from interacting with local objects
  • arbitrary – *any* Javascript object can participate in the interaction

Apart from building distributed systems of Javascript objects, the JS/CORBA Adapter also provides an easy mechanism for other CORBA systems to access Javascript objects – any Javascript object can be “CORBA-enabled” dynamically without the need to define any IDL etc.

Where do I get it?

The project is hosted at SourceForge: http://www.sourceforge.net/projects/jscorba/. You will also need a copy of Rhino, a Javascript engine written in Java which is part of the Mozilla project: http://www.mozilla.org/rhino/

How does it work?

The file Javascript.idl should give you a clue. It basically maps the Scriptable interface to a CORBA interface. Scriptable is implemented by all Javascript objects and is the interface used by the Javascript engine for all interactions with Javascript objects. There is also a Function interface, which inherits from Scriptable and is used for things that can be “called” and act as “constructors”. This too is mapped to a corresponding CORBA interface.

The job of the JS/CORBA Adapter is to “export” local objects to CORBA by creating CORBA objects that implement the Scriptable CORBA interface, and to “import” remote objects, by wrapping CORBA references to remote objects in locally created Javascript proxy objects. Thus, a remote invocation traverses the following objects:

client JS engine -> 
ScriptableClient -> 
CORBA ref -> 
client ORB--network--> server ORB -> 
ScriptableServer -> 
Javascript object -> 
server JS engine

Exporting and importing happen implicitly when arguments and results cross the wire; they can also be done explicitly, which is primarily used for bootstrapping.

How do I use it?

Follow the instructions in INSTALL.txt to build and install the JS/CORBA Adapter. There are some examples in the “examples” directory that should give you a pretty good idea of how things work. Basically, a “typical” scenario works along the following lines:

  • Create an ORB and POA with the right policies
  • Create an instance of the JS/CORBA Adapter. You can have more than one instance, in which case you get more than one “object identity space”, i.e. you can create multiple CORBA wrappers for local Javascript objects and multiple Javascript wrappers for CORBA references to remote Javascript objects. Not sure why anyone would want to do this, but you can :)
  • Create some local Javascript objects and export them to the CORBA world using the “exportObj” method on the JS/CORBA adapter.
  • Store the resulting object references in some naming/directory service or write the IORs to files.
  • Other Javascript engines can now get a reference to the exported objects by following the first two steps above, then doing a naming/directory query (or reading the IOR files) and calling the “importObj” method on their JS/CORBA adapter.
  • The other Javascript engines can now invoke methods on the imported objects. Arguments and results are automatically exported/imported.

The examples illustrate how to do all this.
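
To make the steps concrete, here is a minimal sketch of the bootstrap sequence as it might look in Rhino. The exportObj/importObj calls and the IOR round-trip follow the steps above; the adapter class name and its constructor arguments are guesses (check the examples directory for the real API), and the POA policies the adapter needs are omitted:

   // Hypothetical sketch -- the adapter class name and constructor are assumptions.
   // Step 1: create an ORB and resolve the root POA (required policies omitted).
   var noArgs = java.lang.reflect.Array.newInstance(java.lang.String, 0);
   var orb = Packages.org.omg.CORBA.ORB.init(noArgs, null);
   var poa = Packages.org.omg.PortableServer.POAHelper.narrow(
       orb.resolve_initial_references("RootPOA"));
   poa.the_POAManager().activate();

   // Step 2: create the adapter instance (class name is a placeholder).
   var adapter = new Packages.jscorba.Adapter(orb, poa);

   // Steps 3 + 4: export a plain Javascript object and stringify its IOR.
   var calculator = { add: function (a, b) { return a + b; } };
   var ref = adapter.exportObj(calculator);
   var ior = orb.object_to_string(ref); // write this to a file or a naming service

   // Steps 5 + 6, on another engine: import the reference and call it transparently.
   // var remote = adapter.importObj(orb.string_to_object(ior));
   // print(remote.add(1, 2));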

Garbage Collection

I bet you have been dying to ask how the Javascript engine knows when it is safe to destroy a local Javascript object that has been “exported”. The short answer is: it doesn’t. The long answer is: exporting an object (explicitly or implicitly) does not prevent it from being garbage-collected. This may sound like a dangerous thing to do, but it is actually not that bad – it means you can control what becomes garbage and what doesn’t simply by removing/keeping local pointers to the objects. The alternative would be to implement proper distributed garbage collection (which is very hard, but I might still do it at some point) or go for an ugly compromise like in RMI.

You do need to be careful with implicitly exported objects, e.g. objects passed as arguments in calls to remote objects. For example, in

   remote.foo(compute());
  

the result of compute() might get gc-ed after the call returns (NB: it will *not* get gc-ed before that) unless it is referred to from somewhere else. If you cannot be sure that an argument is referred to from elsewhere, the safe thing to do is to assign it to a local variable that doesn’t go out of scope until any potential invocations on it have completed, e.g.

   {
   var tmp = compute();
   remote.foo(tmp);
   ...
   }
  

The JS/CORBA adapter provides a pair of functions, “root” and “unroot” that disable/enable garbage-collection of an object. These functions are particularly useful if the lifetime of an object is not defined by its scope, e.g.

   {
     var tmp = compute();
     root(tmp);
     remote.foo(tmp);
   }
   // ...tmp survives scope exit...

   // later, in a method on the tmp object:
   {
     unroot(this);
   }

Note that “root” can be called multiple times on the same object. The object is re-enabled for garbage-collection only after the same number of “unroot” calls have been made. Both “root” and “unroot” operate on *local* objects only, i.e. they are *not* an attempt to provide a means of distributed reference counting.
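
For example, sketching that counting behaviour:

   var obj = compute();
   root(obj);
   root(obj);
   unroot(obj);   // still rooted: one matching unroot is outstanding
   unroot(obj);   // now eligible for garbage collection again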

Credits

This software has been developed with assistance from Redfig Ltd. and LShift Ltd.

Finally

Any questions/suggestions? Drop me a line: Matthias Radestock.

by sophie on 17/01/14

Grunt uglify file specs

I struggled a bit finding relevant examples of Gruntfile configuration for Uglify, so having solved a few specific problems myself, here’s what I came up with.

This is just a snippet from the whole Gruntfile of course, and contains half-decent comments already, though I’ll provide some extra explanations below to point out the most interesting bits.

// Variables used internally within this config.
conf: {
  app: 'app',
  dist: 'dist',
  // Just our own custom scripts.
  script_files: ['scripts/*.js'],
  // All scripts that should be minified into final result.
  // Ordering is important as it determines order in the minified output and hence load order at runtime.
  // We don't include jquery (though we could) as it's better to get it from Google where possible.
  minify_js_files: [
      'scripts/vendor/modernizr/modernizr.custom.js',
      '<%= conf.script_files %>',
      'scripts/polyfills/**/*.js']
},

uglify: {
  options: {
    banner: '/*! <%= pkg.name %> <%= grunt.template.today("yyyy-mm-dd") %> */\n',
    sourceMap: '<%= conf.dist %>/scripts/source.map.js',
    sourceMapRoot: '/scripts',
    sourceMappingURL: '/scripts/source.map.js',
    sourceMapPrefix: 2
  },
  // For dev, effectively just concatenate all the JS into one file but perform no real minification.
  // This means that the HTML is the same for dev and prod (it just loads the single .js file) but
  // debugging in the browser works properly in the dev environment. It should work even when fully
  // minified, given the source maps, but practice shows that it doesn't.
  dev: {
    options: {
      report: false,
      mangle: false,
      compress: false,
      beautify: true
    },
    files: [{
      expand: true,
      cwd: '<%= conf.app %>',
      src: '<%= conf.minify_js_files %>',
      dest: '<%= conf.dist %>/scripts/main.min.js',
      // Because we want all individual sources to go into a single dest file, we need to use this
      // rename function to ensure all srcs get the same dest, otherwise each would get a separate
      // dest created by concatenating the src path onto the dest path.
      rename: function(dest, src) { return dest; }
    }]
  },
  prod: {
    options: {
      banner: '/*! <%= pkg.name %> <%= grunt.template.today("yyyy-mm-dd") %> */\n',
      report: 'min',
      mangle: true,
      compress: true
    },
    files: '<%= uglify.dev.files %>'
  }
},

Use of a rename function for configuring file srcs and dests

I was really struggling to come up with src/dest configuration for Uglify that pushed all of my source files into a single minified dest file. To be fair, this is trivially easy in the common case, as you can simply use files: { 'dest_path': ['file_1', 'file_2'] }.

However I have my list of source files in <%= conf.minify_js_files %> and the paths therein do not include the root app/ directory, because this works out for the best in various other Grunt tasks (not shown) where I use cwd in the files block to supply that root dir. Unfortunately, without my custom rename function, a separate dest is calculated for each src file, by concatenating the src path with the dest, so instead of one minified JS file we get lots of individual files sprayed over all sorts of unintended locations. The trivial rename function I’ve used overrides those calculated dest locations to our originally intended single dest. Where different srcs have the same dest, the grunt-contrib-uglify plugin has the good sense to assume you want to merge their output. And hence we get the result we want. To be clear, this is only complicated because I want to use cwd in the file config rather than using the simpler approach.
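
For comparison, the simple form mentioned above would look roughly like this with these particular files, at the cost of repeating the app/ prefix instead of using cwd (an illustrative sketch, not taken from the real Gruntfile):

uglify: {
  dev: {
    // Single dest mapped to an ordered list of srcs -- no cwd, no rename needed.
    files: {
      'dist/scripts/main.min.js': [
        'app/scripts/vendor/modernizr/modernizr.custom.js',
        'app/scripts/*.js',
        'app/scripts/polyfills/**/*.js'
      ]
    }
  }
},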

Re-using files blocks in multiple places

You can share common options amongst multiple targets by putting them at the top level of the task and overriding/extending as required in the specific targets. However, you can't do this when specifying files. In my case I want to specify the same files for both dev and prod Uglify targets, so I specify them in full for dev then use Grunt's templating facility to refer to them from prod with files: '<%= uglify.dev.files %>'.

Theoretically I could have put the definition in the conf block at the top, but it’s specific to Uglify and only used there so I prefer it to be local to the Uglify task. It seems obvious now that I can refer back to it like this, but at the time I struggled to see how to achieve it. I think I had a blind spot for the generic nature of the templating mechanism, having only used it in a rigid way for config previously, and still being very new to Gruntfiles.

Uglify may break JS debugging

I found that my minified JS files could not be successfully debugged in any browser. I could see the original un-minified code thanks to the source maps and I could set breakpoints, but they usually wouldn't trigger, or if I did break (e.g. with the JS 'debugger' statement in my code) it was impossible to get variable values. All very frustrating.

Whilst I’m developing I use Grunt’s watch task to serve my files and to auto-process modifications on the fly, so in that mode I turn off all the actual minification features of Uglify and effectively just concatenate the files together into one. Because it’s still the same single file with the same name as in production, I can use identical static HTML to include my JS script. The source maps are still there and allow me to see the individual files in the browser debugger.
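
The watch config itself isn't shown above, but a target along these lines (hypothetical, using grunt-contrib-watch) would re-run the non-minifying dev build whenever a script changes:

watch: {
  scripts: {
    // Re-run the concatenate-only dev build on any change to our scripts.
    files: ['<%= conf.app %>/scripts/**/*.js'],
    tasks: ['uglify:dev']
  }
},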

by Sam Carr on 08/01/14

Habitat

The bespoke CMS allows an editorial team, spread across Europe, to contribute to the site and manage local translations to ensure that the site remains permanently up-to-date.

by LShift on 14/10/13

DeBeers

One example of a consumer-focused site built on EPiServer is the Forevermark website, which promotes the marking of authentic diamonds and provides functions for consumers to validate their Forevermarked stones through a secure Web Service.

by LShift

CII

As a trusted technical partner, LShift is currently advising many areas of the business. We are leading the effort to ensure that the strategic migration away from the AS400 system toward .NET meets the organisation’s critical performance and scalability requirements.

by LShift

BBM

This product has been a commercial and critical success – with thousands of core licences now being used. PCPlus tested it and gave it their Editor’s Choice award. Customers appreciate the way the rental model helps them run a professional business from day one without a large initial outlay. The model also allows them to keep things simple at first, but leaves them the option to move to more sophisticated versions of software when required.

Over various releases the suite has included the following:

  • Sage Line 50
  • Sage Payroll
  • Sage ACT! for Sage Line 50
  • Intuit Quickbooks Pro
  • Intuit Customer Manager
  • Mindleaders Software Skills
  • Mindleaders Business Skills
  • Mindleaders Business Skills Videos
  • Palo Alto Marketing Plan Pro
  • Palo Alto Business Plan Pro
  • BePro Staff and Health and Safety online tools

by LShift

BBC

LShift created a series of middle- and front-tier applications serving multimedia components from various distributed locations. The core lesson modules were created predominantly using DHTML at the front end to speed things up and give a richer, more interactive experience. A system for transforming the XML for the multimedia components from the content databases into the modules was built using JavaScript.

The results of the initial proof of concept have been well received, with the overall feedback suggesting that this approach would be effective at providing more flexibility for the student and more support for the teachers. The various literacy, numeracy and maths applications have been tested across the whole of the National Curriculum and age groups and the project is currently undergoing government review.

by LShift
