Haylyn - Collaborative 3D Semantic Web Visualization and Analytics

Semantic Starry Sky using Binary RDF (HDT) Streaming over WebSockets

The image is a screenshot of a star field centered on our Sun, created with the Haylyn WebGL client from an RDF Turtle representation of the Hipparcos star catalog, which contains 118,218 stars.  The RDF visual data was generated on a Haylyn server and was originally streamed to the client as RDF Turtle over WebSockets.  (Haylyn abandoned HTTP polling in favor of WebSockets back in 2011 due to poor performance.)  In the Hipparcos use case, however, the visualization data exceeds 500,000 triples, and the load time from server to an actual rendered visualization was approximately 55 seconds, which does not make for an enjoyable user experience.
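For a sense of what that original text-based path looks like on the client, here is a minimal sketch of receiving Turtle over a WebSocket.  The endpoint URL, the request message, and the parseTurtle() helper are all hypothetical stand-ins for illustration, not Haylyn's actual API; the toy "parser" only handles a naive one-triple-per-line form, just to keep the sketch self-contained.

// Minimal sketch of the text-based path: Turtle streamed to the browser
// over a WebSocket. The URL and message format are hypothetical.
var socket = new WebSocket('ws://localhost:8080/haylyn');
var triples = [];

socket.onopen = function () {
  // Ask the server for the star catalog (hypothetical request message).
  socket.send(JSON.stringify({ action: 'load', dataset: 'hipparcos' }));
};

socket.onmessage = function (event) {
  // Each text frame carries a chunk of Turtle that must be parsed into
  // client-side memory structures before it can be visualized.
  triples = triples.concat(parseTurtle(event.data));
};

// Toy stand-in for a real Turtle parser: handles only the simplest
// one-triple-per-line form.
function parseTurtle(text) {
  return text.split('\n')
    .filter(function (line) { return line && line.charAt(0) !== '#'; })
    .map(function (line) {
      var parts = line.replace(/\s*\.\s*$/, '').split(/\s+/);
      return { s: parts[0], p: parts[1], o: parts.slice(2).join(' ') };
    });
}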

Text-based serializations of RDF (Turtle, N-Triples, RDF/XML, or the latest W3C recommendation, JSON-LD) are very handy since human and machine alike can read them.  But this human readability has a cost.  To send server-based RDF to the Haylyn WebGL client, the binary memory structures on the server must be serialized out to Turtle (originally N-Triples) and sent to the client, where they must be parsed back into client-side memory structures before being usable.  Text-based serializations are bulky; they can be compressed to a more compact size, but at a cost of computational time, both in the compression phase on the server and the decompression phase on the client.  Enter binary RDF.  Since no JavaScript implementation of Binary RDF (HDT, a W3C member submission) was available, one was created and retrofitted into the Haylyn WebGL client.  Java libraries for HDT are available on the HDT web site and were added to the Haylyn server.  The RDF Turtle/WebSockets path in Haylyn has been kept, and the server and client can switch between text Turtle and binary HDT during any particular session (a sketch of this frame-type dispatch appears after the table below).  Further comparisons between the two formats will be made.  The initial integration of HDT into Haylyn is not as efficient as it could be; that said, the initial timings to handle 2,034,211 triples are as follows:

                                    Time to Generate (ms)    Size in RAM (bytes)
Turtle                                              5,681             84,106,871
Binary HDT                                          7,935             14,154,034
Turtle with GZIP compression                        7,733             13,367,910
Binary HDT with GZIP compression                    8,168              7,977,923

These timings DO NOT include transfer time to the client, nor the time for the client to digest the data.  Compression is performed with the Apache Commons Compress library.  The major advantage of HDT is that it does not need to be re-parsed on the client: it is a usable, indexed memory structure that can be queried as-is.  Further optimizations of the JavaScript HDT implementation and the server-side Java interfacing will be done to improve performance.
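To make the Turtle/HDT switching concrete, here is a sketch of how a single WebSocket connection can carry both encodings: the browser delivers text frames as strings and, with binaryType set to 'arraybuffer', binary frames as ArrayBuffers, so the message handler can dispatch on the frame type.  The loadHdt() entry point and its search() method are hypothetical stand-ins for the JavaScript HDT implementation described above, and the gzip step assumes a browser-side zlib port such as pako; the post does not name the client's actual decompression code.

// One connection, two encodings: dispatch on the WebSocket frame type.
socket.binaryType = 'arraybuffer'; // deliver binary frames as ArrayBuffers

socket.onmessage = function (event) {
  if (typeof event.data === 'string') {
    // Text frame: Turtle, which must be parsed before use (see above).
    triples = triples.concat(parseTurtle(event.data));
  } else {
    // Binary frame: gzipped HDT. pako.ungzip() is from the pako library,
    // standing in here for whatever decompressor the client really uses.
    var bytes = pako.ungzip(new Uint8Array(event.data));
    // Hypothetical entry point into the JavaScript HDT implementation.
    // The decompressed bytes are already an indexed structure, so they
    // can be queried directly rather than re-parsed.
    var hdt = loadHdt(bytes);
    var all = hdt.search(null, null, null); // e.g. match every triple pattern
  }
};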

The above is not quite the visual Xanadu that is the Google Chrome "100,000 Stars" WebGL experiment, which uses the Astronomy Nexus HYG dataset, a combination of catalogs that includes Hipparcos.  But the author of the 100,000 Stars experiment lamented, "I feel like I've gotten to the point where my data was mixing too much with my code."  Haylyn is data-driven, with nothing in its code specific to stars.  Additional datasets are planned to augment the above visualization, including exoplanet data, constellation data, planetary data, and publication data.

If you cannot wait, please try out these excellent viewers for this type of data: Stellarium and/or the iPhone app Exoplanet by Hanno Rein.

Viewing the Semantic Web Through the Oculus Rift

A couple of days ago, my developer version of the Oculus Rift virtual reality (VR) headset arrived.  It came in a very slick plastic case and was hooked up to my computer in minutes.  I installed the software, ran the demonstration "world," and stuck my head into the Oculus Rift.  Amazing!  The quality and sense of immersion were better than I had expected.  A couple of years ago, I had ordered a VR headset from a different company, and after five minutes of playing with that one, I shipped it back.  It was like looking at a screen down a five-foot tunnel.  It had head tracking, and it was neat for a few minutes, but I was never going to use it in any practical sense.

The Oculus Rift does not suffer from this tunnel effect!  Definitely a keeper.  I wanted to interface the Oculus Rift with my Haylyn project (formerly known as Nexus; I'll go into the reasons for the name change some other time).  In the past couple of months, I had rewritten the Haylyn WebGL/HTML5 client and changed the WebGL library from GLGE to Three.js.  GLGE was excellent to work with, but the project has been inactive for too long for my comfort, and Three.js has a fairly active user and developer base.  The question now was how to access the Oculus Rift from within the browser.  After a little searching, I downloaded and installed Ben Vanik's vr.js, which includes an NPAPI plugin that works with Chrome and Firefox.  Ben Vanik provides several demos for using vr.js with the Oculus Rift, including a Three.js version.  The image in this article shows the dual-screen effect seen without the Oculus Rift headset on.  It depicts, as linked data, the aggregated HTTP traffic to my VIVO site http://reach.suny.edu, which I presented as a poster at the 4th VIVO National Conference last week.  See the poster here.
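The vr.js demos include their own Three.js renderer, which also applies the Rift's lens-distortion correction; the snippet below is only a generic illustration of the side-by-side, dual-viewport idea in plain (recent) Three.js, with made-up scene content rather than Haylyn's RDF graph.

// Generic illustration of the dual-screen stereo effect: one camera per
// eye, each rendered into its own scissored half of the canvas.
// NOT the vr.js plugin's renderer, which also applies lens distortion.
var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

var scene = new THREE.Scene();
scene.add(new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),      // placeholder content
  new THREE.MeshNormalMaterial()
));

var halfWidth = window.innerWidth / 2;
var eyeSeparation = 0.064;             // ~64 mm interpupillary distance, in scene units

// Offset the two cameras horizontally to create parallax.
var leftEye = new THREE.PerspectiveCamera(75, halfWidth / window.innerHeight, 0.1, 1000);
var rightEye = leftEye.clone();
leftEye.position.set(-eyeSeparation / 2, 0, 3);
rightEye.position.set(eyeSeparation / 2, 0, 3);

function renderEye(camera, x) {
  // Scissoring confines each eye's render (and clear) to its half.
  renderer.setViewport(x, 0, halfWidth, window.innerHeight);
  renderer.setScissor(x, 0, halfWidth, window.innerHeight);
  renderer.setScissorTest(true);
  renderer.render(scene, camera);
}

(function animate() {
  requestAnimationFrame(animate);
  renderEye(leftEye, 0);               // left half of the canvas
  renderEye(rightEye, halfWidth);      // right half of the canvas
})();

A head-tracking integration would, on each frame, read the Rift's orientation from the plugin and apply it to both cameras before rendering; the stereo split above is independent of where that orientation comes from.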

With the Oculus Rift headset, graphs and other visuals can be seen in 3D, and with the added bonus of head tracking, you can immerse yourself in a world of data and look up, down, left, right, diagonally, etc., just by moving and turning your head.  Haylyn's WASD functionality lets you move around the scene of RDF linked data (sketched below).  The "endless plane" and cubes in the image are objects I added from the vr.js demo; they look good but have nothing to do with the colored RDF graph.  The vr.js libraries also work with Sixense's Razer Hydra, a motion-capture system for your hands, and Sixense is working on a wireless version of the Hydra called the STEM System.  I can't imagine the mouse and keyboard being the pinnacle of computer/human interface technology.  Hey, W3C Device APIs Group, can we add the Rift?!  The thought of being able to reach into the 3D scene and grab and manipulate triples and data with my hands... Some distractions are too cool to ignore... must... order..... Squirrel!
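As a bare-bones illustration of the WASD idea (not Haylyn's actual implementation), movement can be done by translating the active camera along its own local axes each frame, so "forward" always follows the current view direction, including wherever your head is pointing:

// Bare-bones WASD movement for a Three.js camera. Call
// updateMovement(leftEye) (or whichever camera is active) inside the
// render loop from the stereo sketch above.
var keys = {};
document.addEventListener('keydown', function (e) { keys[e.keyCode] = true; });
document.addEventListener('keyup',   function (e) { keys[e.keyCode] = false; });

var SPEED = 0.1; // scene units per frame

function updateMovement(camera) {
  // Translate along the camera's local axes, not world axes, so that
  // "forward" tracks the current view direction.
  if (keys[87]) camera.translateZ(-SPEED); // W: forward
  if (keys[83]) camera.translateZ(SPEED);  // S: back
  if (keys[65]) camera.translateX(-SPEED); // A: strafe left
  if (keys[68]) camera.translateX(SPEED);  // D: strafe right
}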
