A few posts ago, I wrote about protecting Gmail using RSA SecurID. This post details how the idea extends to protecting any website using Google Friend Connect. The video below shows how an RSA SecurID gadget embedded into a page with Google Friend Connect works. The demo also shows how the RSA SecurID authentication service could be consumed as a hosted service by smaller websites.
The gadget loads a SecurID-protected page and a login page hosted as a service. The gadget opens a new window with the login page, where the user types in credentials. Once the login is successful, an authentication token is set in the main website's page that other gadgets can use. Checking for an actual login would be done using a protocol similar to OAuth, where an "isAuthenticated" request is made to the RSA authentication service to verify the token.
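To make the flow concrete, here is a rough sketch of what the gadget side could look like. The endpoint paths, the postMessage hand-off, and the response handling are assumptions for illustration, not the actual RSA service API:

```javascript
// Hypothetical sketch of the gadget-side login flow described above.
// The URLs, parameter names, and response format are assumptions, not
// the actual RSA Authentication Service API.
var AUTH_SERVICE = "https://auth.example.com"; // assumed hosted login service

function login() {
  // Open the hosted login page in a new window, as the gadget does.
  window.open(AUTH_SERVICE + "/login", "securid-login", "width=400,height=300");
}

// Called by the login window (here assumed to use postMessage) once
// login succeeds.
window.addEventListener("message", function (event) {
  if (event.origin !== AUTH_SERVICE) return;
  // Store the token where other gadgets on the page can read it.
  window.authToken = event.data.token;
  isAuthenticated(window.authToken, function (ok) {
    console.log("authenticated:", ok);
  });
}, false);

// The "isAuthenticated" check against the authentication service,
// similar in spirit to an OAuth token-validation call.
function isAuthenticated(token, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", AUTH_SERVICE + "/isAuthenticated?token=" +
           encodeURIComponent(token));
  xhr.onload = function () { callback(xhr.status === 200); };
  xhr.send();
}
```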
The demo was also extended for hosted RSA Adaptive Authentication solutions that websites can easily plug into their websites.
Fetching the numbers from SecurID for a web page
It started out as a joke but ended up sparking quite an interest in the idea. Though I am not permitted to discuss the idea in public due to confidentiality clauses, I thought it would be fun to jot down the way it was implemented in a two-hour span for the RSA hack day.
The idea required the random numbers generated by RSA SecurID so that they could be used elsewhere. Interestingly, the algorithm is protected and there is no direct, simple API to get the numbers (for obvious security reasons).
This was a sMashup challenge, and a hack to demonstrate the idea would be welcome. Since there was no direct way to get the numbers and reverse engineering the token would have been a lot harder, we decided to pick the numbers off a software token.
All the seeds for a user were installed in a couple of software token instances. When a user logs in and the token code is requested, we load the required token in the software token, which is manipulated using macros. In the background, Snagit is set to capture screenshots of the area showing the token every fifteen minutes. The screenshot is served by a servlet on Tomcat that is polled every 10 seconds; the servlet also deletes all but the latest image to keep the folder size in check. The image is then passed to an online OCR service that returns the required numbers.
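The servlet itself is plain Java on Tomcat, so I will skip it here; as an illustration, the polling half of the pipeline could look roughly like the JavaScript below. The servlet mapping and the OCR endpoint are made-up placeholders:

```javascript
// Illustrative glue for the pipeline above; the servlet itself is plain
// Java on Tomcat. The servlet mapping and the OCR endpoint below are
// made-up placeholders, not real URLs.
var SCREENSHOT_URL = "http://localhost:8080/token/latest.png"; // assumed mapping
var OCR_URL = "http://ocr.example.com/recognize";              // assumed service

function fetchTokenCode(callback) {
  var img = new XMLHttpRequest();
  // Grab the latest screenshot; the timestamp query defeats caching.
  img.open("GET", SCREENSHOT_URL + "?t=" + Date.now());
  img.responseType = "blob";
  img.onload = function () {
    // Hand the image to the OCR service and read back the digits.
    var ocr = new XMLHttpRequest();
    ocr.open("POST", OCR_URL);
    ocr.onload = function () {
      callback(ocr.responseText.replace(/\D/g, "")); // keep digits only
    };
    ocr.send(img.response);
  };
  img.send();
}

// Poll every 10 seconds, matching the interval mentioned above.
setInterval(function () {
  fetchTokenCode(function (code) {
    console.log("current tokencode:", code);
  });
}, 10000);
```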
A long way to get the numbers, but the implementation was fun. Sheer hack!! :)
Updates to Ubiquity Scripts - Parser 2
The Ubiquity Firefox extension upgraded its parser to support a richer set of nouns and i18n. This rendered a couple of my Ubiquity commands unusable. The commands I wrote can be found here and here. I have been able to port the "bookmark to delicious" command to the new version, but the "linkify" command seems to have problems.
Porting the "bookmark to delicious" command was simple; I just had to change the noun definition. Since the command does not really take any arguments (the only notes are the text selected on the page), only the command name had to be changed to get it working.
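For reference, the ported command looks roughly like this under Parser 2; treat it as a sketch from memory (the delicious URL parameters in particular are illustrative), not the exact script:

```javascript
// Roughly what the ported command looks like under Parser 2. The main
// change was moving from the old "name" string to the "names" array.
CmdUtils.CreateCommand({
  names: ["bookmark-to-delicious"],   // Parser 2 uses a names array
  description: "Bookmarks the current page to delicious, " +
               "using the selected text as notes.",
  execute: function () {
    var doc = CmdUtils.getDocument();
    var notes = CmdUtils.getSelection(); // the selection becomes the notes
    // Hand off to the delicious bookmark form (illustrative parameters).
    Utils.openUrlInBrowser(
      "http://delicious.com/save?url=" +
        encodeURIComponent(doc.location.href) +
      "&title=" + encodeURIComponent(doc.title) +
      "&notes=" + encodeURIComponent(notes)
    );
  }
});
```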
Converting the "linkify" command was trickier, though. The preview in the newer version seems too slow for any interactivity, so the user cannot really choose the search result that suits the page context. It looks like we will have to drop the preview pane and create a popup instead. The popup would let the user click the link that becomes the hyperlink for the selected text. The new UI would also let the user link words to arbitrary search terms and page through the search results. Watch this space for the upgrade, which I plan to work on over the weekend.
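A first sketch of what that popup could do; the searchResults structure, the helper name, and the styling are placeholders I made up for illustration:

```javascript
// Sketch of the planned popup: given search results for the selected
// text, let the user pick the link to use. The searchResults array and
// its {url, title} shape are hypothetical.
function linkifySelection(searchResults) {
  var doc = CmdUtils.getDocument();
  var selection = doc.getSelection();
  if (selection.rangeCount === 0) return;
  var range = selection.getRangeAt(0);

  // Build a floating list of candidate links.
  var popup = doc.createElement("div");
  popup.style.cssText = "position:fixed;top:20px;right:20px;" +
    "background:#fff;border:1px solid #999;padding:8px;z-index:9999;";
  searchResults.forEach(function (result) {
    var item = doc.createElement("div");
    var link = doc.createElement("a");
    link.href = result.url;
    link.textContent = result.title;
    link.addEventListener("click", function (e) {
      e.preventDefault();
      // Wrap the selected text in a hyperlink to the chosen result.
      var anchor = doc.createElement("a");
      anchor.href = result.url;
      range.surroundContents(anchor);
      popup.parentNode.removeChild(popup);
    }, false);
    item.appendChild(link);
    popup.appendChild(item);
  });
  doc.body.appendChild(popup);
}
```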
Dynamically provisioning data centers with enVision and VI SDK
One of the most important characteristics of a cloud deployment is the ability to scale dynamically when servers get loaded. Some of the metrics used for scaling include CPU, memory, and bandwidth utilization. However, in most cases these metrics are local to a specific system, and the dynamic provisioning of additional capacity is reactive to peak demand. This directly translates to a loss of responsiveness for a short interval while the capacity is being allocated.
Enterprise deployments are usually a collection of heterogeneous systems with well-studied patterns of stress propagation. These stress patterns typically progress from one geographic location to another, or from one type of server to another (web server to database, etc.). Hence, monitoring on a global scale would allow a proactive provisioning system that adds computing capacity to the right type of servers before the stress reaches them.
One of the ideas we presented at the RSA sMashup challenge was to demonstrate this dynamic provisioning on a global scale. We picked RSA enVision to collect logs from servers deployed on virtual machines hosted by VMware ESX Server. The triggers that provision more machines are configured as reports and alerts in enVision. The alerts call a batch file that contains VI SDK commands to create and start servers; this file takes care of cloning the machine, bringing it up, and so on.
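The demo wired the alert straight to a batch file of VI SDK commands; expressed as a small script the alert could invoke instead, the glue would look something like this (the template names and provision.bat are made up for illustration):

```javascript
// Hypothetical glue for the trigger chain described above. The actual
// demo used a batch file of VI SDK commands directly; here the same idea
// is shown as a script an enVision alert could invoke. Host names, the
// template inventory, and provision.bat are all assumptions.
var exec = require("child_process").exec;

// The alert passes in which server tier is under stress.
var serverType = process.argv[2] || "webserver";

// Map the stressed tier to a template VM to clone (assumed names).
var templates = { webserver: "web-template", database: "db-template" };

// Shell out to the batch file wrapping the VI SDK calls; cloning the
// template and powering the clone on happens inside that file.
exec("provision.bat " + templates[serverType], function (err, stdout) {
  if (err) {
    console.error("provisioning failed:", err);
    return;
  }
  console.log("provisioned new " + serverType + ":", stdout);
});
```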
Thus, enVision alerts when it notices load on some servers, and the alert in turn provisions more servers that are prepared for the load by the time it progresses to them.
Encrypting Data before Storage on Cloud
With the cloud offering almost limitless storage, most data owners end up trusting the cloud provider with the confidentiality and integrity of their data. There are cases where it would be desirable to encrypt data before it leaves our systems for the cloud. Many enterprise deployments are already equipped with key management solutions, and these could be roped in to manage the keys used to encrypt data stored on the cloud.
For the sMashup, we hooked up RSA Key Manager and EMC Atmos cloud storage. The result was a transparent API layer over the existing Atmos API that encrypts data while uploading and decrypts it while downloading. The files are available here.
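Since the actual files sit behind the link, here is only a sketch of the encrypt-before-upload idea in JavaScript; getKeyFromKms, atmosUpload, and atmosDownload are hypothetical stand-ins for the RSA Key Manager and Atmos API calls:

```javascript
// Sketch of the encrypt-before-upload idea. The real layer sits on the
// Atmos API with keys from RSA Key Manager; getKeyFromKms(),
// atmosUpload(), and atmosDownload() below are hypothetical stand-ins.
var crypto = require("crypto");
var fs = require("fs");

// Encrypt a file with a key fetched from the key manager, then upload
// the ciphertext, so plaintext never leaves our systems.
function uploadEncrypted(path, objectName, callback) {
  getKeyFromKms(function (key, iv) {
    var cipher = crypto.createCipheriv("aes-256-cbc", key, iv);
    var plaintext = fs.readFileSync(path);
    var ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
    atmosUpload(objectName, ciphertext, callback); // hypothetical Atmos call
  });
}

// The reverse path: download the ciphertext and decrypt it locally.
function downloadDecrypted(objectName, path, callback) {
  getKeyFromKms(function (key, iv) {
    atmosDownload(objectName, function (ciphertext) { // hypothetical call
      var decipher = crypto.createDecipheriv("aes-256-cbc", key, iv);
      var plaintext = Buffer.concat([decipher.update(ciphertext),
                                     decipher.final()]);
      fs.writeFileSync(path, plaintext);
      callback();
    });
  });
}
```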
The code shows how files can be uploaded and downloaded. It can also be used to encrypt and decrypt byte streams, making it a stand-alone API. Since it is built on top of the existing Atmos API, it is easy to rope into existing projects. Here is the demo that we used for the 90-second presentation.
Update to Reddit Bar
Here is a quick update to the Reddit bar Greasemonkey script that I had written. It looks like the id of the title has been removed, which caused the script to stop functioning. A quick fix and it is back to normal.
Instead of relying on the id, we now iterate through all the tags that have the "title" class. This also selects the hyperlink to the article. To find it in the array of "title"-class tags, we do a simple match to check whether the tag has the document's title as its innerHTML. The window title has the form "name of the story : Reddit". If the node is a link and its innerHTML matches the appropriate part of the document title, it is returned as the targetURL to be used for the iframe. Back to normal, I can now upmod stories right from the comments page. You can check out the script here.
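Condensed, the fix looks roughly like this; the logic follows the description above, though the live script may differ in detail:

```javascript
// Condensed sketch of the fix described above.
function getTargetUrl() {
  // The window title looks like "name of the story : Reddit";
  // strip the suffix to recover the story title.
  var storyTitle = document.title.replace(/\s*:\s*Reddit\s*$/i, "");
  // The id is gone, so walk everything carrying the "title" class.
  var candidates = document.getElementsByClassName("title");
  for (var i = 0; i < candidates.length; i++) {
    var node = candidates[i];
    // Keep only the hyperlink whose text matches the document title.
    if (node.tagName === "A" && node.innerHTML === storyTitle) {
      return node.href; // used as the src of the iframe
    }
  }
  return null;
}
```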