Rise of the Web Workers - NationJS

Slides from the presentation at NationJS.

Abstract

Modern web applications are awesome. And complicated. The JavaScript libraries that power them do a lot of work to abstract away the hard parts. Whether through constructs like the Virtual DOM or fancy change detection algorithms, the amount of work that JavaScript libraries do is only increasing.
Unfortunately, all this work now competes for the same resources the browser needs to render the page or apply styles. In many cases, this makes the browser slow, preventing the web application from delivering a smooth experience.
Web Workers have been available in browsers for a while, but they have mostly been used for engaging demos like adding mustaches to a cat video :)
In this talk, we will explore how mainstream JavaScript libraries like React or Angular use Web Workers to get great performance. We will look at quantitative numbers from hundreds of test runs that conclusively show how Web Workers can make apps faster. Finally, we will look at practical examples of converting existing applications, and at the potential limitations of this approach.

User Interaction Sync for ReactNative

Test ReactNative on multiple devices simultaneously - github:Maya-kai
As developers creating user interfaces, we need to ensure that the screens and controls we build work well across the wide range of devices our customers use. On the web, most well-written sites have responsive layouts that can be tested on different viewports using browser developer tools.

Testing Responsive Layouts

When testing mobile sites on real devices, tools like BrowserSync remove the need to repeat interactions on every device by simultaneously mirroring clicks, typing, and scrolls across multiple screens. This is particularly useful when trying an app on a wide assortment of Android devices with different screen sizes, capabilities, and operating systems.
This would also be useful for testing native mobile applications, and I had previously written about using BrowserSync for a Cordova/PhoneGap application. One of the comments on that post suggested bringing this multi-screen, mirrored-interaction testing to ReactNative apps too.
While Cordova applications are native mobile applications, they host a full-screen WebView that BrowserSync can use to listen to top-level "document" or "window" events. There is no WebView, "document", or "window" in ReactNative, and listening to or simulating native events could get tricky. This podcast explains how ReactNative implements its own event sub-system, and we can tap into that model to achieve the same result.

Demo

I created Maya-Kai (மாய கை), a project that helps you test ReactNative applications across multiple devices. "Maya-Kai" literally means "Magic Hand" in Tamil, and this invisible hand copies events on one device and mirrors them on the other devices.



Adding it to your project is as simple as importing the package and starting it up. The project is open source on GitHub with an MIT license. More instructions on how to use it are in the README of the GitHub repository.

Architecture and internals

ReactNative's event system has a concept of event plugins. These plugins are internal and provide handlers for events like touches, changes, or input. When maya-kai is imported into the main index.ios.js or index.android.js, it appends an extra plugin that simply listens to all events and broadcasts them to a WebSocket server.
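The idea behind that extra plugin can be sketched roughly as follows. This is an illustrative, self-contained sketch, not maya-kai's actual source: the `extractEvents` name mirrors the shape of React's internal event-plugin interface, and a plain array stands in for the WebSocket connection.

```javascript
// A plugin whose only job is to observe every event and forward it to a
// transport (a WebSocket in the real project; an array here so the
// sketch is runnable on its own).
function createBroadcastPlugin(transport) {
  return {
    // Called by the event system for every top-level event.
    extractEvents(topLevelType, targetNodeID, nativeEvent) {
      transport.push({ type: topLevelType, target: targetNodeID, event: nativeEvent });
      // Returning null means this plugin never produces synthetic events
      // of its own -- it only observes and forwards.
      return null;
    },
  };
}

// Stand-in for the WebSocket connection to the mirroring server.
const sent = [];
const plugin = createBroadcastPlugin(sent);

plugin.extractEvents('topTouchStart', 'node-1', { pageX: 10, pageY: 20 });
plugin.extractEvents('topTouchEnd', 'node-1', { pageX: 10, pageY: 20 });

console.log(sent.length); // 2
console.log(sent[0].type); // 'topTouchStart'
```

Because the plugin only observes and forwards, it can sit alongside the real touch and change plugins without interfering with them.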
Here is the architecture; read the steps in the numbered order.


The event is then broadcast to all other connected devices. A listener on each of the other devices receives the event and injects it using ReactNativeEventEmitter._receiveRootNodeIDEvent. The events are serialized using a JSON Object Graph library (JSOG) rather than JSON.stringify, since they contain cycles.
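To see why plain JSON.stringify will not do, consider that native event objects can refer back to themselves. A JSOG-style encoder breaks such cycles by tagging each object with an @id and replacing repeat visits with an @ref. The following is a minimal sketch of that idea, not the actual JSOG library:

```javascript
// Encode a value into a cycle-safe structure: the first visit to an
// object gets an '@id'; any later visit becomes an '@ref' back to it.
function jsogEncode(value, seen = new Map()) {
  if (value === null || typeof value !== 'object') return value;
  if (Array.isArray(value)) return value.map((v) => jsogEncode(v, seen));
  if (seen.has(value)) return { '@ref': seen.get(value) };
  const id = String(seen.size + 1);
  seen.set(value, id);
  const out = { '@id': id };
  for (const key of Object.keys(value)) {
    out[key] = jsogEncode(value[key], seen);
  }
  return out;
}

const event = { type: 'topTouchStart' };
event.self = event; // a cycle, as in real native event objects

// JSON.stringify(event) would throw "Converting circular structure to JSON"
const encoded = jsogEncode(event);
console.log(JSON.stringify(encoded));
// {"@id":"1","type":"topTouchStart","self":{"@ref":"1"}}
```

Decoding on the receiving device is the mirror image: rebuild each object keyed by its @id, then patch every @ref to point at the rebuilt object.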

Record and Replay

While live broadcasting was the primary use case, the events can also be recorded into a file and replayed later using the exact same mechanism. This enables a record/replay scenario that is useful for navigating to common screens during development, or even for UI integration testing. Check out the clients folder to see how simple it was to implement.
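The record/replay idea in miniature: the same listener that broadcasts events can instead append them to a log, and the same injector that handles remote events can consume that log later. Function names here are illustrative, not maya-kai's actual API; see the clients folder in the repository for the real implementations.

```javascript
// A minimal recorder: record() appends events with a timestamp,
// replay() feeds every recorded event through an inject function
// (in maya-kai, that injection hands the event to the ReactNative
// event emitter on the replaying device).
function createRecorder() {
  const log = [];
  return {
    record(event) {
      log.push({ at: Date.now(), event });
    },
    replay(inject) {
      for (const entry of log) inject(entry.event);
    },
  };
}

const recorder = createRecorder();
recorder.record({ type: 'topTouchStart', target: 'loginButton' });
recorder.record({ type: 'topTouchEnd', target: 'loginButton' });

const replayed = [];
recorder.replay((e) => replayed.push(e.type));
console.log(replayed); // [ 'topTouchStart', 'topTouchEnd' ]
```

Persisting the log to a file instead of memory, and honoring the recorded timestamps when replaying, is all that separates this sketch from a usable record/replay client.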

Next Steps

Most integration and UI testing for ReactNative apps today is done using Appium. However, elements on the page are typically selected using testID, accessibility IDs, or names - something that is not natural to React.
ReactJS test frameworks like Enzyme, on the other hand, work at the component level. Since the events recorded in this approach are also at the component level, this could be a way to perform integration testing without having to rely on the Selenium JSONWire protocol. I am playing with the idea of creating an integration testing framework, and would love to hear if you have suggestions or want to collaborate.
I am also trying maya-kai on all the ReactNative applications I can lay my hands on, to ensure that it works for all events and covers all edge cases. If you have a ReactNative app or know of one and want to see how it works with maya-kai, please leave a comment below and we can try it out together :)

Time Travel (Debugging) with ReactNative

One of the things I love about React and ReactNative is the emphasis on DX (Developer Experience), in addition to the amazing UX (User Experience). I was particularly fascinated by the Time Travel Debugging feature in Redux.
Being a fan of time travel (the last 3 books I read were time travel stories; I am re-watching Dr. Who these days), I started experimenting with bringing true time travel to ReactNative. Recently, folks from the Chakra team showed off support for time travel debugging in Node at NodeSummit. In a previous blog post, I had also explained how I was using a Node process to enable VSCode to debug ReactNative apps.
Putting these together, I was able to enable time travel debugging for ReactNative using VSCode and Chakra core - here is a demo.




Link: https://www.youtube.com/watch?v=waiZsNI4SYA

What is Time Travel Debugging?

In addition to typical debugging actions like stepping into or stepping over code, time travel also allows developers to "step back" to previous statements, inspecting the state of variables backward in time. This is typically achieved by first recording user actions and then replaying them with a debugger attached.

Chakra TTD?

To record a new debug session in Chakra, the -TTRecord flag is passed. With this flag, the state of the app at various points in time is recorded and written to a folder when the node process exits. The node process is then started with a -TTReplay flag that replays the actions. VSCode can attach to that node process as a debugger and, in addition to the usual debug workflows, supports stepping back into statements. The ChakraCore page has more information about how to get the latest builds and try it out.

Chakra and ReactNative

When the developer selects "Debug JS Remotely" from the menu, ReactNative is put into a proxy mode. The packager then simply opens Chrome and runs all the ReactNative code in Chrome. Chrome DevTools attach to this page, making features like breakpoints and watches possible.
In the VSCode ReactNative debugger, we replace Chrome with a Node process that VSCode can attach to as a debugger.

Try it today

As VSCode simply runs Node, I just needed to replace Node with Node-Chakra. Alternatively, here is a version of the code extracted from the Chrome debugger; you can download it and run it using Node or Node-Chakra. When debugging is started from the app, all instructions are executed in this process. Many IDEs and tools can attach to and debug Node apps - any one of them could be used.

Check out the ReactNative VSCode extension that I am working on, or follow me on Twitter for updates on this experiment :)