React Native - Under the hood - Connect.tech 2016

React Native under the hood
Presented at Connect.tech 2016, Atlanta GA

Abstract:  ReactNative is a way to create native applications for iOS, Android and Windows devices in JavaScript. In this session, we will look under the covers at the technology that enables ReactNative to drive native user interface components. We will look at a typical developer workflow and the way ReactNative is tied to developer tools. We will round it off with ways to deliver updates to your app without needing to submit updates to the app store.




Direct Link : https://doc.co/D8fsNT

Velocity NY 2016 - Using Chrome Traces to Measure Web and Mobile Performance

Velocity Conference, New York, September 19-22



Title
Using Chrome traces to measure rendering performance of web pages and mobile apps

Abstract
Ten years ago, increasing the performance of a website usually meant tweaking the server-side code to spit out responses faster. Today, it is mostly about ensuring that content is delivered to the user as fast as possible. However, it is still very hard to measure the user experience in terms of the smoothness and runtime performance of a website.
Chrome has excellent devtools that help fix rendering performance issues in web pages. Parashuram Narasimhan demonstrates how to create scripts that use the same source of information as the devtools to automatically measure metrics like frame rates, paint times, and layout calculations and explains how this method is being used to continuously track the rendering performance of web apps. Topics include:
  • How to leverage the same source that Chrome DevTools use to collect trace information about a web app’s performance
  • Understanding the events in a Chrome trace and tips and tricks to parse and aggregate them into usable metrics
  • How to plug this back into any web performance system like WebPagetest or Speedcurve so that this information can be monitored continuously

Rise of the Web Workers - NationJS

Slides from the presentation at NationJS.

Links

Slides


Abstract

Modern web applications are awesome. And complicated. The Javascript libraries that power them today do a lot of work to abstract out the hard parts. Whether using constructs like Virtual DOM, or fancy change detection algorithms, the amount of work that the Javascript library does is only increasing.
Unfortunately, all this work now competes for the same resources that the browser needs to do things like render a page or apply styles. In many cases, this makes the browser slow, preventing the web application from attaining its full, smooth experience.
Web workers have been in the browser for a while, but they have mostly been used for engaging demos like adding mustaches to a cat video :)
In this talk, we will explore how mainstream Javascript libraries like React or Angular can use Web Workers to get great performance. We will look at quantitative numbers from hundreds of test runs that conclusively show how Web Workers can make apps faster. Finally, we will also look at practical examples of converting existing apps, and the potential limitations of this approach.

User Interaction Sync for ReactNative

Test ReactNative on multiple devices simultaneously - github:Maya-kai
As developers creating user interfaces, we need to ensure that the screens and controls we build work well across the wide range of devices that our customers use. On the web, most well-written sites have responsive layouts that can be tested on different viewports using browser developer tools.

Testing Responsive Layouts

When testing mobile sites on real devices, tools like BrowserSync reduce the need to perform repetitive interactions on every device by simultaneously mirroring clicks, typing and scrolls across multiple screens. This method is particularly useful when trying the app on a wide assortment of Android devices with different screen sizes, capabilities and operating systems.
This would also be useful for testing native mobile applications and I had written about using BrowserSync for a Cordova/Phonegap application. One of the comments on the post suggested using this multi-screen, mirrored-interaction testing for ReactNative apps too.
While Cordova applications are native mobile applications, they have a full-screen WebView that BrowserSync can use to listen to top level "document" or "window" events. There is no WebView, "document" or "window" in ReactNative, and listening to or simulating native events could get tricky. This podcast explains how ReactNative implements its own event sub-system, and we can tap into this model to achieve the same result.

Demo

I created Maya-Kai (மாய கை), a project that helps you test ReactNative applications across multiple devices. "Maya-Kai" literally means "Magic Hand" in Tamil, and this invisible hand copies events on one device and mirrors them on other devices.



Adding it to your project is as simple as importing the package and starting it up. The project is open source on github with an MIT license. More instructions on how to use it are on the README in the github repository.

Architecture and internals

ReactNative's event system has a concept of event plugins. These plugins are internal and provide handlers for events like touches, changes or input. When maya-kai is imported into the main index.ios.js or index.android.js, it appends an extra plugin that simply listens to all events and broadcasts them to a web socket server.
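To make the broadcasting side concrete, here is a minimal sketch of such a forwarding plugin; the plugin hook shown ("extractEvents") only mirrors the general shape of ReactNative's internal event plugins, and the registration step, server address and field names are assumptions rather than the actual maya-kai code.

    // Hypothetical forwarding plugin: every event that flows through the event
    // system is serialized and pushed to a websocket relay server.
    var JSOG = require('jsog');                          // handles the cycles inside event objects
    var socket = new WebSocket('ws://localhost:8080');   // assumed relay server address

    var BroadcastPlugin = {
      extractEvents: function (topLevelType, targetInst, nativeEvent) {
        if (socket.readyState === WebSocket.OPEN) {
          socket.send(JSOG.stringify({ type: topLevelType, event: nativeEvent }));
        }
        return null;   // produce no synthetic events of our own
      }
    };
    // Registering BroadcastPlugin happens through a private injection API when
    // maya-kai is imported in index.ios.js / index.android.js (omitted here: it is version specific).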
Here is the architecture; read the steps in the numbered order.


The event is then broadcast to all other connected devices. A listener on the other devices receives the event and injects it using ReactNativeEventEmitter._receiveRootNodeIDEvent. The events are serialized using a JSON Object Graph library (called JSOG) rather than JSON.stringify since they contain cycles. 
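As a rough illustration of the receiving side, a listener could deserialize the payload with JSOG and hand it to the private entry point named above; the module path, payload fields and argument order are assumptions for the sake of the sketch.

    // Hypothetical receiving side: JSON.parse alone would fail on cyclic events,
    // so JSOG.parse restores the object graph before injecting it.
    var JSOG = require('jsog');
    var ReactNativeEventEmitter = require('ReactNativeEventEmitter');   // private module, name varies by version
    var socket = new WebSocket('ws://localhost:8080');                  // same assumed relay server

    socket.onmessage = function (message) {
      var payload = JSOG.parse(message.data);
      // _receiveRootNodeIDEvent(rootNodeID, topLevelType, nativeEvent) is the
      // private API mentioned above; the argument order here is an assumption.
      ReactNativeEventEmitter._receiveRootNodeIDEvent(payload.rootNodeID, payload.type, payload.event);
    };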

Record and Replay

While broadcasting and replaying live was the primary use case, the events can also be recorded into a file and replayed at a later time using the exact same method. This enables a record/replay scenario that can be useful for navigating to common screens during development, or even for performing UI integration testing. Check out the clients folder to see how simple it was to implement them.
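A record/replay client does not need much more than appending the same serialized events to a file and pushing them back out later; the following sketch would run alongside the websocket server, and every name in it is illustrative rather than the code in the clients folder.

    // Hypothetical recorder: capture each broadcast event with a timestamp,
    // then replay the file through the same broadcast channel later.
    var fs = require('fs');
    var recorded = [];

    function onBroadcast(serializedEvent) {                 // called for every event the server relays
      recorded.push({ at: Date.now(), data: serializedEvent });
    }

    function save(file) {
      fs.writeFileSync(file, JSON.stringify(recorded));
    }

    function replay(file, send) {                           // 'send' pushes an event back out to the devices
      JSON.parse(fs.readFileSync(file, 'utf8')).forEach(function (entry) {
        send(entry.data);
      });
    }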

Next Steps

Most integration and UI testing for ReactNative apps today is done using Appium. However, elements on the page are typically selected using testID, accessibility IDs or names - something that is not natural to React.
ReactJS test frameworks like Enzyme, on the other hand, work at a component level. Since the events recorded in this approach are at a component level, this could possibly be a way to perform integration testing too, without having to rely on the Selenium JSONWire protocol. I am playing with the idea of creating an integration testing framework, and would love to hear if you have suggestions or want to collaborate.
I am also trying maya-kai on all the ReactNative applications I can lay my hands on, to ensure that it works for all events and covers all edge cases. If you have a ReactNative app or know of one and want to see how it works with maya-kai, please leave a comment below and we can try it out together :)

Time Travel (Debugging) with ReactNative

One of the things I love about React and ReactNative is the emphasis on DX (Developer Experience), in addition to the amazing UX (User Experience). I was particularly fascinated by the Time Travel Debugging feature in Redux.
Being a fan of Time Travel (the last 3 books I read were stories of time travel; I am re-watching Dr. Who these days), I started experimenting with bringing true Time Travel to ReactNative. Recently, folks from the Chakra team showed off support for time travel debugging in Node at NodeSummit. In a previous blog post, I had also explained how I was using a node process to enable VSCode to debug ReactNative apps.
Putting these together, I was able to enable time travel debugging for ReactNative using VSCode and Chakra core - here is a demo.




Link: https://www.youtube.com/watch?v=waiZsNI4SYA

What is Time Travel Debugging ?

In addition to typical debugging actions like stepping into or stepping over code, time travel also allows developers to "step back" into previous statements, inspecting the state of variables backward in time. This is typically achieved by first recording user actions and then replaying them with a debugger attached.

Chakra TTD ? 

To record a new debug session in Chakra, the -TTRecord flag is passed. With this flag, all the states of the app at various points in time are recorded and finally written into a folder when the node process exits. The node process is then started with a -TTReplay flag that replays the actions. VSCode can attach to that node process as a debugger and supports stepping back into statements, in addition to the usual debug workflows. The Chakra Core page has more information about how to get the latest builds and try it out.

Chakra and ReactNative

When the developer selects "Debug JS Remotely" from the menu, ReactNative is put into a proxy mode. The packager then simply opens Chrome and runs all the ReactNative code in Chrome. Chrome DevTools can be attached to this Chrome page, making features like breakpoints or watches possible.
In the VSCode ReactNative debugger, we replace Chrome with a Node process that VSCode can attach to, as a debugger.

Try it today

As VSCode simply runs Node, I just needed to replace Node with Node-Chakra. Alternatively, here is a version of the code extracted from the Chrome debugger. You can download this and run it using Node or Node-Chakra. When debugging is started from the app, all instructions are now executed in this process. There are many IDEs and tools that can attach to and debug node apps - any one of them could be used.

Check out the ReactNative VSCode extension that I am working on, or follow me on twitter for updates on this experiment :) 

Using Cordova plugins in ReactNative

A tool to use Cordova plugins with React Native - link

As ReactNative is maturing into a stable platform to create mobile applications for iOS and Android devices, developers are starting to need more native modules to leverage device APIs like bluetooth or the camera. Apache Cordova (formerly called PhoneGap) is a similar runtime that can be used to build mobile applications, but it displays the user interface using a WebView. Apache Cordova also has a large ecosystem of plugins that enable these WebView-based applications to call native code.
In a previous blog post, I had written about the project where a ReactNative application could use the existing Cordova plugins to call device APIs.
The project has evolved and supports many more plugins and features. Here is a kitchen-sink-like application where many Cordova plugins are added to a ReactNative application.



The project has now been updated to support ReactNative 0.19+, and can also use Cordova 6.0+ plugins.
All these changes, including the instructions on how to add the react-native-cordova-plugin into a ReactNative project, are in the README file. 

Some of the big changes include
  1. Support for plugins that run on initialization. For example, cordova-plugin-device runs as soon as the app is initialized and makes a window.device object available, populated with attributes about the device.
  2. Added support for event listeners, so that plugins like geolocation or cordova-plugin-device-orientation can now subscribe to get updates when the compass changes (a sketch of both APIs follows this list). 
  3. ReactNative 0.18+ changed the MainActivity and how native modules are included. These changes are now available in the plugin. 
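For illustration, this is roughly what the two Cordova APIs mentioned in the list look like once the corresponding plugins are active; how the adapter surfaces them inside a ReactNative component may differ, so treat this as a sketch of the Cordova side only.

    // The device plugin populates window.device during initialization.
    console.log(window.device.platform, window.device.model, window.device.version);

    // cordova-plugin-device-orientation exposes a watch API that fires on compass changes.
    var watchId = navigator.compass.watchHeading(
      function (heading) { console.log('magnetic heading:', heading.magneticHeading); },
      function (error)   { console.warn('compass error:', error.code); },
      { frequency: 500 }   // milliseconds between updates
    );
    // navigator.compass.clearWatch(watchId) stops the updates.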
Currently, the plugin adapter only supports Android and I am learning iOS to add iOS support too. Check out the project on github and open an issue to ask questions about the integration or to help with fixing a bug.

Using Webworkers to make React faster

TL;DR: ReactJS is faster when Virtual DOM reconciliation is done on a Web Worker thread. Check out the difference on the demo page

A typical ReactJS application consists of two parts - the React library, responsible for most of the complex Virtual DOM calculations, and React-DOM, which interacts with the browser's DOM to display contents on the screen. Both of these are added to the page using script tags and run in the main UI thread.
In a blog post a few weeks ago, I had written about an experiment where I tried to run the React Virtual DOM calculations in a Web worker instead of main UI thread of the browser. I had also run performance measurements to understand the impact of parameters like node count or parallel workers on frame rates.

Recap of previous results

The frame rate numbers in themselves were not conclusive from the previous implementation. It was observed that the real benefit of Web Workers only surfaced when there was a sufficiently large number of nodes to change. In fact, the performance of Web Workers was worse than the normal React implementation when the node count was as small as it is in most typical applications.

Updates and new results

The reason the Web Worker case was slow was the time spent passing and processing messages between the Web Worker and the main UI thread. I was trying to solve this problem by finding an optimal batch size so that the message processing time would be much less than the actual DOM manipulation time. While tweaking the batch size did not yield great benefits, I got a couple of good suggestions from folks on the internet.  
  1. The first suggestion was to use transferable objects instead of using JSON data to pass messages. The DOM manipulation instructions I was passing between the worker and the UI thread did not have a fixed structure. Thus, I would have to implement a custom binary protocol to make this work.
  2. The second suggestion was to simply use JSON.stringify when passing messages. I guess this is similar to transferable objects, just that in this case, it is a big blob of 8-bit characters. There is also a comment about this by one of the IndexedDB authors.
By 'stringifying' all messages between the worker and the main thread, React implemented on a Web Worker became faster than the normal React version. The performance benefit of the Web Worker approach starts to increase as the number of nodes increases. 
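As a rough illustration of the second suggestion, the instructions that cross the worker boundary are flattened into a single string before being posted; the message shape and file names below are made up for the example.

    // Inside the worker: DOM mutation instructions become one flat blob of characters
    // instead of a deep object graph that the browser has to walk and copy.
    var mutations = [
      { node: 42, type: 'setAttribute', name: 'class', value: 'highlight' },
      { node: 43, type: 'textContent', value: 'updated row' }
    ];
    postMessage(JSON.stringify(mutations));

    // On the main UI thread: parse once and hand each instruction to the code that
    // touches the real DOM (applyMutation stands in for that code).
    var worker = new Worker('worker-bundle.js');
    worker.onmessage = function (e) {
      JSON.parse(e.data).forEach(applyMutation);
    };
    function applyMutation(m) { /* translate the instruction into a real DOM call */ }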

I wrote an automation script to calculate the frame rates using browser-perf, and here is the chart. The tests were run on desktop Chrome on a MacBook Pro, and on a Nexus Android device.
As the number of nodes gets to more than 100, the difference is not very visible. To make the difference explicit, here is the same chart with the frame rates on a logarithmic scale when running on desktop Chrome.

As you can see from the charts, the React Worker version is at least as fast as, if not faster than, the normal version. The difference starts to get more pronounced as the number of nodes increases.
A good experiment should be reproducible, and you can use these instructions to run the tests and collect the information, or simply use Chrome's FPS meter to see the difference between the worker and normal pages.

A real world app

While it worked well on an artificial app like DBMonster, it is also important to test this idea on typical real world apps. I wrote a todo app that also serves as an example to show the changes needed in a React app to make it work with Web Workers. The changes are not many; we basically need to separate React and React-DOM into the worker and main threads respectively.
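The split looks roughly like the following; the bridge module names are placeholders for whatever worker-side shim is used, not the exact API of the experiment.

    // index.js (main UI thread): only a thin shim lives here. It receives the
    // serialized mutations from the worker and applies them to the host DOM node.
    var worker = new Worker('app.worker.js');
    WorkerDOMBridge.attach(worker, document.getElementById('todoapp'));   // WorkerDOMBridge: placeholder

    // app.worker.js (worker thread): React, the components and all reconciliation
    // run here; a worker-side "DOM" shim posts the resulting mutations back out.
    importScripts('react.js', 'react-worker-dom.js');                     // file names are assumptions
    ReactWorkerDOM.render(React.createElement(TodoApp), 'todoapp');       // ReactWorkerDOM: placeholder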

Browser Events

A web worker does not have access to the browser DOM and hence cannot listen to click or scroll events. Presently, React has an event system with a top-level event listener that listens to all events, converts them into synthetic events and sends them over to the listeners that we define in the Virtual DOM (in JSX files).
For our Web Worker case, I re-use this event listener and subscribe to all events. Thus, all events are handled in the main thread, converted to synthetic events and then passed over to the worker. This also means that all the calculations to create synthetic events happen in the main thread. A potential improvement would be passing the raw events over to the worker and calculating synthetic events and bubbling on the worker.
The other issue is about semantics like preventDefault() or stopPropagation(), as also described in the Pokedex article. Responding to an event in a browser is synchronous, while passing messages and getting a result back from a web worker is asynchronous. Thus, a way is needed to determine if we need to prevent the default action even before the event handler running on the worker can tell us.
At the moment, I simply prevent all default actions, but there are two options here to ensure correct behavior. As vjeux suggests, we could use a pure function that can be serialized and sent to the main UI thread from the worker. Another option would be to prevent the current event and raise another event in case preventDefault is not called.
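A crude version of the current workaround looks like this; it only shows the shape of the problem, not the project's actual code, and the forwarded message format is invented for the sketch.

    // Main UI thread: the worker's verdict arrives asynchronously, so we cannot wait
    // for it before deciding whether to call preventDefault(). The blunt workaround
    // is to always cancel the default action and forward the event.
    var worker = new Worker('app.worker.js');
    document.addEventListener('click', function (e) {
      e.preventDefault();                       // decided before the worker ever sees the event
      worker.postMessage(JSON.stringify({
        type: 'click',
        // a real implementation would map e.target back to a virtual node id;
        // the element id is used here only to keep the sketch self-contained
        target: e.target.id
      }));
    }, true);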
I am still exploring the options and as other frameworks start offloading work to web workers, I am sure we could come up with a pattern.

Next Steps

The tests tell me that the Web Worker version is consistently at least as fast as the normal version, and faster as the node count grows. Maybe we are in an era where Web Workers can finally be used by mainstream Javascript frameworks to offload expensive computations.
My implementation may have some gaps and I would like to try it out on more real world apps. If you have an app suggestion and would like to try it out, I would love to work with you. You can either ping me, or head over to the github repo to send in pull requests!

Writing a custom debugger for ReactNative

ReactNative enables us to build mobile apps that have the elegance of a native user interface while taking advantage of a fast, web-like development process. The creative use of Chrome devtools to debug the JavaScript code is definitely a big plus in the workflow of a developer. While I love Chrome for debugging, I still prefer to set breakpoints or watch variables right from within my editor. This way, I still benefit from editor features like syntax highlighting and autocomplete, support for my backend system, and simply having fewer windows cluttering my desktop.
Over the past few weeks, I was experimenting with ways to add debugging to my editor, and this post is an explanation of how to add custom debuggers to ReactNative. Our team is planning to add debugging, along with a bunch of other features, to an extension for VSCode that we plan to release.

ReactNative Debugger today


Before writing a custom debugger, it is useful to appreciate how the existing setup works. I found an article that has an excellent explanation, though it is for an older version. The biggest change from the article is the use of a web worker in order to provide an isolated sandbox for the running scripts.
I created an "old-style" UML sequence diagram, hoping to capture most of the concepts without going too deep into the details.



The full SVG file may be easier to read.  Most of the messages have a direct correspondence to methods in the source code.

Path to a Custom Debugger

When trying to implement a custom debugger, I considered the following approaches
  1. Attaching a Javascript debugger directly to the Javascript VM packaged with the app on the device. This is probably the most accurate debugger since you are debugging the code running in its real environment. I believe that the NativeScript debugger uses this approach, but it was a little hard to implement.
  2. Create a parallel JSDebuggerWebSocketClient class to send messages to a process that I write, instead of sending them to the packager. While my process would have all the necessary debug hooks, I would still need to get source files and source maps from the packager.
  3. Simply attach a debugger to the running Chrome process. This seemed like the simplest case, but I was not a fan of having Chrome open and using it to just execute Javascript.
I finally settled on a variation of the third approach: instead of opening Chrome, I open a headless Node process and attach a debugger to that. My Node process simply needs to open a web socket connection to the packager, and the debug session is then redirected to the new Node process. Most editors already have excellent support for debugging Node.
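In rough terms, the replacement process only has to speak the packager's debug websocket protocol the same way the Chrome debugger page does; the URL and message fields below are assumptions from observing that page, not a documented API.

    // Hypothetical headless debug target for the packager.
    var W3CWebSocket = require('websocket').w3cwebsocket;   // W3C-compliant drop-in client
    var socket = new W3CWebSocket('ws://localhost:8081/debugger-proxy?role=debugger');

    socket.onmessage = function (message) {
      var request = JSON.parse(message.data);
      // Execute whatever the packager asks for (load the bundle, run batched calls)
      // inside a sandboxed context, then answer with the request id so the app continues.
      var result = executeInSandbox(request);               // executeInSandbox: illustrative helper
      socket.send(JSON.stringify({ replyID: request.id, result: result }));
    };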

Refining the debugger

Since the packager now proxies to the Node process instead of Chrome, some improvements are needed in the Node process:
  • In the case of the Chrome debugger, ReactNative modules are loaded using the webworker construct "importScripts". A Node process does not have a simple way to load scripts from a web server, so we had to implement a way to download the code and "require" it using runInNewContext (a sketch follows this list). The sandboxed context also provides the code isolation that the Web Worker offers.
  • Sourcemaps also have to be downloaded and changed so that they point to source files on the local system. 
  • For websocket capability in the Node process, we could use the websocket npm module, which provides an excellent, W3C-compliant interface that can be used as a drop-in replacement.
  • Instead of requiring the user to shake the phone to enter into the debug mode, we could run adb shell am broadcast -a "com.rnapp.RELOAD_APP_ACTION" --ez jsproxy=true to enable proxy mode on the app. 
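A minimal sketch of the first point, replacing the worker's importScripts with a download plus runInNewContext; the bundle URL and the contents of the sandbox are assumptions.

    // Hypothetical importScripts() replacement inside the Node debugger process:
    // fetch the bundle from the packager and evaluate it in a sandboxed context,
    // giving the same kind of isolation a Web Worker would provide.
    var http = require('http');
    var vm = require('vm');

    function importScripts(url, sandbox, callback) {
      http.get(url, function (res) {
        var source = '';
        res.on('data', function (chunk) { source += chunk; });
        res.on('end', function () {
          vm.runInNewContext(source, sandbox, { filename: url });
          callback();
        });
      });
    }
    // e.g. importScripts('http://localhost:8081/index.ios.bundle?platform=ios', sandboxGlobals, onReady);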
However, we still suffer from one problem. ReactNative hardcodes the fact that Chrome needs to be launched when debugging starts. If Chrome connects to the packager's websocket fast enough, our Node process will not work. 
Here is a pull request that looks at an environment variable and then launches a custom process, instead of defaulting to Chrome. This is similar to the way custom editors can be launched from ReactNative. I hope that the pull request is merged soon, so that custom debuggers can be added. 

The final product

Putting all of this together, a demo video of the capabilities is up on youtube. We plan to release it as part of a VSCode + ReactNative extension. In addition to debugging, you would also have support for Javascript and JSX syntax highlighting, autocomplete, and ways to call ReactNative commands from within VSCode.
You can also sign up for a preview. If you have additional feature requests or ideas that you think we should implement, please ping me and our team would love to talk to you.