Debugging create-react-native-app with VSCode

Expo and Facebook recently released a command line tool called create-react-native-app that makes getting started with React Native easy. With this tool, you can start developing mobile applications for iOS and Android without having to install the SDKs for either platform. This is very similar to the Phonegap Developer App or Ionic View for Cordova apps: only the JavaScript part of the application is uploaded to the player app. Consequently, you can develop and debug iOS apps from a Windows machine without even needing to connect the phone over USB.
VSCode already supports debugging Expo apps and can attach to an already-running packager.
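
For reference, the extension scaffolds its debug configurations into .vscode/launch.json. The snippet below is a sketch of what those entries looked like around the time of writing - double-check the exact keys against what the extension generates for you:

    {
        "version": "0.2.0",
        "configurations": [
            {
                // launch the app in the Exponent player and attach the debugger
                "name": "Debug in Exponent",
                "type": "reactnative",
                "request": "launch",
                "platform": "exponent",
                "program": "${workspaceRoot}/.vscode/launchReactNative.js",
                "sourceMaps": true,
                "outDir": "${workspaceRoot}/.vscode/.react"
            },
            {
                // attach to a packager that is already running
                "name": "Attach to packager",
                "type": "reactnative",
                "request": "attach",
                "program": "${workspaceRoot}/.vscode/launchReactNative.js",
                "sourceMaps": true,
                "outDir": "${workspaceRoot}/.vscode/.react"
            }
        ]
    }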

To get started, download VSCode and install the React Native Tools extension. The extension gives you the ability to debug source code right from inside the editor, supports syntax highlighting and completion, and has code snippets for popular React constructs.

Time travel debugging

The extension uses Node to debug the Expo app. If we replace Node with node-chakracore, we also get the ability to use time travel debugging. To try out time travel debugging with create-react-native-app and VSCode:

  1. Download the node-chakracore nightly builds - they support Mac, Windows and Linux :)
  2. Grab the debugger code and save it as debugger.js - this is the same code that runs when a React Native app is debugged in Chrome
  3. To start recording a trace
    1. Create a folder called logs, adjacent to debugger.js
    2. Run <path-to-node-chakracore>/bin/node --record debugger.js
  4. To replay a debugging session 
    1. Start up VSCode, head to the debug pane and create a new configuration to debug a node project
    2. Add the additional key-value pairs to the configuration (as shown in the video, and sketched after this list). This basically sets the node executable to be ChakraCore, points it at the location of the logs used for time travel, and sets up source maps
    3. Hit the debug button - you will now see "Reverse Continue" and "Step back" in addition to the usual debug workflow controls
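
For the curious, the resulting launch configuration looks something like this. I am writing the runtimeArgs flag from memory of the node-chakracore nightlies, so treat the exact names as assumptions to verify:

    {
        "name": "Replay TTD session",
        "type": "node",
        "request": "launch",
        // use the ChakraCore build of node instead of stock Node
        "runtimeExecutable": "<path-to-node-chakracore>/bin/node",
        // replay the trace that --record wrote into the logs folder
        "runtimeArgs": ["--nolazy", "--replay-debug=${workspaceRoot}/logs"],
        "program": "${workspaceRoot}/debugger.js",
        // map the packager's bundled code back to the original sources
        "sourceMaps": true
    }
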
The time travel debugging part is still experimental, but if it sounds fun, let me know - I would love to make it a part of the extension. Thanks to the amazing folks on the Chakra team for making this possible!

ReactConf - Web-like Release and Development Agility

ReactConf 2017, Santa Clara

Links from the slides

  • Mobile Center and CodePush
  • VSCode for React Native
  • User Gesture Mirroring

Slides available on Docs.com - https://doc.co/19Qwun

Exponent Apps - Testing on Multiple Form Factors

Want to test your Exponent app on multiple screen sizes? Check out maya-kai.
Exponent is a great way to build mobile apps - you get all the benefits of React Native with none of the pain of installing the Android SDK or Xcode. Since Exponent is now integrated into VSCode, my developer workflow pretty much consists of firing up the VSCode editor and debugging the app running in Exponent on my devices.
Despite an efficient authoring workflow, testing my app across devices is still a little cumbersome. With multiple devices and screen sizes to support, I still have to run the typical test scenarios manually on every one of them.

User gesture mirroring for React Native

Last year, I blogged about a project that lets you mirror user gestures across devices for React Native apps - the same functionality that Browsersync provides for websites. The library lets you interact with just one device and, as the test scenario progresses, watch the interactions play out on the other devices with different screen sizes. Though the library was originally meant for React Native, the code is all JavaScript and uses React's event model. This blog post explores the idea of trying out the library with apps on Exponent.

Exponent + Maya-kai

Surprisingly, no changes were required to make it work with Exponent! I simply ran npm install maya-kai --save and imported it in main.js using var mk = require('maya-kai'); mk.start(); (see the sketch below). The devices (2 phones, 1 iPad) were on the LAN, so I just needed to ensure that the maya-kai server was accessible to them.
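
To make that concrete, here is a minimal sketch of the Exponent entry point - the root component is a placeholder, and the only maya-kai-specific part is the two lines described above:

    import Exponent from 'exponent';
    import React from 'react';
    import { View } from 'react-native';

    // maya-kai hooks into React's event model, so starting it once at app
    // startup is enough to begin mirroring gestures via the maya-kai server
    var mk = require('maya-kai');
    mk.start();

    class App extends React.Component {
      render() {
        // your app's real root component goes here
        return <View />;
      }
    }

    Exponent.registerRootComponent(App);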

To automate this even more, I added a task that also launches my app on all the connected Android devices when I start the tests.

adb devices | egrep '\t(device|emulator)' | cut -f 1 | xargs -t -J% -n1 -P5 adb -s % shell am start -a android.intent.action.VIEW -d exp://<exponent URL>
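
To unpack the one-liner: adb devices lists every connected device and emulator, egrep and cut extract just the device serials, and xargs fans out one adb -s <serial> shell am start call per device (up to 5 in parallel, with -t echoing each command), firing an android.intent.action.VIEW intent at the exp:// URL so Exponent opens the app. One caveat: -J is the BSD xargs replacement-string flag found on macOS; on Linux, use -I instead.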

I was unable to find an equivalent for iOS though.

Other developer workflow actions like live reload and hot module replacement also worked on all devices. General development is also faster since there is no native app "install" step here.

Wishlist for Exponent 

  • If user gesture mirroring were built into Exponent, the additional setup steps would not be needed anymore.
  • It would also be good to automate the process of launching the app for iOS devices when starting tests.  
  • There is also no dev vs. production mode, so I am currently using 2 different apps: one for testing, and the other being the real, published app without maya-kai embedded.

Conclusion

I think this workflow is great for quickly testing multiple form factors. I could simply publish my Exponent app, launch it on multiple devices, and interact with them. This is also a neat way to share a screen with remote clients - simply enable gesture mirroring and you can walk someone through your app's workflow.
I am also looking at integrating this with Appetize to enable a virtual device wall, something I previously did for Cordova applications.