Sauce Labs' Jason Huggins: App Testing Is for the (Angry) Birds
"I wanted a tool that I could give to all of my developers and not have to worry about license fees. Open source means that I can very quickly and easily get it to all of my developers," said Jason Huggins, cofounder and CTO of Sauce Labs. "It reduces the friction of getting contributions for people. If it is free to download and is free to use, that means you can skip the part where you talk to the sales guys."
05/14/13 5:00 AM PT
Jason Huggins took Web browser and website testing to new levels. Pushed by several Aha! moments, he recognized a pressing need for automation in applications testing.
He also discovered that no existing proprietary software provided cross-platform features. What did not exist in a marketable box, Huggins built as an out-of-the-box open source solution using the Selenium software he created.
Not only did he develop his free software into a well-populated community to support and further develop the Selenium Project, he also expanded his concept of automated cross-platform app testing to include advanced features from another open source project. In the process, Huggins invented a robot that plays Angry Birds, with the ultimate goal of automating device testing.
In August 2008 Huggins launched Sauce Labs to advance his Selenium Project. Sauce Labs created a cloud environment for testing Web apps against nearly 100 browser and operating system combinations; it is used to run about two million tests each month.
In this interview Jason Huggins discusses why these open source projects are best suited for the emerging field of mobile app testing.
LinuxInsider: What led to starting up Sauce Labs?
Jason Huggins: Two moments were definitive in my realizing that this activity would make a good company. I was working as a consultant using Selenium to do browser testing. As happens with technology, when you solve one problem in testing, a secondary problem comes up that needs a new solution. The process of testing apps in different browsers was taking too long. It was in excess of 20 minutes. The app developers were starting to complain. If it took much longer, they were going to sever the contract.
LI: How did you solve the problem and save the contract?
Huggins: All of the tests were running on one machine. So I suggested that we get four machines and divide up the tests so we could divide and conquer. Then each machine's share of the tests took five minutes, and all the developers were happy again. That was a karma moment. I began wondering if those results were repeatable.
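The divide-and-conquer split Huggins describes can be sketched in a few lines. This is a minimal illustration, not Sauce Labs' actual infrastructure: the test functions are hypothetical stand-ins for browser tests, and threads stand in for the four machines.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for browser tests; real Selenium tests would
# drive a browser instead of sleeping.
def make_test(name, seconds):
    def test():
        time.sleep(seconds)
        return name
    return test

tests = [make_test("test_%d" % i, 0.01) for i in range(20)]

def run_serial(tests):
    # Baseline: one machine runs every test, one after another.
    return [t() for t in tests]

def run_parallel(tests, workers=4):
    # Divide and conquer: farm the same suite out to N workers.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda t: t(), tests))

results = run_parallel(tests)
print(len(results))  # all 20 tests still run, in roughly a quarter of the time
```

The same suite runs either way; only the wall-clock time changes, which is the whole point of the "Selenium farm" approach he mentions next.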
LI: What was your second karma event that convinced you to start Sauce Labs?
Huggins: Around the same time Google was recruiting me. Internally at Google they were working on Google Docs and stumbled upon that same problem. They were running a complicated series of tests that was taking over an hour. So I applied that same bright idea and divided up the tests among a group of machines. The entire billable time was drastically reduced.
LI: So your focus was on developing a software project more than going into an official business?
Huggins: With all of Google's apps running on what we called the Selenium farm (a bank of computers running Selenium for testing), I started wondering if we had enough for a standalone company. I wondered if this was a successful solution to a problem. If there were lots of people having this problem, was there a thing there?
Another thing developed that became a key enabler. Amazon came out with its cloud, the Amazon Web Services (AWS). That was something that could enable Sauce Labs to exist. Since then, we built out our own cloud and have our own data center.
LI: How much competition or demand for application testing, either proprietary or open source-based, existed five or six years ago?
Huggins: Specifically, I needed a tool that would help me test this JavaScript and these complicated features in Firefox and in Internet Explorer. You can't do that on just a Macintosh computer or just a Windows computer. Regardless of open source or proprietary, there were no solutions to let me test on any browser on any operating system. And on top of that was a third requirement: to do it in my favorite programming language, or any language for that matter. The proprietary tools were specifically Windows-based, and you could only drive them with Visual Basic.
LI: What gives open source products for testing an edge over proprietary products?
Huggins: Selenium's success came from its ability to test Firefox and IE on Windows or Mac or Linux and be able to drive it from Ruby or Python. It was a lot of work for that flexibility. But the specific thing is that if there was a proprietary tool at the time that could do all the things Selenium does, I would have just used that. I have other things to do with my life. But back then and still true today, the proprietary tools from a features point of view were too incomplete. So we had to go create an open source tool.
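The cross-platform flexibility Huggins describes amounts to enumerating a browser-by-OS matrix and handing each combination to a remote Selenium grid. Here is a minimal sketch; the browser and platform names are illustrative, not Sauce Labs' actual capability strings.

```python
from itertools import product

# Illustrative names only; a real grid uses its own capability vocabulary.
BROWSERS = ["firefox", "internet explorer", "chrome"]
PLATFORMS = ["Windows", "Mac", "Linux"]

def build_matrix(browsers, platforms):
    """Enumerate the browser/OS pairs a remote Selenium grid could run."""
    combos = []
    for browser, platform in product(browsers, platforms):
        if browser == "internet explorer" and platform != "Windows":
            continue  # IE only exists on Windows
        combos.append({"browserName": browser, "platform": platform})
    return combos

matrix = build_matrix(BROWSERS, PLATFORMS)
print(len(matrix))  # 3x3 grid minus the two impossible IE combinations = 7
```

Scaling this idea up to many browser versions and OS releases is how a service arrives at the "nearly 100 browser combinations" mentioned earlier.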
LI: How surprised were you over the early success of the Selenium Project?
Huggins: I would have thought we would become famous for the project we created Selenium for. I did not think this testing thing was actually going to take off. It did.
LI: Were there other advantages to developing the open source tool rather than a proprietary one?
Huggins: At that point with me being at Google, there was more of a desire to do open source kinds of things. I wanted a tool that I could give to all of my developers and not have to worry about license fees.
Open source means that I can very quickly and easily get it to all of my developers and anyone else involved in the project. It reduces the friction of getting contributions from people. If it is free to download and is free to use, that means you can skip the part where you talk to the sales guys, and you could skip a whole bunch of other stuff.
The other factor is the open source community. The Selenium Project has a really big community now. There are hundreds of thousands of people with Selenium experience now. That is a huge benefit for open source.
LI: What differentiates the Selenium and Appium projects?
Huggins: Appium was developed to test not just websites but also mobile apps, both Web and native applications, all using the same tool. It still sort of talks the same Selenium language. So someone who knows how to use Selenium is still using the same libraries and the same APIs to drive it.
The capabilities we provided at the back end specifically targeted mobile as the platform, but with Appium it is both the native apps and the Web apps. We were specifically targeting the iPhone and iOS. We will be branching that out to include Android a little later. Selenium was always just about website testing.
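Concretely, "talking the same Selenium language" means a client describes the mobile target in a WebDriver-style capabilities dictionary and hands it to the Appium server. A hedged sketch, with the app path and device name as illustrative placeholders:

```python
# Sketch of WebDriver-style capabilities retargeted at mobile.
# Capability names follow the Appium convention; the app path and
# device name below are hypothetical placeholders, not real values.
def ios_caps(app_path, device="iPhone Simulator"):
    return {
        "platformName": "iOS",
        "deviceName": device,
        "app": app_path,
    }

caps = ios_caps("/path/to/MyApp.app")
print(sorted(caps))
# An Appium client would then start a session against a running server,
# roughly: webdriver.Remote("http://localhost:4723/wd/hub", caps)
```

The session itself is then driven through the same find-element and interaction calls a Selenium user already knows.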
LI: Tell me about your robot's role in device testing.
Huggins: I have always been kind of a tinkerer interested in mechanical and electronics things. I have been dabbling with that for a long time. Officially, it is a fun side project. The robot was a nice thing to merge my interests in mechanical and electrical things with testing.
I had this notion while working with Selenium that it was sort of like a robot trapped inside a computer screen. A robot and Selenium are similar in the sense that you are telling them to go to a particular place and touch something or do something. Instead of having Selenium being kind of a virtual finger touching dots on a screen, I decided to make a robot that was kind of like a real finger clicking a button.
LI: That sounds like another out-of-the-box solution, like splitting tests over a bank of computers.
Huggins: It is a little less ridiculous when you are putting in mobile testing on tablets and phones and things like that. The goal of Selenium is to come as close as possible to doing the things a real user would do. With these new mobile interfaces, you are touching a phone, and you are doing swipes and drags, not just typing on a keyboard.
LI: Is using a robot an effective logical approach?
Huggins: It is a different paradigm. So there is a different level of testing. I am still exploring if actual robots are potentially useful to do this kind of testing. There are other things to investigate as well, like what happens to your app when you rotate the device 90 degrees, or what happens when you shake the thing. There are always interface gestures that you can still simulate on the desktop, but there is a real need for developers now to be really confident that their gestures will work when people are actually using them on a real device.
So when you start with that assumption, then it was logical to say, "Okay, we have a real app on a real device, so how do I test this in as real a scenario as possible?"
LI: How did developers react to this testing approach?
Huggins: I used an iPad-style device and effectively built a robot around that, and moved it around the screen to actually touch buttons. As far as the developer is concerned, whether you use Selenium or Appium or the robot to run the test, it is essentially the same thing.
Testing is notorious for being a dry subject. And yet robots are really fun and interesting. There is nothing different between what the robot does and what Selenium and Appium do to test your iPad app. By embodying Appium and Selenium in a robot, all of a sudden a dry subject is popping off the screen.
LI: How responsive have app developers been to what Selenium and Appium provide to code testing?
Huggins: A couple of people on the project have taken the testing protocol to make it an official W3C specification. Web browser vendors would adopt the WebDriver stack, and then it would become the job of the browser vendor to support this kind of automation, as opposed to having all the heavy lifting done by the other side.
LI: How far along is that adoption process?
Huggins: That already has been adopted by the Opera project. Firefox has been working with it, as has the Google Chrome team. The long-term goal for the project is to have a certification. We held the first Selenium Conference to jump-start traction; we are now holding the third one this year. The Selenium Project is also a member of the Software Freedom Conservancy. These things were never really on our agenda when we started, but achieving them is a sign of our maturity as a project.