Extensive experience in designing and implementing highly scalable systems. A Bauhaus-inspired approach to system design: simplicity, sensibility, and honest workmanship.
Research at the intersection of three massive trends: machine learning, cloud computing, and crowdsourcing. Focused on making sense of Big Data using warehouse-scale computers (WSCs) and Spark, a Hadoop-like MapReduce engine, together with other lab-developed technologies such as Mesos, Shark (a Hive-style SQL engine on Spark), and Tachyon (an in-memory counterpart to HDFS).
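For illustration, a minimal sketch of the kind of Spark job this stack supports, using the classic RDD API; the input path, record format, and object name are hypothetical, not drawn from any specific project:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical sketch: count log lines per error code across a large dataset.
object ErrorCounts {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ErrorCounts")
    val sc = new SparkContext(conf)

    // Hypothetical HDFS input path.
    val lines = sc.textFile("hdfs:///logs/*.log")

    // Keep only error lines, key them by error code, and sum the counts.
    // Spark holds intermediate data in cluster memory, which is what makes
    // iterative and interactive analysis faster than disk-based MapReduce.
    val errorCounts = lines
      .filter(_.contains("ERROR"))
      .map(line => (line.split(" ")(1), 1)) // assumes the code is the 2nd token
      .reduceByKey(_ + _)

    // Hypothetical output path.
    errorCounts.saveAsTextFile("hdfs:///logs/error-counts")
    sc.stop()
  }
}
```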