CSAIL PhD student has built seven robots, and still finds time to meditate.

Fourth-year EECS student Julian Straub studies how robots can better understand their complex surroundings.

For Julian Straub, one man’s trash truly became his treasure when a microcontroller sparked a keen interest in the field of robotics and artificial intelligence. The German native is a fourth-year EECS student studying how robots can better understand their surroundings. Straub studied electrical engineering at the Technical University of Munich and completed a dual master’s program at the Georgia Institute of Technology before coming to MIT.

How early did you get the computer science bug?

I became fascinated with robots in high school when my dad gave me a microcontroller to play with, left over from one of his projects, and I thought, what am I going to do with that? I had done some programming before, but after that I became really excited about building robots. I loved that it sat at the intersection of computer science, electrical engineering, and mechanical engineering, and that what you programmed interacted with the world.

Then I did a robotics competition in high school that spurred my interest in simultaneous localization and mapping (SLAM), which means you are trying to determine where you are and what the environment looks like at the same time. During the competition, I was trying to bring ping pong balls to the other side of a field where there were only two passageways to roll them through. There was a moment when what I was working on wasn’t functioning robustly, and I thought to myself, there must be a better way! At that point I became very interested in AI.
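The core idea behind SLAM, jointly refining a pose estimate and a map from noisy measurements, can be hinted at with a minimal one-dimensional sketch. All numbers, names, and the simple blended-correction update below are illustrative assumptions, not taken from Straub's work; a real system would use something like an extended Kalman filter or a factor graph over many poses and landmarks.

```python
# Minimal 1D SLAM-flavored sketch: a robot moves along a line with noisy
# odometry and measures the noisy range to a single landmark at an
# initially unknown position. Both the pose estimate and the landmark
# estimate are corrected jointly from each measurement.
import random

random.seed(0)

true_landmark = 10.0   # actual landmark position (unknown to the robot)
true_x = 0.0           # actual robot position

est_x = 0.0            # pose estimate
est_landmark = 8.0     # deliberately wrong initial map guess
gain = 0.3             # illustrative blending factor for corrections

for _ in range(50):
    # Move one unit; the odometry estimate accumulates small errors.
    u = 1.0
    true_x += u
    est_x += u + random.gauss(0, 0.02)

    # Measure the distance to the landmark (also noisy).
    z = (true_landmark - true_x) + random.gauss(0, 0.02)

    # Innovation: measured range minus the range predicted by our estimates.
    innovation = z - (est_landmark - est_x)

    # Split the correction between the pose and the map estimate.
    est_x -= gain * innovation / 2
    est_landmark += gain * innovation / 2

# Without a global anchor only the relative geometry is observable:
# the estimated range converges even if each absolute estimate drifts.
residual = (est_landmark - est_x) - (true_landmark - true_x)
```

Note the final comment: with only relative range measurements, the robot-to-landmark geometry becomes consistent even though the absolute positions can drift together, which is a genuine ambiguity in SLAM systems without external anchors such as GPS.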

What are you working on at CSAIL?

I am looking at the problem of understanding what is going on in the environment, with applications in robotics, augmented reality, and virtual reality. I’m trying to capture notions of regularity in the environment, to make robots capable of dealing with the uncertainty and complexity of the real world, and to learn a representation of the world from the robot’s various sensor inputs. It goes back to the beginning: a robot driving around without knowing where it is.

What effect do you think your area of work will have on the world in the next decade?

There are all sorts of robots and computer vision systems emerging from labs and companies right now. Autonomous cars and virtual and augmented reality devices all need this interface with the real world to know where they are, what’s going on around them, and where they can go.

For a car, you want to know where buildings are so you don’t crash into them. For virtual reality, your headset needs to have an idea of the environment so you don’t walk into a wall. For augmented reality, you need to have a model of the world to be able to augment it. Algorithms trying to understand what the environment looks like are going to have a huge impact by enabling those technologies.

Beyond those areas there is also potential in medical applications. For example, I am currently working with fellow graduate student Karthik Rajagopal, two UROPs, and Professor Barzilay on lymphedema detection from 3D scans of a patient’s arm.

What is your favorite thing about doing research at CSAIL?

The people! Being surrounded by like-minded, energetic people that I can learn a lot from is very motivating. For example, I learned OpenGL in two days by sending an email to the vision and graphics mailing list. I had very useful replies within three hours, and the following discussions helped me pick up this new knowledge quickly.

What is the biggest challenge you face in your work?

It’s a tradeoff between how much of the environment your model can capture and the computational limitations, especially when you are constantly getting new data. What approximations can I make to the model and still be able to say something important about the environment? It’s about estimating uncertainties, handling model complexity, and figuring out what you can do in real time on a mobile system.

Tell us about a recent “Only at MIT” moment.

I love the Independent Activities Period (IAP). It’s one month of getting to try something completely outside of your field. I took a mindfulness workshop, and it was amazing. It teaches a specific flavor of meditation that gets you into a meditative state in 30 minutes. It changed my life by changing how I deal with stress: I now meditate for an hour every morning.

This openness to new things, teaching them at a school, and setting aside dedicated time for such activities: that’s unique to MIT.

Favorite place to get news about computer science?

I read Hacker News. I have an app on my phone and I read that every day. It's an efficient way of getting a lot of diverse news that’s not just limited to computer science.

If you could tell your younger self one thing, what would it be?

“Go vegan” and “meditate”! I guess that’s two things, but they’ve both had a big impact on me and, I think, made me a better person.