Tuesday 5 November 2019

What it takes to be a Deep Learning Engineer

By Akash James


"The race for AI will dwarf any other race relative to the mystic realm of technology."

No, I’m not quoting anyone, just saying what I often tell myself.

Most people just see technology as a creature comfort, but step into my shoes and put on my spectacles and you'll see art that lures you in to become an artist. The wheel, in my opinion, was the best invention, and it has stayed that way since 3500 B.C. Fast forward five millennia and we still use this humble yet irreplaceable invention. But hey, why not create a new contender to the wheel (my narcissism is getting ahead of me right now!), something that interweaves into human existence, like a cybernetic triple-helical DNA structure where the third strand is Extended Intelligence. Yes, I didn't say Artificial Intelligence, rather Extended Intelligence, where our capabilities are enhanced by our own creation. Intelli-ception, maybe?

When I began my engineering degree, I had a plethora of technologies to amalgamate my consciousness in. I've had my fair share of experience with Android apps, robotics and the Internet of Things, but just as I was walking through this Odin's Vault of technology, I stumbled upon the Infinity Gauntlet of Artificial Intelligence: Deep Learning. With my eyes fixed on it, I went ahead to wield the gauntlet and snap something awesome into existence once I had all my infinity stones. Of course, the infinity stones are just an analogy for things like neural networks, algorithms, math and so on. Boy, oh boy, collecting the infinity stones is no joke.

After completing engineering with a bunch of projects that had deep learning coursing through their CUDA cores, I joined Integration Wizards Solutions. With an Azure Hackathon as a stepping stone, I was given the opportunity to flex my fingers with the gauntlet and weave solutions laced with deep learning. This is where I used object detectors to detect a variety of object instances for compliance verification, MTCNNs to recognize people, and keypoint detection for pose estimation. This product is what we call IRIS.

Initially, it began with training models and getting our algorithms to work in a controlled environment; a Proof-of-Concept, as my folks at work and a lot of you call it. But then the production-level work began. At times it felt like being stuck in a cave and needing to build a miniaturized arc reactor in a fortnight. Train models, code the business logic, design functionality, unit test, optimize, refactor and scale for load: those are some of the steps, in chronological order. Being a Deep Learning Engineer requires a lot of ingenuity and careful rationing of your time.

C'mon, I need 21 minutes every day to watch my favourite anime. Where else would I derive that will of steel that keeps me from giving up?

Given the trial-and-error nature of training models, it takes a lot of clever decisions (what we call hacks) around dataset augmentation and hyper-parameter tuning to trick neural nets into doing what we want them to do. Sorcery it is! Scaling is where all the roadblocks begin. One challenge we faced was creating a pipeline that could accommodate 200-odd cameras for real-time object detection inference.
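
To give a taste of what those augmentation hacks look like in practice, here is a minimal sketch using TensorFlow's tf.image ops in a tf.data pipeline; the image size, jitter ranges and batch size are illustrative assumptions, not the actual IRIS training setup.

    import tensorflow as tf

    def augment(image, label):
        # Random flips and photometric jitter help the detector generalise
        # beyond the lighting and camera angles in the original dataset.
        image = tf.image.random_flip_left_right(image)
        image = tf.image.random_brightness(image, max_delta=0.2)
        image = tf.image.random_contrast(image, lower=0.8, upper=1.2)
        return image, label

    # Dummy tensors stand in for real annotated frames so the sketch runs as-is.
    images = tf.zeros([8, 416, 416, 3])
    labels = tf.zeros([8], dtype=tf.int32)
    dataset = (tf.data.Dataset.from_tensor_slices((images, labels))
               .map(augment, num_parallel_calls=tf.data.AUTOTUNE)
               .batch(4))

Applying the augmentation lazily inside the input pipeline keeps it from ever becoming the bottleneck while the GPU trains.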

There was a need for speed, and accuracy was a priority too. The result was a neural network that was very demanding. We countered this with six NVIDIA RTX 2070s, a Flask server powered by Gunicorn, TensorFlow and a pinch of awesomeness. We used TensorRT to run an optimized frozen INT8 graph at 100+ fps.
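
For a feel of that serving setup, here is a minimal sketch of a Flask endpoint of the kind Gunicorn would fan out across worker processes; the route, payload format and the stubbed detect() function are assumptions for illustration, not the real IRIS API.

    import numpy as np
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    def detect(frame):
        # Stub for the TensorRT-optimised detector each worker would load once.
        return [{"label": "person", "score": 0.97, "box": [0.1, 0.2, 0.4, 0.8]}]

    @app.route("/detect", methods=["POST"])
    def detect_route():
        payload = request.get_json()
        frame = np.array(payload["frame"], dtype=np.float32)  # decoded camera frame
        return jsonify({"detections": detect(frame)})

    # Launched behind Gunicorn so many cameras can post frames concurrently:
    #   gunicorn -w 6 -b 0.0.0.0:8000 inference_server:app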

When deploying this, you don't want to accidentally create an Ultron that goes rogue and raises false alarms (no strings attached is a bad thing, trust me). With a tad bit of classical Computer Vision techniques thrown into the mix, we were able to suppress the false alarms. Another project required us to combine tracking and detection for intrusion detection. Detection was GPU-intensive and tracking was CPU-intensive, so a balance was needed to share the load and run in the most optimal manner.
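
One common way to strike that balance, and roughly the pattern we leaned on, is to run the expensive detector only every few frames and let a cheap tracker fill the gaps. A rough sketch, with placeholder detector and tracker functions standing in for the real ones:

    DETECT_EVERY = 5  # run the expensive detector on every 5th frame only

    def run_detector(frame):
        # GPU-bound: a full object detection pass (placeholder result).
        return [{"box": (50, 60, 120, 200), "label": "person"}]

    def update_trackers(tracks, frame):
        # CPU-bound: a lightweight tracker (correlation filters, optical flow, etc.)
        # nudges the last known boxes to follow motion between detections (placeholder).
        return tracks

    def process_stream(frames):
        tracks = []
        for i, frame in enumerate(frames):
            if i % DETECT_EVERY == 0:
                tracks = run_detector(frame)             # refresh with fresh detections
            else:
                tracks = update_trackers(tracks, frame)  # cheap updates in between
            yield tracks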

This experience led me to believe that mastering the art of Deep Learning involves mastering other elements of technology too, be it writing APIs that serve inference, multithreaded code for increased throughput, or networking to handle a multitude of input sources.
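
As an example of the multithreading bit, here is a small sketch where each camera gets its own reader thread feeding a shared queue that the GPU loop drains in batches; the RTSP URLs and queue size are made up for illustration.

    import threading, queue
    import cv2

    frame_queue = queue.Queue(maxsize=256)

    def camera_reader(rtsp_url, cam_id):
        cap = cv2.VideoCapture(rtsp_url)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            try:
                frame_queue.put((cam_id, frame), timeout=1)
            except queue.Full:
                pass  # drop frames rather than fall behind real time

    urls = ["rtsp://camera-01/stream", "rtsp://camera-02/stream"]  # hypothetical feeds
    for i, url in enumerate(urls):
        threading.Thread(target=camera_reader, args=(url, i), daemon=True).start()

    # The inference loop pulls (cam_id, frame) pairs off frame_queue, batches them,
    # and hands each batch to the GPU detector.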

Every day there is a call for code, a new mountain to conquer, a new challenge to accomplish. Every new infinity stone I collect brings me one step closer to completing the masterpiece I envision: a contender that'll give the humble wheel a run (rather, a roll) for its money, all built on the shoulders of Extended Intelligence. *snap*
