Our MediaWallah Careers Q&A features one-on-one conversations with MediaWallah teammates about the company, career paths, and life beyond work. Today’s guest: Tulio Hernandez, Senior Data Engineer
What’s your role at MediaWallah?
My official title is Senior Data Engineer. In this role I’ve designed and implemented a robust ETL/ELT data pipeline that manages and automates the import of raw cookie and mobile data, data cleansing and transformation, and the export of datasets built to customers’ specifications, all running primarily in a cloud environment.
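As a rough illustration of the import → cleanse/transform → export flow described above, here is a minimal Python sketch. All function and field names are hypothetical, not taken from MediaWallah’s actual pipeline:

```python
# Illustrative sketch of an import -> cleanse -> export pipeline.
# Field and function names are hypothetical.

def import_raw(records):
    # Stand-in for pulling raw cookie/mobile data from a landing area.
    return list(records)

def cleanse(records):
    # Drop rows missing an identifier and normalize casing.
    return [{"id": r["id"].lower()} for r in records if r.get("id")]

def export_for_customer(records, fields):
    # Shape each row to the fields a customer's spec requests.
    return [{f: r[f] for f in fields} for r in records]

raw = [{"id": "ABC123"}, {"id": None}, {"id": "DeF456"}]
clean = cleanse(import_raw(raw))
out = export_for_customer(clean, ["id"])
print(out)  # [{'id': 'abc123'}, {'id': 'def456'}]
```

In a real deployment each stage would be a separate automated job; the point of the sketch is only the staged shape of the workflow.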
When did you join MediaWallah, and how has the company changed since you first joined?
I joined back in April of 2019. From the technology perspective, my mission was to ramp up our automation to support a rapidly growing customer base. With the automation infrastructure operational by the end of 2019, the company can now deliver better dataset products faster than ever. Additionally, as new business and customers come in and the economy continues to recover from the pandemic, I see the company expanding with new hires.
What are some ways you’ve advanced your own skill set since you’ve joined the company?
While I was very familiar with relational databases when I started at MediaWallah, I had very little experience with cloud databases like Snowflake. There are similarities, but there are also vast differences. Handling import/export jobs directly from inside the Snowflake database is truly a game changer. I’ve also advanced my skill set by interfacing directly with the customer base. In my previous jobs at hedge funds and investment banks, tech people were not allowed to interact with customers, so I’m very thankful for that opportunity.
What do you do in the day-to-day?
Well, I start my day with a long and arduous 40–55-second commute from my kitchen to the main command center — in my living room! I begin with my priority list, which mainly consists of urgent infrastructure modifications or fixes, fulfilling pressing requests from customers about exports or correcting data issues, and finalizing deployments of new automation processes. Next I review our monitoring feeds to ensure that the imports/exports and service automation processes are healthy, and review any pending emails that require my immediate attention. After that it’s on to onboarding new customers and configuring them into the automation workflow. In between I squeeze in time for customer meetings, code review, code deployment to production, more modifications and enhancements to the data pipeline, creating new data pipelines, and managing the databases in our Snowflake data infrastructure.
What are some things that excite you most about working for MediaWallah?
No matter what the underlying business, technology, or benefits a company offers, ultimately you have to interact with its people throughout the workday. And I have to say that it is a pleasure to deal with each and every one in the MediaWallah workspace. From the CEO, to the sales and marketing people, to the tech and product people, everyone is very smart, hardworking, and fun to be with.
What’s something people might not know about you?
I’ve liked music from an early age. When I was 4, I drove my parents nuts when I took their records, placed them on the floor, and started dancing on top of them. As a teenager, I learned rudimentary guitar playing from my sister. Subsequently I took classical guitar lessons for about a year, but my instructor was so strict that I had to quit. I did learn enough, though, to play at local coffee bars and weddings and to keep it up as a hobby. But I haven’t played in over 10 years, so my chops are a bit rusty.
What advice would you give to someone looking to advance their data engineering career?
I like this question because it deals with making the field of data engineering more efficient. There is a love affair with Python in the industry, and it may not be a healthy one. Some companies build their entire ETL/ELT workflow in Python, and that makes the whole end-to-end process inefficient and difficult to maintain. I’ve found that building the data workflow with a combination of Bash, SQL, Python, and the ETL/ELT tools of the underlying database is a far more fluid and efficient approach. So the advice is: don’t lump the entire data workflow into Python. Modern cloud analytical databases have built-in tools that perform the data workflow more efficiently and at lower cost.
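A minimal sketch of this advice, using Python’s built-in sqlite3 as a stand-in for a cloud warehouse like Snowflake (the table and column names here are illustrative, not from MediaWallah’s pipeline): rather than pulling every row into Python and aggregating in a loop, hand the cleansing/aggregation step to the database’s SQL engine.

```python
# Push the transformation into the database instead of looping in Python.
# SQLite stands in for a cloud warehouse; table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (device_id TEXT, event_type TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("a1", "click"), ("a1", "view"), ("b2", "click"), ("b2", "click")],
)

# The filter and group-by run inside the database engine, not in Python:
rows = conn.execute(
    """
    SELECT device_id, COUNT(*) AS clicks
    FROM raw_events
    WHERE event_type = 'click'
    GROUP BY device_id
    ORDER BY device_id
    """
).fetchall()

print(rows)  # [('a1', 1), ('b2', 2)]
```

On a warehouse such as Snowflake the same idea scales up: bulk import/export and transformations run as SQL inside the database, with Bash or Python only orchestrating the steps.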