
A senior data engineer in search of her true potential

Meet Neha Jaiswal. She wakes up early and has a full-blown morning routine spanning self-care, family, and work. She lives in Bhopal, India, also known as the city of lakes. She loves spending time with her family and, in her spare time, indulges in gardening, oil painting, and watching TV shows. She has travelled extensively across the length and breadth of India. Her friends describe her as jovial and caring.
 
Neha joined Ascendion in August 2021, and her tech stack includes Python, SQL, Google Cloud Platform, Hive, Hadoop, and R. In this intriguing conversation, we speak to Neha about what drives her and how this stint has helped her discover her true potential.
 

How it all began

Hello, I am Neha. From childhood, I was always fond of computers and their technologies. As I grew older, my elder brother suggested that I explore more options, figure out my own areas of interest, and do my research to make an informed decision. That is when I was introduced to the world of technology. I was fascinated by it, and that led to my journey of pursuing Computer Science.
 
During my graduation at GITAM University, I relished studying Java and Python. I used to spend a lot of my time exploring coding libraries and testing things out. This helped me land my first job as a Python Developer at one of the top MNCs.
 
In 2021, when I was interviewing with Ascendion, it was a very different experience. I had offers from two or three companies, but what sealed the deal with Ascendion was that even before I joined the company, the leadership was approachable and provided all the clarifications I needed.

My requirements, goal alignment, and many other questions I had were very patiently addressed, which I deeply appreciated. Thus, I took up the role of Data Engineer at Ascendion.


 
One of my early assignments, which made me remarkably popular with everyone, was on a huge project. Our team was set up such that I was the only one working from offshore. One of the problems the client had was with transferring data to the pipeline. My role on the project was as a developer, and I automated the loading of the client's data directly into the pipeline. This was a fun and liberating opportunity for me, as it also involved working with Machine Learning.
 
This was new territory for me, and a lot of learning happened along the way. I learnt how to train a model and practised it for some time. Soon I got comfortable and started deploying the code piece by piece to make it ready for testing. Once the test was successful, we got on a call with the client, where I presented my work and even got praised for it. This was my 'Aha' moment.
 

A challenge called Code Migration

Today, I am a senior data engineer, currently working on a cloud migration project for a leading HealthTech firm. One of the problems we are solving for them is migrating their code from on-premise to the cloud. Pipeline jobs were not completing on time and the environment was not scalable, which was driving capex and opex overruns, so we decided to migrate the code. By the end of this project, we plan to have all the pipeline jobs running successfully without any delay, and to make their infrastructure cost effective and secure.
 
This project requires migrating the data and its code from on-premise Hadoop to GCP. Here, the entire source code must be converted from Hive to BigQuery, with the necessary function changes made for a successful migration.
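To give a feel for the kind of function changes involved, here is a minimal, illustrative sketch of a script that rewrites a couple of Hive functions into their BigQuery equivalents. It is not the project's actual tooling; the function mappings and the sample query are assumptions chosen purely for the example.

```python
import re

# Illustrative mapping of a few Hive functions to BigQuery Standard SQL
# equivalents. A real migration handles many more functions and edge cases.
HIVE_TO_BQ = {
    r"\bNVL\s*\(": "IFNULL(",    # NVL(x, y) -> IFNULL(x, y)
    r"\bINSTR\s*\(": "STRPOS(",  # INSTR(str, substr) -> STRPOS(str, substr)
}


def translate_query(hive_sql: str) -> str:
    """Apply simple pattern-based rewrites to a Hive query string."""
    bq_sql = hive_sql
    for pattern, replacement in HIVE_TO_BQ.items():
        bq_sql = re.sub(pattern, replacement, bq_sql, flags=re.IGNORECASE)
    return bq_sql


# Hypothetical query, for demonstration only.
print(translate_query("SELECT NVL(order_total, 0) FROM orders"))
# -> SELECT IFNULL(order_total, 0) FROM orders
```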
 
I started by automating a few modules and processes as per client requirements. Some modules are in R, a completely new language that I learnt; they automate moving the on-premise code to GCP. Others are developed using Python and SQL scripts. I have also built an automation script that selects the tables and provides all the table names automatically, which was a manual process earlier. These automations helped me and my team become quicker and more efficient at our tasks. To automate all the SQL scripts, I built Airflow DAGs so that the pipeline can run successfully, as sketched below.
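As a rough illustration of that last point, here is a minimal sketch of an Airflow DAG that runs a BigQuery SQL statement on a schedule. The DAG name, query, schedule, and location are placeholders I have assumed for the example, not the actual pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical query; the real pipelines would run the migrated SQL scripts.
SQL = "SELECT COUNT(*) AS row_count FROM `my-project.analytics.orders`"

with DAG(
    dag_id="bq_sql_migration_pipeline",  # illustrative DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_sql = BigQueryInsertJobOperator(
        task_id="run_migrated_sql",
        configuration={
            "query": {
                "query": SQL,
                "useLegacySql": False,
            }
        },
        location="US",  # assumed BigQuery location
    )
```

In practice, one such task could be generated per migrated SQL script, which is how a batch of scripts can be orchestrated from a single DAG.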
 
While this code is being developed, I am also tasked with migrating 10-20 SQL scripts a day, with a total of around 70 to finish in each sprint.
 
Apart from client work, I also organize events at GCP circle meetings. I have organized events like a Code-A-Thon, a KBC quiz on GCP, Two Truths and a Lie, a personality quiz, and many such ice-breaking activities, which have helped the team bond really well.
 

Why Ascendion and I are a match

Ascendion is where careers align with ambitions. As a GCP data engineer here, I not only work with the right tech stacks but also have endless opportunities to learn and dive into emerging technologies. My career goals are perfectly in sync with my role at Ascendion, and the support from my manager fuels my drive to reach new heights. What’s truly refreshing is our work culture, where productivity takes precedence over micromanagement, and hard work is celebrated with recognition. Plus, the remote work policy has been a blessing for balancing work and life. Ascendion empowers you to lead and innovate, like my role in the data engineering and GCP circles.
 
If you’re ready for a journey filled with cutting-edge possibilities, Ascendion is the place where it all starts. Join us and master your field.

Want to work with us?

Explore more roles