~/majji-kishore/work $ cat career.json

[CAREER] Work Experience
$ grep "experience" /var/log/career.log

A timeline of my professional journey and the challenges I've tackled in the world of software development.

Software Engineer

@NielsenIQ

Full-time

$ git log --since="Jul 2023" --until="Present"  # 2 yrs

Pune, Maharashtra, India

  • Architected and modularized a 20,000-line codebase into reusable components, improving code maintainability and enabling faster feature development across multiple data pipeline projects.
  • Led critical infrastructure upgrade by migrating Databricks runtime from 9.1 to 14.3, implementing performance optimizations and security enhancements that supported increased data processing demands.
  • Engineered high-performance Spark jobs using broadcast variables, Delta Lake optimization, and selective column retrieval strategies, reducing critical ETL pipeline runtime from 12 hours to 4 hours (a 67% reduction).
  • Designed and implemented metadata management system for dimensional and fact tables, reducing UI dataset loading times by 70% and improving user experience for business analysts.
  • Built custom dimension table (Dim11) with regex-based fact table metadata extraction, enabling selective fact loading based on client requests and further optimizing UI performance.
  • Delivered production-critical system stability improvements through proactive monitoring and optimization, reducing job failure rates from 10+ monthly incidents to 1-2 occurrences (90% reduction).
  • Integrated Data Science models into data pipelines to handle missing store data through predictive sales adjustments, enabling accurate forecasting for incomplete datasets and unblocking new client onboarding processes.
  • Automated Root Cause Analysis reporting via Databricks API, enabling data-driven pipeline enhancements and faster issue resolution.
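
The regex-driven metadata extraction behind selective fact loading can be sketched roughly as below. This is a minimal illustration, not the production system: the fact-table naming convention, the pattern, and the function names are all hypothetical.

```python
import re

# Hypothetical naming convention: fact_<client>_<measure>_<yyyymm>
FACT_NAME_RE = re.compile(
    r"^fact_(?P<client>[a-z0-9]+)_(?P<measure>[a-z_]+)_(?P<period>\d{6})$"
)

def extract_fact_metadata(table_name: str):
    """Parse client, measure, and period out of a fact-table name."""
    m = FACT_NAME_RE.match(table_name)
    return m.groupdict() if m else None

def select_facts(tables, client):
    """Keep only the fact tables relevant to one client's request,
    so the UI loads a small subset instead of every fact table."""
    return [t for t in tables
            if (meta := extract_fact_metadata(t)) and meta["client"] == client]

tables = ["fact_acme_sales_units_202401", "fact_other_revenue_202401", "dim_store"]
print(select_facts(tables, "acme"))  # ['fact_acme_sales_units_202401']
```

Filtering on parsed metadata before any data is read is what makes the loading selective: irrelevant fact tables are excluded by name alone.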

Azure Databricks, Python, PySpark, Apache Airflow, SQL, Delta Lake, Data Engineering, ETL Pipelines, Performance Optimization

SDE-Intern

@NielsenIQ

Internship

$ git log --since="Feb 2023" --until="May 2023"  # 3 mos

Pune, Maharashtra, India

  • Gained hands-on experience with Big Data technologies, including Hadoop and Spark, to process and analyze large datasets for retail analytics.
  • Identified and resolved critical bugs in production data pipelines, improving data quality and pipeline reliability.
  • Developed and implemented EQ_volume feature to establish standardized units for comparing product performance across different categories, enhancing accuracy and efficiency of retail data analysis.
  • Collaborated with senior engineers to optimize data processing workflows and improve overall system performance.
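
The EQ_volume idea — normalizing pack sizes into a common equivalized unit so products can be compared across categories — can be sketched as follows. The category names and conversion factors here are purely illustrative, not the actual feature's values.

```python
# Illustrative equivalization factors: how many "standard units" one
# package unit of each category represents (values are hypothetical).
EQ_FACTORS = {
    "beverage_ml": 1 / 1000,   # millilitres -> litres
    "snack_g": 1 / 100,        # grams -> 100 g packs
}

def eq_volume(category: str, pack_size: float, units_sold: int) -> float:
    """Convert raw sales volume into equivalized units so products in
    different categories can be compared on one scale."""
    return pack_size * EQ_FACTORS[category] * units_sold

# 500 ml bottles and 250 g snack bags become comparable equivalized volumes
print(eq_volume("beverage_ml", 500, 10))  # 5.0 (litres)
print(eq_volume("snack_g", 250, 4))       # 10.0 (hundred-gram packs)
```

The point of the feature is the shared scale: once every category maps to equivalized units, "volume sold" is meaningful across beverages, snacks, and anything else.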

Python, PySpark, Apache Airflow, SQL, Hadoop, Big Data

SDE-Intern

@Mensa Brands

Internship

$ git log --since="May 2022" --until="Jul 2022"  # 2 mos

Bengaluru, Karnataka, India

  • Fixed critical bugs in a Python-based data scraping tool that was failing to extract product information from e-commerce sites, improving the data-collection success rate.
  • Learned AWS services (EC2, S3, Lambda, Step Functions, Redshift) and implemented CI/CD pipelines using GitHub Actions for automated deployment of scraping scripts.
  • Evaluated BI tools by building sample dashboards in Metabase and Zoho Analytics using company sales data, presenting findings to the data team for tool selection.
  • Designed interactive dashboards using AWS QuickSight for data visualization and analysis, providing intuitive interfaces for business stakeholders.
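
Extracting product fields from e-commerce markup, as the scraping tool did, can be sketched with the standard library alone. This is not the actual Mensa Brands tool: the CSS class names and sample markup below are invented for illustration.

```python
from html.parser import HTMLParser

class ProductParser(HTMLParser):
    """Collect text from elements whose class marks a product field
    (class names are hypothetical)."""
    FIELDS = {"product-title": "title", "product-price": "price"}

    def __init__(self):
        super().__init__()
        self.product = {}
        self._current = None

    def handle_starttag(self, tag, attrs):
        # Remember which product field, if any, this element carries.
        self._current = self.FIELDS.get(dict(attrs).get("class", ""))

    def handle_data(self, data):
        if self._current and data.strip():
            self.product[self._current] = data.strip()
            self._current = None

html = '<div class="product-title">Steel Bottle 1L</div><span class="product-price">499</span>'
parser = ProductParser()
parser.feed(html)
print(parser.product)  # {'title': 'Steel Bottle 1L', 'price': '499'}
```

Parsing defensively — tolerating missing fields instead of assuming every element exists — is typically what separates a scraper that fails on real pages from one that degrades gracefully.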

Amazon Web Services (AWS), Python, Metabase, Data Visualization, CI/CD, Business Intelligence

Backend-Intern

@NavGurukul

Internship

$ git log --since="Nov 2021" --until="Mar 2022"  # 4 mos

Remote

  • Upgraded Node.js to a newer version and migrated all npm packages, resolving dependency conflicts and ensuring application compatibility across the entire codebase.
  • Implemented slot-booking feature that automated the student joining process, reducing manual intervention and improving user experience for educational platform.
  • Developed core APIs for the main dashboard, handling data retrieval and user interactions to support student management workflows.
  • Built finite state machine to manage all student states and transitions, ensuring proper workflow management throughout the student lifecycle in the educational system.
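
The production service was Node.js/Hapi.js; the Python sketch below only illustrates the finite-state-machine idea of constraining student lifecycle transitions. The state names and transition table are hypothetical.

```python
# Hypothetical student lifecycle states and the transitions allowed between them.
TRANSITIONS = {
    "applied":     {"screened", "rejected"},
    "screened":    {"slot_booked", "rejected"},
    "slot_booked": {"enrolled", "no_show"},
    "enrolled":    {"graduated", "dropped_out"},
}

class StudentStateMachine:
    """Reject any state change that is not an explicitly allowed transition."""

    def __init__(self, state: str = "applied"):
        self.state = state

    def transition(self, new_state: str) -> str:
        if new_state not in TRANSITIONS.get(self.state, set()):
            raise ValueError(f"cannot move from {self.state!r} to {new_state!r}")
        self.state = new_state
        return self.state

sm = StudentStateMachine()
sm.transition("screened")
sm.transition("slot_booked")
print(sm.state)  # slot_booked
```

Centralizing the transition table in one place is the design payoff: invalid jumps (say, "applied" straight to "graduated") fail loudly instead of silently corrupting a student record.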

Node.js, Hapi.js, PostgreSQL, Express.js, API Development, Database Design