Energetic IT professional with 16+ years of experience in the IT industry, well-versed in communication and organisation. Seeks solutions to problems and applies extensive analytical knowledge to findings. Adept at multitasking, leading group discussions, and managing projects. Talented professional with practical experience and technical skill in data engineering and data analysis. Resolves issues using strategic problem-solving and logical thinking for targeted solutions. Proactive and self-motivated for productive independent and team working.
Overview
15 years of professional experience
6 years of post-secondary education
1 certification
Work history
Senior Data Engineer / Senior Data Analyst
Air Arabia
Sharjah, United Arab Emirates
10.2022 - 03.2024
Led the data engineering team
Managed Air Arabia internal and external carrier data processing
Processed structured, semi-structured, and unstructured data
Developed data-as-a-product offerings
Carried out data migration using SnapLogic
Implemented the data lake and built Qlik Sense reporting on top of Snowflake
Delivered 100+ reports in Qlik Sense
Analysed and fixed issues in the Finance, Revenue, and On-Time Performance reports
Monitored raised incidents
Owned data governance and data quality
Implemented data science machine learning models using Dataiku
Handled work order approvals and generated purchase orders
Interacted with vendors such as Qlik Replicate to gain insights
Handled data protection and data masking
Supported the reservation, ticketing, and DCS systems
Trained the team through the Learning & Development programme
Ran Agile PI planning, sprint planning, retrospective, pre-PI planning, and backlog grooming sessions
Conducted performance reviews for team members and managed costs for each project
Gathered requirements, planned projects, and delivered them ahead of deadlines
Performed duties in accordance with applicable standards, policies, and regulatory guidelines to promote safe working environment
Served customers and followed outlined steps of service
Participated in continuous improvement by generating suggestions, engaging in problem-solving activities to support teamwork
Actively listened to customers, handled concerns quickly, and escalated major issues to the head of IT
Handled almost 3 TB of raw data daily from sources such as MARS, MACS, FQT, Core, and other applications
Enriched, transformed, and loaded structured and unstructured data into HDFS and HBase tables using Scala, shell scripting, and Oozie workflows
Built the order domain using streaming with Kafka to generate real-time orders
Migrated the application from Oracle to HBase and developed incremental batch jobs in Scala to bring HBase collections into the big data environment
Worked on multiple analysis and data monetization projects
Worked closely with the data science team to generate meaningful data using complex SQL/HQL queries required for use cases
Performed data quality and governance checks before and after deploying Spark batch jobs for new data source systems
Wrote multiple shell scripts to automate daily tasks and stream near-real-time data
Used Scala, Java, Hive, HBase, Spark SQL, Snowflake, Oracle, shell scripting, Kafka, Oozie workflows, Microsoft Azure, and SnapLogic for the order domain projects
Implemented data governance and data quality in the order domain
Handled multiple UAT and SIT environments
Created the data modelling pipeline through to the serving (modelled) layer using Spark, Scala, Hive, and SQL
Ingested data into the raw, decomposed, and modelled layers
Ingested data into HBase tables (NoSQL)
Developed the PNR (booking), MACS (check-in), TKT (ticketing), and EMD data pipelines
Performed data transformation using Spark and Scala
Prepared and executed unit test cases and uploaded them to Confluence
Fixed bugs where identified
Pulled the latest code from Git
Created CI/CD (continuous integration / continuous delivery) pipelines in Jenkins to promote programs across development, staging, and production environments
Placed the latest JAR in Nexus/Jenkins for CI/CD
Updated and tracked issues in JIRA
Performed data ingestion testing from DBMS to the Hadoop cluster using Sqoop
Validated data across several data lake layers (e.g. raw, branded, modelled, and serving)
Worked on on-premises to Azure cloud migration projects
Imported and exported data between HDFS and relational database systems using Sqoop