Mohammad Ashfaque

Data Lake Architect
Dubai, UAE

Summary

Data lake architect with 10 years' experience in big-data solution design, digital transformation, big-data analytics, data warehousing, business intelligence, architecture, data governance, and IoT and artificial intelligence implementation. Drives innovation and provides strategic direction for the organization's data-driven initiatives.

Overview

  • 10 years of professional experience
  • Azure Solutions Architect Certified
  • Azure Security Certified
  • IoT Graduate from SoftwareAG

Work History

Data Lake Architect

DEWA, Dubai Electricity and Water Authority
Dubai, UAE
08.2021 - Current
  • Implemented analytical solutions across areas such as demand forecasting, water leak detection, suspicious meter point identification, customer analytics, and IoT analytics.
  • Integrated and extracted data from IoT terminals, electric loads, PV panels, and sensors.
  • Configured a centralized data repository for diverse data sources.
  • Designed and implemented big-data analytics solutions for multiple utility use cases.
  • Translated business requirements from stakeholders for the team of data scientists.
  • Worked closely with business units on digital transformation in the utility domain.
  • Performed advanced data analysis to extract meaningful insights and patterns from large, complex datasets.
  • Implemented a master data management strategy to ensure data standardization, quality, and governance and to provide a single source of truth.
  • Integrated BI tools (Power BI, Tableau, and Superset) with the data lake.
  • Established database access and usage policies and procedures.
  • Enforced security and integrity controls to prevent breaches, loss and damage.
  • Executed tests at different stages of design and production to validate, debug and improve code.
  • Organized planning, connection and deployment for data warehouse systems.
  • Researched and integrated new database management tools to meet changing needs.
  • Worked successfully under Agile methodologies to deliver on-time projects.
  • Diagrammed database designs and wrote descriptive documentation.
  • Consulted with users and management to formulate guiding principles.

Senior Bigdata and Cloud Engineer

Clairvoyant India
Pune, India
03.2017 - 08.2021
  • Responsible for implementation and support of the enterprise Hadoop environment.
  • Contributed to Big Data as a Service (BDaaS), building new Hadoop clusters and guiding Hadoop deployment decisions.
  • Configured and deployed secured Hadoop clusters in production environments.
  • Configured CDH, HDP, and HDF clusters and upgraded them as requirements evolved.
  • Configured Kerberos on Hadoop clusters and role-based security on databases using Sentry.
  • Configured data encryption on HDFS.
  • Configured TLS/SSL on Hadoop clusters.
  • Set up new Hadoop users and AD/LDAP/Kerberos authentication models.
  • Commissioned and decommissioned nodes on running clusters.
  • Set up new Hadoop users, created Kerberos principals, and validated their access.
  • Designed Hadoop architecture and configured components such as HDFS, YARN, MapReduce (MR1 and MR2), Tez, Sqoop, Flume, Pig, Hive, ZooKeeper, Oozie, Ranger, Knox, Sentry, Kafka, Storm, Solr, and HBase.
  • Secured Hadoop clusters at the enterprise level: MIT and AD Kerberos for authentication; HDFS ACLs, Sentry, and Ranger for authorization; Knox as the client gateway; and SSL/TLS to secure HTTP traffic.
  • Imported and exported data between HDFS and relational database systems using Apache Sqoop.

Bigdata Consultant

Saama Technology
08.2016 - 03.2017
  • Responsible for implementation and support of the Hadoop cluster environment (CDH and HDP).
  • Installed, configured, supported, and managed Hadoop ecosystem services (Hive, Impala, ZooKeeper, Oozie, Hue, etc.).
  • Implemented high-end HDP security with Kerberos, Ranger, and Knox integrated with LDAP.
  • Administered and monitored development and production HDP and CDH clusters using Cloudera Manager and Ambari.
  • Set up and managed NameNode high availability with JournalNodes and ZooKeeper.
  • Implemented automatic failover with ZooKeeper and the ZooKeeper Failover Controller.
  • Commissioned new nodes and decommissioned failed nodes in the cluster.
  • Administered Kerberos security on Hadoop: created keytab files and principals and set appropriate permissions per requirements.
  • Imported and exported data between HDFS and relational database systems/mainframes using Sqoop.
  • Monitored Hadoop cluster job performance, capacity planning, and the health of cluster services, tracking all running Hadoop jobs.
  • Analyzed multiple sources of structured and unstructured data to propose and design data architectures for scalability and high availability.
  • Identified data patterns and designed and implemented solutions to match.

Bigdata Engineer

Locuz Enterprise Solutions Pvt. Ltd.
03.2015 - 07.2016
  • Responsible for implementation and support of the Hadoop cluster environment (CDH, HDP, and Pivotal).
  • Installed, configured, supported, and managed Hadoop ecosystem services (Hive, Impala, ZooKeeper, Oozie, Hue, etc.).
  • Implemented high-end HDP security with Kerberos, Ranger, and Knox.
  • Administered and monitored development and production HDP and CDH clusters using Cloudera Manager and Ambari.
  • Set up and managed NameNode high availability with JournalNodes and ZooKeeper.
  • Implemented automatic failover with ZooKeeper and the ZooKeeper Failover Controller.
  • Commissioned new nodes and decommissioned failed nodes in the cluster.
  • Administered Kerberos security on Hadoop: created keytab files and principals and set appropriate permissions per requirements.
  • Imported and exported data between HDFS and relational database systems using Sqoop.
  • Monitored Hadoop cluster job performance, capacity planning, and the health of cluster services, tracking all running Hadoop jobs.

Associate Hadoop Admin

Probity Virtue
01.2013 - 03.2015
  • Commissioned new nodes and decommissioned failed nodes in the cluster.
  • Set up new Hadoop users, created Kerberos principals, and validated their access.
  • Troubleshot installation and configuration issues.
  • Monitored Hadoop cluster job performance, capacity planning, and the health of cluster services, tracking all running Hadoop jobs.
  • Installed, configured, and troubleshot Linux servers.
  • Installed, configured, and troubleshot Apache Hadoop and its ecosystem.
  • Created local OS users and AD users.
  • Monitored Hadoop jobs, checked cluster health, and generated reports.
  • Monitored system performance during peak hours and recommended upgrades when bottlenecks occurred.
  • Managed users and groups (creation, deletion, and modification).
  • Managed filesystems and monitored processes.

Education

Master of Computer Science (MCS) - Computer Science

Pune University
India
2013

Skills

  • Big Data and Ecosystem: Apache Hadoop, including Hive, Tez, Impala, Spark, Pig, HBase, Sqoop, Kafka, NiFi, HDFS, and YARN
  • Databases: Oracle, MySQL
  • Operating Systems: Windows and Linux
  • Programming Languages: Java, Python, and SQL
  • Certifications
  • Data security
  • Solution development
  • SQL
  • Cloud Architecture Design
  • Big Data Management
  • Data lake Architecture
  • Database policymaking
  • Capacity planning
  • Python
  • Database architecture
  • System Architecture Design
  • SQL programming
  • Unix
  • Teamwork and Collaboration
  • Structure designs
  • Business process mapping
  • Optimizing and performance tuning
  • Information security
  • Cost estimation and budgeting

Languages

English
Fluent
Arabic
Beginner (A1)
Urdu
Native
Hindi
Native

Accomplishments

    Published research papers at ITIKD 2023, Bahrain

    Trained professional in Bigdata technology

Certification


  • Microsoft Certified: Architecting Microsoft Azure Solutions
  • Microsoft Certified: Azure Security Engineer Associate
  • Certified Big Data Administrator
  • MongoDB Certified Engineer


Timeline

Data Lake Architect

DEWA, Dubai Electricity and Water Authority
08.2021 - Current

Senior Bigdata and Cloud Engineer

Clairvoyant India
03.2017 - 08.2021

Bigdata Consultant

Saama Technology
08.2016 - 03.2017

Bigdata Engineer

Locuz Enterprise Solutions Pvt. Ltd.
03.2015 - 07.2016

Associate Hadoop Admin

01.2013 - 03.2015

Master of Computer Science (MCS) - Computer Science

Pune University