No. of Resources :- 01
Location :- Remote
Notice Period :- 15 to 30 Days
Email :- hr@staffconnect.ae
Industry :- Engineering Services
WhatsApp :- +971 529421270
Job Description
- Interpret data, analyse results using statistical techniques, and provide ongoing reports.
- Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality.
- Cover common data engineering responsibilities, including those of a data architect, ETL developer, data quality engineer, data security engineer, and lead data engineer.
- Presenting the company with warehousing options based on its storage needs.
- Identify, analyse, and interpret trends or patterns in complex data sets.
- Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems.
- Designing and coding the data warehousing system to desired company specifications.
- Conducting preliminary testing of the warehousing environment before data is extracted.
- Extracting data and transferring it into the new warehousing environment (see the illustrative sketch after this list).
- Testing the new storage system once all the data has been transferred.
- Troubleshooting any issues that may arise.
- Providing maintenance support.
- Researching, designing, implementing, and managing software programs.
- Co-ordinating monthly roadmap releases to push enhanced or new code to production.
- Modifying software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
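Purely as an illustration of the extract, clean, and load responsibilities above (not part of the role requirements), the sketch below shows a minimal PySpark job; the file paths, column names, and cleaning rules are hypothetical assumptions.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("warehouse_load_sketch").getOrCreate()

# Extract: read a raw export from a hypothetical source system.
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: apply basic data-quality rules before loading.
clean = (
    orders
    .dropDuplicates(["order_id"])                        # remove duplicate rows
    .filter(F.col("order_amount").cast("double") > 0)    # drop invalid amounts
    .withColumn("order_date", F.to_date("order_date"))   # normalise the date column
)

# Load: write the cleaned data into the warehouse staging area as Parquet,
# partitioned by date for efficient downstream queries.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/warehouse/staging/orders")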
Area of Expertise
- Minimum 10 years of experience in Business Intelligence and Data Engineering.
- Proven working experience as a Lead Data Engineer.
- Experience with ETL (Extract, Transform, Load) processes.
- Familiarity with data warehousing and cloud platforms (e.g., AWS, Azure, or Google Cloud).
- Experience designing dimensional models for OLTP, OLAP, and Data Mart systems.
- Experience in building cubes and data models.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with big data technologies and large-scale data processing.
- Experience with version control systems such as Git.
- Analyse data to identify trends, patterns, and insights that can inform decision-making and process improvements. This may involve working with databases, spreadsheets, and data visualization tools.
- Proficient in database design and in writing T-SQL stored procedures, functions, and views.
- Expertise in OLTP/OLAP system study and analysis, E-R modelling, and multi-dimensional modelling using star schema and snowflake schema techniques (see the illustrative sketch after this list).
- Hands-on experience with Informatica PowerCenter components: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Familiarity with database technologies such as MySQL, Oracle, and SQL Server is a must.
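As an illustration only of the star-schema and dimensional-modelling experience requested above, the sketch below performs a typical OLAP-style rollup over a hypothetical fact table and its dimension tables in PySpark; the table names, surrogate keys, and columns are assumptions, not details of the actual role.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

# Hypothetical star schema: one fact table joined to two dimension tables via surrogate keys.
fact_sales = spark.read.parquet("/warehouse/fact_sales")      # grain: one row per sale
dim_date = spark.read.parquet("/warehouse/dim_date")          # key: date_key
dim_product = spark.read.parquet("/warehouse/dim_product")    # key: product_key

# Typical rollup: join facts to dimensions, then aggregate by month and product category.
monthly_sales = (
    fact_sales
    .join(dim_date, "date_key")
    .join(dim_product, "product_key")
    .groupBy("year", "month", "product_category")
    .agg(F.sum("sales_amount").alias("total_sales"))
)

monthly_sales.show()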
Technical Skills (minimum one tool from each category is a must)
- Analytical Tools: SSAS, Azure Synapse, Snowflake, Apache Kylin, Kyligence, Databricks, Data Factory
- Storage & Database: SQL Server, Oracle, MySQL, Hive
- Reporting Tools: Power BI, Oracle BI, Tableau, Qlik
- ETL Tools: Apache Spark, Talend, PySpark
Qualification
- Bachelor’s degree in Engineering (Computer Science, Information Technology, or related field)
Communication
- Excellent written and presentation communication skills in English
Core Competencies
- Teamwork & Collaboration: Pragmatic and supportive team player, helping build collaborative relationships with colleagues across Kent Group.
- Communication: Emotionally intelligent individual with the ability to listen, empathize, and understand the impact of nonverbal communication. An engaging individual who can facilitate and present to large groups of stakeholders, including at C-suite level.
- Critical Thinking: Creative problem solver using reasoning to analyse issues, make decisions, and solve problems.
- Self-Starter: Proactive, “can-do” attitude with a commitment to go the extra mile to meet committed timelines, and a willingness to fill gaps identified on the program.