KPMG: Urgent hiring for Freshers | Engineer | Software Engineer | 0.8–4 yrs | Bangalore

KPMG International Cooperative is a multinational professional services network and one of the Big Four accounting organizations. Headquartered in Amstelveen, the Netherlands, KPMG is a network of firms in 147 countries with over 219,000 employees, offering three lines of services: financial audit, tax, and advisory.

KGS : MC : Associate Consultant / Consultant : Big Data Software Engineer (AWS) – (200000HH)

Description:
Work in teams to perform data ingestion from disparate sources, develop complex event processing on the Hadoop framework, review data quality and data definitions, and perform data cleansing and data management tasks. A sketch of the ingest-and-cleanse workflow follows below.
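
For illustration only, here is a minimal PySpark sketch of the kind of ingest-and-cleanse task the description names. The S3 paths and column names ("order_id", "amount") are hypothetical placeholders, not part of this posting.

```python
# Minimal PySpark sketch of an ingest-and-cleanse task (illustrative only).
# Bucket paths and column names ("order_id", "amount") are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-and-cleanse").getOrCreate()

# Ingest from one disparate source (CSV on S3; JDBC or Kafka read similarly).
raw = spark.read.option("header", "true").csv("s3://example-bucket/raw/orders/")

# Data-quality review: drop rows missing the key, coerce types, apply a domain check.
cleansed = (
    raw.dropna(subset=["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") >= 0)
)

# Persist an analysis-ready copy in a columnar format.
cleansed.write.mode("overwrite").parquet("s3://example-bucket/cleansed/orders/")
```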

Role Summary Description:

· Support client engagements focused on Data and Advanced Business Analytics solution development across diverse domains.

· Develop Spark applications and MapReduce jobs.

· Develop streaming/real-time complex event processing on the Hadoop framework (see the streaming sketch after this list).

· Interface with different databases (SQL, NoSQL).

· Manage data quality by reviewing data for errors introduced by data input, data transfer, or storage limitations.

· Perform data management to ensure data definitions, lineage, and sources are suitable for analysis.

· Work in a multidisciplinary team to understand available data sources, needs, and downstream uses.
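
For the streaming/real-time bullet above, here is a minimal Spark Structured Streaming sketch. The Kafka broker, topic name, and window sizes are assumptions for illustration, and the job needs the spark-sql-kafka connector package on the classpath.

```python
# Minimal Spark Structured Streaming sketch of real-time event processing.
# Broker, topic, and window sizes are assumptions; requires the
# spark-sql-kafka connector package on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-processing").getOrCreate()

# Read a live event stream from Kafka.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
)

# A simple windowed aggregation: count events per key over 1-minute windows,
# tolerating events that arrive up to 2 minutes late.
counts = (
    events.selectExpr("CAST(key AS STRING) AS key", "timestamp")
          .withWatermark("timestamp", "2 minutes")
          .groupBy(F.window("timestamp", "1 minute"), "key")
          .count()
)

# Emit updated counts to the console; a real job would write to a durable sink.
query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```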

Functional/Technical Skills:

· Bachelor's or higher degree in computer science, information systems, or business.

· Bring 1–4 years of software engineering and D&A solution development experience.

· Experience with AWS EMR, Redshift, AWS Data Pipeline, AWS Glue, and PySpark, plus Pig/Hive on any Hadoop distribution (HDP/CDH/MapR); a Glue job sketch follows this list.

· Proficient in SQL, in addition to one or more modern programming languages such as Java, Scala, or Python.

· Experience working with databases (SQL, NoSQL).

· Experience with data validation, cleansing, and munging.

· Experience with at least one cloud provider: AWS, Azure, or GCP.
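
As referenced above, here is a minimal AWS Glue job skeleton in PySpark, sketched only to illustrate the Glue tooling named in the skills list. The catalog database, table name, and output path are hypothetical placeholders.

```python
# Minimal AWS Glue job skeleton in PySpark (illustrative only).
# The catalog database/table and output path are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog as a DynamicFrame.
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="example_table"
)

# Convert to a Spark DataFrame for SQL-style transforms, then write out.
df = source.toDF().dropDuplicates()
df.write.mode("overwrite").parquet("s3://example-bucket/curated/")

job.commit()
```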

https://lnkd.in/fe66Z-Z