Job Summary
Big Data Specialist (Hadoop) (59602725)
Location: Toronto, ON Category: Content/Document Management
Job Type: Temporary/Contract Reference: CA_EN_6_17450_59602725
Posted: November 15, 2017 Salary: N/A

Modis, on behalf of our client, is looking for a Big Data Specialist.
 
Big Data Specialist (Hadoop) (59602725)
 
Contract Duration: 6 months
Location: Toronto, ON
 
Responsibilities:
 
• Participate in all aspects of Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support
• Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis
• Ensure Big Data practices integrate into overall data architectures and data management principles (e.g. data governance, data security, metadata, data quality)
• Create formal written deliverables and other documentation, and ensure designs, code, and documentation are aligned with enterprise direction, principles, and standards
• Train and mentor teams in the use of the fundamental components in the Hadoop stack
• Assist in the development of comprehensive and strategic business cases used at management and executive levels for funding and scoping decisions on Big Data solutions
• Troubleshoot production issues within the Hadoop environment
• Tune the performance of Hadoop processes and applications
• Proven experience as a Hadoop Developer/Analyst in Business Intelligence and data management production support is required
• Strong communication skills, technology awareness, and the ability to work with senior technology leaders are a must
• Good knowledge of Agile methodology and the Scrum process
• Delivery of high-quality work, on time and with little supervision
 
Required Qualifications/Experience
 
• Bachelor's degree in Computer Science, Management Information Systems, or Computer Information Systems is required
• Minimum of 4 years building Java applications
• Minimum of 2 years building and coding applications using Hadoop components such as HDFS, HBase, Hive, Sqoop, and Flume
• Minimum of 2 years coding in Java, Scala/Spark, Python, Pig, Hadoop Streaming, and HiveQL
• Minimum of 4 years' understanding of traditional ETL tools and data warehousing architecture
• Strong personal leadership and collaborative skills, combined with comprehensive, practical experience and knowledge in end-to-end delivery of Big Data solutions.
• Experience in Exadata and other RDBMS is a plus.
• Proficiency in SQL/HiveQL is required
• Hands-on Linux/Unix expertise and scripting skills are required
• Strong in-memory database and Apache Hadoop distribution knowledge (e.g. HDFS, MapReduce, Hive, Pig, Flume, Oozie, Spark)
• Experience and proficiency in coding skills relevant for Big Data (e.g. Java, Scala, Python, Perl, SQL, Pig, HiveQL)
• Proficiency with SQL, NoSQL, relational database design and methods
• Deep understanding of techniques used in creating and serving schemas at the time of consumption
• Ability to identify requirements for applying design patterns such as self-documenting data vs. schema-on-read
• Proven leadership in the delivery of multiple end-to-end projects using Hadoop as the data platform
 
Please note that candidates must be legally eligible to work in Canada. Your resume will not be forwarded to any of our clients without your explicit permission. We thank all applicants, but only suitable applicants with the above qualifications clearly identified in their resumes will be contacted. Modis Canada is an Equal Opportunity Employer.
 
 
We’re at the center of exceptional IT connections. Every day, Modis connects premier IT professionals to great opportunities at leading companies. Put our connections to work for you!
 



