27 Oct

Specialist Hadoop Linux Jobs Vacancy in Bank Toronto

Position
Specialist Hadoop Linux
Company
Bank
Location
Toronto ON
Opening
27 Oct, 2017

Bank in Toronto urgently requires the following position: Specialist Hadoop Linux. Please read this job advertisement carefully before applying. The employer has set out specific qualification, experience and skill requirements. Does your career history fit these requirements? Ensure you understand the role you are applying for and that it is suited to your skills and qualifications.

Follow the online directions, complete all the necessary fields, and provide all relevant information so your application is submitted correctly. When you click the 'Apply this Job' button (opens in a new window), you will be taken to the online application form. Here you will be asked to provide personal and contact details, respond to employment-related questions, and show how you meet the key selection criteria.

Specialist Hadoop Linux Jobs Vacancy in Bank Toronto Job Details:

The Analyst - Production and Systems Support is part of the Information Excellence team within Corporate Segment Technology Solutions. Reporting to the Manager, Production and Systems Support, this role is responsible for providing production and system support and data infrastructure management for the Information Excellence team. This position must work proactively and effectively with the CSTS teams, ITS and other technology and business partners to provide support and expertise in data management, infrastructure support, and Big Data / Hadoop initiatives.

Job Requirements

  • Provide Production Support Services to project initiatives as assigned, initially from the list of new projects and change management requests.
  • Contribute in the preparation of system implementation plans and support procedures.
  • Participate and contribute to the on-going development of the team by sharing information, knowledge, expertise and lessons learned on a regular basis.
  • Ensure high quality work and maintenance of standards within own area of responsibility.
  • Provide on-going communications on project status and systems restrictions.
  • Ensure that Audit and TRMIS policies and procedures are implemented and respected.
  • Provide ongoing system management and support for Big Data / Hadoop platforms; some after-hours and weekend support will be required.
  • Other project support initiatives as assigned.

Additional Information
  • Post-secondary degree in Computer Science, Engineering or a similar field preferred.
  • A minimum of 1 to 2 years of financial institution and/or credit card experience.
  • A minimum of 1 to 3 years of experience in Information Management and Data Warehousing.
  • Tangible experience with information technology and data and systems management; knowledge of Unix/Linux is fundamental, including Hadoop administration and utilities, Java and virtual environments; knowledge of Tibco is preferred but not mandatory.
  • Demonstrated history of being self-motivated, energetic, results-driven, and executing with excellence
  • Strong analytical and problem-solving skills; able to communicate complex issues in easily understandable terms.
  • Strong verbal, written, presentation & communication skills.
  • Demonstrated ability to manage and deliver on multiple projects on time.
  • Sound understanding of Linux/Unix, especially RHEL, and experience tuning Linux systems.
  • Familiar with Hadoop development tools & utilities (Pig, Hive, Java, Sqoop, Flume, etc.) and CDH.
  • Working knowledge of Change Management protocols and TD Incident and Problem system (SDM) is an asset.
  • Demonstrated experience in process analytics and process flow documentation.
  • Comprehensive knowledge and experience with system support, managing data solutions and development methodologies (Agile, Lean, Waterfall).
  • Working experience with source code repository systems and data lineage standards is essential, including the ability to use revision control systems such as Git.
  • Proactive, organized, excellent analytical and problem solving skills.
  • Work well independently as well as within a team.
  • Experience with orchestration workflows, high-level configuration management concepts and implementations, and automation tools such as Puppet or SaltStack.
  • Ability to use a scripting language such as Python or Perl is an asset.
  • Understanding of networking, firewalls and load balancing.
  • Working experience with software such as DHCPD, NTP, BIND and SSSD is an asset.
  • Experience with Kerberos and PAM.
  • Familiarity with operating and/or developing Java applications is an asset.
  • Working knowledge of various business software products such as MS-Office, MS-Project, MS-Visio and others.
Make your mark. Join a dynamic team. Explore new ideas. This is your opportunity to impact the future of banking technology in areas and ways you've never imagined (at a bank)! Visit techjobs.td.com to learn more.

Inclusiveness
At TD, we are committed to fostering an inclusive, accessible environment, where all employees and customers feel valued, respected and supported. We are dedicated to building a workforce that reflects the diversity of our customers and communities in which we live and serve. If you require an accommodation for the recruitment/interview process (including alternate formats of materials, or accessible meeting rooms or other accommodation), please let us know and we will work with you to meet your needs.

Province/State (Primary)
Ontario

City (Primary)
Toronto


Jobs Vacancy Related to Specialist Hadoop Linux:

22 Jan

Senior Linux Engineering Specialist Jobs Vacancy in Rbc Toronto. Rbc Toronto is opening a great career opportunity and jobs vacancy for the Senior Linux Engineering Specialist position. This vacancy will be open to new applicants starting 22 Jan, 2018. As the Senior Linux Engineering Specialist you will deliver release work-streams, component analysis, design, and build on Hosting and Cloud Engineering...


17 Jan

Technical Specialist Hadoop Jobs Vacancy in Manulife Waterloo. Manulife Waterloo is opening a great career opportunity and jobs vacancy for the Technical Specialist Hadoop position. This vacancy will be open to new applicants starting 17 Jan, 2018. As a Hadoop Technical Specialist, this role will consist of working with several aspects of Hadoop technologies. Certified in Hadoop (i.e....


18 Jan

Specialist Big Data Devops Jobs Vacancy in Bell Montréal. Bell Montréal is opening a great career opportunity and jobs vacancy for the Specialist Big Data Devops position. This vacancy will be open to new applicants starting 18 Jan, 2018. Knowledge of Linux and network concepts. Experience working with the Hadoop ecosystem (HDFS, Impala, Solr, Kafka, Spark and others) - high volume, high velocity data...


16 Jan

Big Data Developer Jobs Vacancy in Rbc Toronto. Rbc Toronto is opening a great career opportunity and jobs vacancy for the Big Data Developer position. This vacancy will be open to new applicants starting 16 Jan, 2018. As the lead of our IT Hadoop solutions: Be a Big Data (Hadoop) Specialist that still loves coding. Advanced knowledge of Hadoop ecosystems, its architecture...


17 Jan

Data Engineering Multiple Opportunities Summer Jobs Vacancy in Manulife Toronto. Manulife Toronto is opening a great career opportunity and jobs vacancy for the Data Engineering Multiple Opportunities Summer position. This vacancy will be open to new applicants starting 17 Jan, 2018. (1) Good understanding of the Hadoop big data technology stack, such as Map-reduce, Yarn, Hive, Hbase, etc. The platform will be built with open source Hadoop...