28 December 2016
This vacancy is currently inactive. If you would like to submit your CV for consideration, we will notify you when the position becomes active again.
Symphony Solutions – Why So Special?
Symphony Solutions is an international Dutch IT company with offices in Ukraine, Macedonia, and Poland. We have been on the market for more than 10 years and maintain a unique culture across all our locations.
At Symphony Solutions we have removed the barriers created by traditional organizational structures and embraced organic principles and a high degree of self-management. We believe this kind of organization is the optimal environment to attract and retain the best talent, develop people fully, and leverage their potential.
We have a unique employee selection process in which colleagues choose colleagues. This approach eliminates potential conflicts and ensures honest, transparent relationships with clients and within the team. Symphony Solutions strives to offer the best price/performance and to be the easiest company to do business with.
Our client is a global IT data company based in Ottawa, Canada that specializes in designing, implementing, and managing systems that directly contribute to revenue and business success. They help companies adopt disruptive technologies such as advanced analytics, big data, cloud, databases, DevOps, and infrastructure management to advance innovation and increase agility.
As a Big Data Principal Architect, you will be part of our Customer Service Delivery team, which is entrusted with managing our global clients’ mission-critical systems and deploying cutting-edge technology, from blockchain to serverless and cloud databases, covering the full modern data and infrastructure stack. The team delivers a first-class, personalized level of service to clients across the financial, educational, media, retail, and other sectors.
Requirements:
- Experience architecting Big Data platforms using Apache Hadoop distributions such as Cloudera, Hortonworks, and MapR
- Strong knowledge of cloud data processing architectures
- A strong desire to work in a customer facing consulting type role
- In-depth knowledge of popular database and data warehouse technologies from Microsoft, Amazon, and/or Google (both big data and conventional RDBMS): Azure SQL Data Warehouse, Teradata, Redshift, BigQuery, Snowflake, etc.
- Demonstrated knowledge of data warehouse concepts
- Strong understanding of distributed systems architecture
- Fluency in at least two object-oriented languages, preferably Java and Python; familiarity with functional languages; proficiency in SQL
- Experience with Hive and Impala
- Experience with major distributed processing frameworks: MapReduce, Spark, Storm, Flink
- Experience with Kafka and Spark Streaming
- Ability to work with software engineering teams and understand complex development systems and patterns
- Ability to travel and work across North America frequently (occasionally on short notice) up to 50% of the time with some international travel also expected
Nice to have:
- Experience with BI platforms, reporting tools, data visualization products, ETL engines
- Real-time Hadoop query engines such as Dremel, Cloudera Impala, Facebook Presto, or Berkeley Spark/Shark
- DevOps experience with a good understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifactory, Maven, etc)
- Experience with Hbase
Responsibilities:
- Collaborating with and supporting sales teams in the pre-sales and account-management process from a technical perspective, remotely and on-site, for Big Data-focused consulting projects
- Travelling to client sites frequently to provide face-to-face consulting services, meetings, and presentations, as well as delivering work and value from home
- Conducting technical audits of clients’ existing architectures (infrastructure, performance, security, scalability, and more) and documenting best practices and recommendations
- Leading delivery and meeting client expectations for deliverable completion timelines
- Working with clients to help them develop data architecture models that enable effective service-oriented delivery
- Providing component or site-wide performance optimizations and capacity planning
- Mentoring and working with both internal and client development teams, delivering directly against an agile backlog to help engineer highly available, more manageable Big Data platforms