As passionate drivers of change, MetiStream is looking for the same in our employees. We want self-motivated individuals with innate curiosity and an enthusiasm for innovative problem-solving. We advocate for growth and learning and challenge our employees to experiment, be bold, and stretch beyond what they know now. Like us, you want to make an impact, are relentless about quality, and care about the customer. If this sounds like you, please contact us at recruiting@metistream.com and send us your resume.


Culture


Here at MetiStream, we believe that a positive work-life balance leads to a more productive team. We live by a work-hard, play-hard mentality. What does that mean?



Available Positions


Senior Architect

You have:
  • Significant professional IT experience with emphasis in Java design and development, as well as current architecture experience
  • 1 – 4 years’ experience with cloud architectures and large data processing / Hadoop environments; ability to set up multi-node Hadoop clusters and write MapReduce jobs
  • Experience with Kafka, Kinesis, or other similar CEP / data ingest / messaging solutions
  • Experience with Apache Spark / Spark Streaming and associated Spark components
  • Understanding of event processing / real-time streaming concepts and principles
  • Ability to integrate and orchestrate Hadoop Big Data ecosystem / Open Source technologies
  • Familiarity with DW/ETL/BI and visualization solutions and implementations
  • Understanding of various data storage concepts (in-memory, NoSQL, columnar, etc.)
  • Excellent written and verbal communication and interpersonal skills
  • Prior experience at a top IT consulting company and/or software product company
  • BS/MS in Computer Science or related field
Additional Experience/Skills:
  • Familiarity with system administration and scripting tools such as bash, Python and/or Perl
  • Contributed to an Open Source software project, spoke at Big Data conferences, participated in an OS user/dev group, and/or created technical blogs
Architect

You have:
  • ~5 – 8+ years' professional IT experience with emphasis in Java, Scala (or other functional programming language), Python or C++ design and development
  • 3+ years’ experience with cloud architectures and large data processing / Hadoop environments; ability to set up multi-node Hadoop clusters and write MapReduce jobs
  • Familiarity with Apache Spark / Spark Streaming and associated Spark components
  • Ability to integrate and orchestrate Hadoop Big Data ecosystem / Open Source technologies
  • Familiarity with DW/ETL/BI solutions and implementations
  • Familiarity with system administration and scripting tools such as bash, Python and/or Perl
  • Expertise with system administration of Hadoop ecosystem components including installation, upgrades, and security
  • Understanding of various data storage concepts (in-memory, NoSQL, columnar, etc.)
  • Specialized skills in either data visualization or analytics
  • Excellent written and verbal communication and interpersonal skills; experience speaking at conferences, meetups, or other technical events
  • Prior experience at a top IT consulting company and/or software product company
  • BS/MS in Computer Science or related field
  • Expertise in an industry domain area (finance, healthcare, government, etc.)
Additional Experience/Skills (nice to have):
  • Experience with Kafka, Kinesis, or other similar CEP / data ingest / messaging solutions
  • Understanding of event processing / real-time streaming concepts and principles
  • Contributed to an Open Source software project, participated in an OS user/dev group, and/or created technical blogs
Senior Engineer

You have:
  • ~5+ years’ professional IT experience with emphasis in Java, Scala (or other functional programming language), Python or C++ design and development
  • 1+ years’ experience with cloud architectures and large data processing / Hadoop environments; ability to set up multi-node Hadoop clusters and write MapReduce jobs
  • Ability to integrate Hadoop Big Data ecosystem / Open Source technologies
  • Familiarity with DW/ETL/BI solutions and implementations
  • Familiarity with system administration and scripting tools such as bash, Python and/or Perl
  • Expertise with system administration of Hadoop ecosystem components including installation, upgrades, and security
  • Understanding of various data storage concepts (in-memory, NoSQL, columnar, etc.)
  • Specialized skills in Machine Learning / analytics or data visualization
  • Excellent written and verbal communication and interpersonal skills
  • Prior experience at a top IT consulting company and/or software product company
  • BS/MS in Computer Science or related field
Additional Experience/Skills:
  • Experience with Kafka, Kinesis, or other similar CEP / data ingest / messaging solutions
  • Familiarity with Apache Spark / Spark Streaming and associated Spark components
  • Understanding of event processing / real-time streaming concepts and principles
  • Contributed to an Open Source software project, participated in an OS user/dev group, and/or created technical blogs
Engineer

You have:
  • ~1 – 5+ years' professional IT experience with emphasis in Java, Scala (or other functional programming language), Python or C++ design and development
  • Experience with cloud architectures and large data processing / Hadoop environments; ability to set up multi-node Hadoop clusters and write MapReduce jobs
  • Ability to integrate Hadoop Big Data ecosystem / Open Source technologies
  • Familiarity with DW/ETL/BI solutions and implementations
  • Familiarity with system administration and scripting tools such as bash, Python and/or Perl
  • Understanding of various data storage concepts (in-memory, NoSQL, columnar, etc.)
  • Specialized skills in either data visualization or analytics
  • Excellent written and verbal communication and interpersonal skills
  • Prior experience at a top IT consulting company and/or software product company
  • BS/MS in Computer Science or related field
Additional Experience/Skills (nice to have):
  • Familiarity with Apache Spark / Spark Streaming and associated Spark components
  • Understanding of event processing / real-time streaming concepts and principles
  • Expertise with system administration of Hadoop ecosystem components including installation, upgrades, and security
  • Specialized skills in SAS or R
  • Contributed to an Open Source software project, participated in an OS user/dev group, and/or created technical blogs