By joining Valtech you will become part of a unique digital consultancy. We bring together design, data science and software engineering professionals to deliver new products for our customers across the UK.
This diverse set of capabilities means that no two projects are the same. We undertake transformative projects with a collaborative approach, working with our customers and employing Agile and Lean principles to support decision making, build understanding and ultimately deliver the right service to users.
We are passionate about delivery and curious about innovation in digital. Our teams seek to fail fast, making experimentation safe and accelerating learning. As communities we are sociable and encourage learning, using regular internal events, meet-ups and external events to engage with new ideas at all levels.
How we deliver
There are some fundamental principles to the way that we deliver. While we are pragmatic in our approach to any specific project, if you don’t share these values, we are unlikely to be a good fit for you:
Agile methods have always emphasised the merit of teams with all the skills necessary to deliver on users’ needs. To work effectively in a multi-disciplinary team it helps to be genuinely interested in the work of those in other roles. We value the flexibility to pick up slack across other areas of capability, as well as expertise and specialisation.
We find that diverse groups working together come to better solutions. In user testing this approach builds empathy in the team. In design we collaborate to test ideas and develop alternatives. As a team we work together to improve our approach.
This applies to product development as much as it applies to individuals. Our projects are designed to engage with uncertainty, creating opportunities to learn. We use experiments, prototypes and hypotheses to learn about user behaviour, design algorithms and select technologies. Our communities of practice provide a space for us to improve and learn from peers, while a personal training budget gives each individual the opportunity to grow and develop.
When driving our projects we focus on outcomes. Knowing how a project will be deemed a success by users and customers enables creativity and innovation.
Hands-on data architect with experience of delivering large-scale data analytics applications on the Hadoop stack (or equivalent).
- Create data architectures and designs for new systems primarily focused on big data and data analytics.
- Take a hands-on approach: implement the designs yourself as well as leading data developers to implement them.
- Perform detailed analysis of business problems, data analytics requirements and technical environments and use this in designing the solution.
- Develop database solutions by designing the proposed system: define the physical database structure and functional capabilities, security, back-up, and recovery specifications.
- Create logical and physical data models.
- Design data flows and ETL/ELT.
- Understand and communicate the pros and cons of different technologies and approaches.
- Make recommendations and informed design decisions.
- Apply patterns for storing, processing and transforming data.
- Apply patterns for building views of the same data.
- Understand existing data architecture and data flow.
- Manage client expectations throughout the project lifecycle, from initiation through to delivery.
- Communicate strongly, including representing the company in industry standards organizations or industry technical forums.
Agile and Lean methodologist
Valtech believes in the principles of iterative delivery and the continued validation of the platform’s value proposition. While you may not have an exclusively agile background, you will believe in these Lean principles.
- Ideally 8+ years of technical solutions experience, including 3+ years in a combination of relevant Big Data/Analytics areas such as:
- Enterprise Business Intelligence and Analytics (e.g., SAP, Oracle, Teradata).
- Hadoop (MapReduce, Spark) and other industry Big Data frameworks, including NoSQL databases such as MongoDB, HBase and columnar stores.
- Underlying infrastructure for Big Data Solutions (Clustered/Distributed Computing, Storage, Data Centre Networking).
- Predictive analytics
- Data Visualization
- Unstructured data
Cloud/Big Data/NoSQL: AWS S3, EMR, Hadoop (Cloudera/Hortonworks), Hive, Pig, Spark, Yarn, HDFS, MapReduce, Sqoop, Flume, Elasticsearch
NoSQL databases: MongoDB, HBase, Columnar, Neo4j, Cassandra
Databases: MS SQL, Oracle, Teradata, Kognitio, Actian (ex ParAccel), Sybase
Data Integration/ETL/Reporting/OLAP: MS IS/RS/AS, Informatica, Ab Initio, Business Objects, MicroStrategy, QlikView, Pentaho, DataStage
Programming: C, Perl, Python, Scala, Java
OS: Unix, Linux, Windows