Job Description
Roles and Responsibilities
Developer with hands-on experience building end-to-end applications. This involves designing and developing data extraction/loading, processing, integration, and quality layers, as well as the UI.
• Write code delivering functional stories, test cases, infrastructure automation scripts, security scripts, monitoring tools, and other related use cases.
• Deep expertise in some of Java/Python, React/Angular/Node.js, and Oracle/SQL Server, and strong analytical skills for working with unstructured datasets.
• Knowledge and practical experience building applications on Amazon Web Services (AWS) or other public cloud platforms such as GCP/Azure.
• Deep knowledge of AWS and its services, primarily EC2, VPC, IAM, serverless offerings, RDS, Route 53 (R53), and CloudFront.
• Deep knowledge of UNIX system architecture.
• Strong grasp of core networking concepts.
• Deep understanding of serverless architecture and AWS's offerings in this area.
• Strong command of Terraform and/or CloudFormation: core concepts and hands-on template writing.
• Hands-on experience with Unix scripting and Python.
• Expertise with CI/CD pipelines and a few DevOps tools such as Jenkins/Ansible; understanding of containers (Docker/Kubernetes) and cloud build/deploy.
• Build and optimize data pipelines, architectures, and datasets supporting data transformation, data structures, metadata, dependency, and workload management.
• Working knowledge of APIs, caching, and messaging.
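To illustrate the kind of caching knowledge in scope, here is a minimal standard-library sketch; `fetch_rate` and the currency-pair data are hypothetical stand-ins for a slow API or database call, not part of any system named above.

```python
from functools import lru_cache

calls = {"n": 0}  # counts how often the "expensive" backend is actually hit

@lru_cache(maxsize=128)
def fetch_rate(ccy_pair):
    """Hypothetical expensive lookup; results are memoized by argument."""
    calls["n"] += 1  # stands in for a slow API/database round trip
    return {"EURUSD": 1.08, "GBPUSD": 1.27}.get(ccy_pair)

fetch_rate("EURUSD")
fetch_rate("EURUSD")  # second call is served from the cache
```

After both calls, the backend counter has only incremented once; the same pattern applies whether the cache sits in-process (as here) or in an external store like Redis.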
• Experience delivering software with agile methodologies, applying TDD and pair-programming best practices to ensure quality-certified deliverables.
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and continuously identify opportunities for improvement.
Experience delivering on data-related non-functional requirements, such as:
• Hands-on experience dealing with large volumes of historical data across markets/geographies.
• Manipulating, processing, and extracting value from large, disconnected datasets.
• Building watertight data-quality gates on investment management data.
• Generic handling of standard business scenarios such as missing data, holidays, and out-of-tolerance errors.
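A data-quality gate of the kind described above can be sketched in a few lines of Python; the rule set, tolerance threshold, and names here are illustrative assumptions, not taken from any specific investment-data platform.

```python
from dataclasses import dataclass, field

@dataclass
class GateResult:
    """Outcome of a data-quality gate run."""
    passed: bool
    issues: list = field(default_factory=list)

def quality_gate(prices, tolerance=0.20):
    """Flag missing values and day-over-day moves beyond `tolerance`.

    `prices` is an ordered series of floats, with None marking missing
    data (e.g. a market holiday where no quote is expected).
    """
    issues = []
    prev = None
    for i, px in enumerate(prices):
        if px is None:
            issues.append((i, "missing"))  # holiday / gap handling
            continue
        if prev is not None and abs(px - prev) / prev > tolerance:
            issues.append((i, "out_of_tolerance"))  # suspicious jump
        prev = px
    return GateResult(passed=not issues, issues=issues)
```

For example, `quality_gate([100.0, 101.0, None, 150.0])` fails, flagging the gap at index 2 and the >20% jump at index 3, while a smooth series passes cleanly.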
Good to have knowledge of / past experience with:
• Message queuing, stream processing, and highly scalable data stores in the cloud.
• Extreme Programming (XP), pairing, mobbing, and other collaborative development practices.
• Snowflake: writing SQL queries against Snowflake, developing Unix/Python scripts for Extract, Load, and Transform (ELT), and Snowpipe for bulk loading.
• Big data stacks, either in the cloud or on-premises; data analytics and data science/machine learning/quantitative implementation.
• Functional understanding of capital markets and investment data.
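The Extract, Load, and Transform pattern mentioned above can be sketched generically in Python and SQL; to keep the example self-contained it targets SQLite rather than Snowflake, and the table and column names are illustrative assumptions.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def load(conn, rows):
    """Load: land the raw rows into a staging table, untransformed."""
    conn.execute("CREATE TABLE IF NOT EXISTS stg_trades (symbol TEXT, qty INTEGER)")
    conn.executemany(
        "INSERT INTO stg_trades VALUES (:symbol, :qty)",
        [{"symbol": r["symbol"], "qty": int(r["qty"])} for r in rows],
    )

def transform(conn):
    """Transform: aggregate staged rows with SQL inside the database."""
    conn.execute(
        "CREATE TABLE positions AS "
        "SELECT symbol, SUM(qty) AS qty FROM stg_trades GROUP BY symbol"
    )
    return dict(conn.execute("SELECT symbol, qty FROM positions ORDER BY symbol"))

conn = sqlite3.connect(":memory:")
load(conn, extract("symbol,qty\nAAPL,10\nAAPL,5\nMSFT,7\n"))
positions = transform(conn)  # {'AAPL': 15, 'MSFT': 7}
```

The design choice mirrors warehouse-style ELT: raw data lands first, and transformation happens in SQL where the data lives, rather than in the ingestion script.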