San Antonio, TX
Gathi Analytics is a relentlessly client-focused group that builds ground-breaking data solutions for Smart Cities, Payment Services, Healthcare, Finance, and much more. Gathi Analytics leverages proprietary frameworks and methodologies to deliver solutions faster, without compromising quality. We help our clients gain data insights they never thought possible and understand their data assets in ways they never could before. If this sounds like you, please read on.
About the Data Architect / Data Modeler Role
This role is responsible for leading and building relational and dimensional models for the Integration, Semantic, and Audit layers. You will assess existing data models and quickly propose a target-state data architecture that suits the client's needs across consumption patterns, from the data lake through the integration/harmonization layer to the semantic/consumption layer data models (proficiency in Kimball methodology required). You will implement large-scale industry reference data models for the banking and insurance domains, with the functional knowledge needed to implement the subject-area models; design and build conceptual, logical, and physical data models and create physical data structures (DDLs); and create source-to-target data mapping rules, including business transformation rules, in discussion with business users. The role requires extensive experience in solution design, data architecture, data modeling, data integration, and business analytics design and development.
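To give candidates a concrete sense of the dimensional-modeling work described above, here is a minimal sketch of a Kimball-style star schema expressed as DDL. The table and column names are purely illustrative, not a client's actual model:

```python
import sqlite3

# A hypothetical star schema: one additive fact table joined to dimension
# tables via integer surrogate keys. Names are illustrative only.
ddl = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT NOT NULL,         -- natural/business key from source
    segment      TEXT
);

CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,      -- e.g. 20240131
    full_date TEXT NOT NULL,
    month     INTEGER,
    year      INTEGER
);

CREATE TABLE fact_transaction (
    transaction_key INTEGER PRIMARY KEY,
    customer_key    INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key        INTEGER NOT NULL REFERENCES dim_date(date_key),
    amount          REAL NOT NULL       -- additive measure
);
"""

# Validate the DDL by executing it against an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
```

In a real engagement the physical structures would target the client's platform (Teradata, Redshift, etc.); SQLite is used here only so the DDL can be checked end to end.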
- Lead large data-modeling redesign initiatives, covering assessment of existing models, proposing target-state data models, and helping with conceptual designs for each type of data model proposed.
- Experience leading large data architecture and modeling initiatives leveraging industry reference data models from IBM, Teradata, or other market leaders.
- Lead and build relational and dimensional models for the Integration, Semantic, and Audit layers (Kimball proficiency required).
- Build industry-based reference data models for large banks and insurance companies.
- Design and build conceptual, logical, and physical data models, and create physical data structures (DDLs).
- Create source-to-target data mapping rules, including business transformation rules, in discussion with business users.
- Extensive experience managing cross-functional teams and leading the gathering of business requirements, analysis, design, and implementation of Data Warehouse, Business Intelligence, ERP, and Big Data (Hadoop) solutions.
- Expert in agile data warehousing and enterprise architecture using industry-model framework methodology, and in data modeling using E/R and dimensional models (conceptual, logical, and physical) for large-scale data warehouse and data lake implementations.
- Expertise in designing conceptual, subject-area logical, and physical data models using Erwin and/or ER/Studio.
- Experience with data-model design frameworks and techniques pioneered by Ralph Kimball and Bill Inmon.
- Experience with industry financial and/or insurance data models: the Teradata FSLDM and IBM BDW data-model frameworks, and the insurance ACORD framework and capability models.
- Experience with Master Data Management, metadata management solutions, and data governance policies and procedures.
- Worked on big data projects involving Hadoop technologies and data lake reference architectures and frameworks.
- Optional: experience with Cassandra data modeling and NoSQL architecture, policies, and procedures.
- Experience understanding business requirements and translating them into detailed designs and technical specifications.
- A plus: exposure to software engineering techniques such as microservices and event-driven architecture, using event-streaming technologies like Kafka.
- Worked closely with cross-functional teams to coordinate effectively, manage business-user expectations, and adhere to organizational architecture standards and policies.
- Experience with data lake frameworks, data ingestion, and processing: analyze various sources and develop the code to ingest data into a Hadoop data lake.
- Good experience implementing Big Data/Hadoop solutions for operational reporting and analytics using Hive, Spark, Spark SQL, Sqoop, and other Hadoop-ecosystem projects.
- Good exposure to and experience with cloud-based data stores, particularly AWS big data services such as Redshift and S3.
- Expert in agile software development and release management using Scrum. Hands-on experience with agile development tools such as JIRA.
- Manage stakeholder expectations through continuous engagement, including status reports and proactive communication about new opportunities and issues.
- Design solution architecture; conduct analysis and development of applications and proofs of concept.
- Act as subject matter expert on the effective use of analytical and BI methodologies, particularly decision trees, time-series data, forecasting, data visualization, and dashboard design.
- Build design patterns, data architectures, and cloud technology stacks (AWS/Azure).
- Learn continuously and keep track of the latest business and technical developments.
- Demonstrate the passion to lead and bring value to the architecture portfolio.
- 14+ years of IT experience, with a focus on data modeling and data solutioning.
- Deep functional knowledge of retail banking is a key requirement.
- Experience working with industry reference data models, primarily financial data models.
- 8+ years of experience with SQL (ability to read, write, and manipulate queries).
- 8+ years of strong solution-architecture experience across a variety of IT positions; the most recent 5+ years must be in solution, systems, or enterprise architecture with large-scale, complex IT systems.
- 5+ years of experience working in an Agile/Scrum environment.
- Experience working with Fortune 500 companies.
- Data warehouse architectures, including ETL design, staging, and transformations.
- Star schemas, cubes, and dimensional modeling.
- Database platform layouts and configurations
- Demonstrated ability to work in a fast-paced, highly technical environment
- Excellent communications skills, both written and verbal
- Troubleshooting complex system issues, handling multiple tasks simultaneously, and translating user requirements into technical specifications.
- Conduct current-state assessments and map opportunities and organizational goals to a target-state architecture and roadmap.
- Work creatively and analytically in a problem-solving, fast-paced agile development environment.
- Experience with and knowledge of MapReduce, HBase, Pig, MongoDB, Cassandra, Impala, Oozie, Mahout, Flume, ZooKeeper/Sqoop, and Hive.
- Strong database fundamentals including SQL, performance, and schema design
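The source-to-target mapping responsibility listed above can be sketched minimally as a set of mapping entries, each pairing a target field with a business transformation rule. All field names and rules below are hypothetical, chosen only to illustrate the pattern:

```python
# Hypothetical source-to-target mapping: each source field maps to a
# target field plus a transformation rule agreed with business users.
mapping = {
    "cust_nm": ("customer_name", str.strip),                     # trim whitespace
    "txn_amt": ("amount_usd", lambda v: round(float(v), 2)),     # cast and round
    "txn_dt":  ("transaction_date", lambda v: v.replace("/", "-")),  # normalize date separator
}

def apply_mapping(source_row, mapping):
    """Project a source record onto target fields, applying each rule."""
    return {target: rule(source_row[src])
            for src, (target, rule) in mapping.items()}

source_row = {"cust_nm": "  Acme Corp ", "txn_amt": "1234.567", "txn_dt": "2024/01/31"}
target_row = apply_mapping(source_row, mapping)
print(target_row)
```

In practice these rules live in a mapping specification document (or a metadata-driven ETL tool) rather than inline code; the point is that each target column traces back to a source column and a documented transformation.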
If this role sounds like you, we invite you to apply. Our recruiting team will be in touch shortly with qualified candidates regarding next steps.
Gathi Analytics is an equal opportunity employer and provides opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity, marital status, age, genetic information, disability, veteran-status, or any other characteristic protected under applicable Federal, state, or local law.