Data & Analytics Architect - NJ (Hybrid)

New Jersey - Long-Term Contract

The Data and Analytics Architect is responsible for the overall design of the data, platforms, and applications (batch/real-time, cloud/hybrid) within our Information Management area (including Business Intelligence & Advanced Analytics).
This will include the definition of standards, best practices, policies and procedures, integration patterns, data models, high-level data flows, and data services.
This Architect will work in collaboration with Application Development teams, Enterprise Architecture, and business partners to establish data and analytics architecture patterns, evaluate new tools and technologies, provide recommendations, support POCs, and assist with implementation/adoption as appropriate.
Job Description:
Should have 5-10 years of experience in Data & AI.
Should have experience with Databricks, Spark, Python, and ETL (Informatica); AWS experience is an added advantage.
Azure experience is a must, along with Azure Data Factory (ADF).
Should have experience providing solutions to the customer; pre-sales solutioning experience is an advantage.
Strong communication skills, with the ability to explain concepts with clarity.
Must have working experience with a US customer base.
Job Description:
A Data Engineering Solution Architect is responsible for designing, implementing, and overseeing data engineering solutions that enable organizations to effectively collect, process, and manage their data assets.
They play a pivotal role in creating robust and scalable data architectures to support business objectives, making data accessible and usable for various analytical and operational purposes.
This role requires a deep understanding of data engineering best practices, technologies, and trends, coupled with strong communication and leadership skills.
Key Skills:
Data Architecture Design:
Develop and design scalable, efficient, and reliable data architectures that cater to the organization's data processing and storage needs.
Big Data Technologies:
Proficiency in utilizing and integrating various big data technologies such as Hadoop, Spark, Kafka, and others to build data pipelines and processing systems.
ETL/ELT Frameworks:
Expertise in designing Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes, ensuring data quality, integrity, and timeliness (see the illustrative sketch after this skills list).
Database Management:
In-depth knowledge of database technologies, including relational databases (e.g., SQL Server, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
Cloud Platforms:
Experience with cloud platforms such as AWS, Azure, or Google Cloud, designing and implementing data solutions using cloud-based services.
Data Modeling:
Proficient in data modeling concepts, dimensional modeling, and schema design to ensure efficient data storage and retrieval.
Data Governance:
Establish data governance frameworks, ensuring compliance with regulations and industry standards while maintaining data security and privacy.
Real-time Processing:
Familiarity with real-time data processing techniques and streaming platforms (e.g., Apache Kafka, Apache Flink) for handling data as it arrives.
Data Warehousing:
Knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift) for creating efficient and accessible repositories of structured data. Experience with modern data platforms: at least one of Databricks, Spark, Snowflake, or Synapse.
Programming and Scripting:
Strong programming skills in languages such as Python, Java, Scala, or others, along with proficiency in scripting for automation.
Performance Optimization:
Ability to optimize data pipelines and processing for performance, scalability, and cost-effectiveness.
Collaboration:
Effective communication and collaboration skills to work with cross-functional teams, including data scientists, analysts, and business stakeholders.
Problem Solving:
Analytical mindset to identify and solve complex data engineering challenges while adapting to evolving business requirements.
Leadership:
Ability to lead and guide technical teams in implementing data engineering solutions, providing technical direction and mentorship.
Continuous Learning:
Stay updated with the latest trends, tools, and technologies in the data engineering and analytics space.
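For illustration only (not part of the original requirements): a minimal PySpark sketch of the kind of ETL pipeline work described above, assuming hypothetical paths and column names.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical app name, paths, and columns for illustration; not taken from the posting.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files from a landing zone.
raw = spark.read.option("header", True).csv("s3://example-bucket/landing/orders/")

# Transform: enforce types, drop bad records, derive a partition column.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet for downstream analytics and BI.
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3://example-bucket/curated/orders/")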
Kind regards,
Neha Gupta
Jade Business Services
9300 John Hickman Parkway, Suite 401, Frisco, TX 75035

Recommended Skills: Adaptability, Amazon Redshift, Amazon Web Services, Apache Hadoop, Apache Kafka, Apache Spark

Estimated Salary: $20 to $28 per hour based on qualifications.
