Big Data Analyst with statistics

The course first provides you with statistical expertise, enabling you to recognize correlations, verify observations and correctly classify measurement data. You will then acquire the knowledge needed to evaluate and visualize company data as a big data analyst. The course covers requirements for data and databases, data warehouse modelling and the ETL process, the Apache framework, data analysis and its programming, and visualization and management in the big data context. You will also learn how artificial intelligence is used in the workplace.
  • Certificates: Certificate "Statistics and data analysis"
    Certificate "Big Data Analyst"
  • Additional Certificates: Certificate "Data Engineer"
    Certificate "Data Analytics"
    Certificate "Big Data Specialist"
  • Examination: Practical project work with final presentations
  • Teaching Times: Full-time
    Monday to Friday from 8:30 a.m. to 3:35 p.m. (in weeks with public holidays from 8:30 a.m. to 5:10 p.m.)
  • Language of Instruction: German
  • Duration: 16 Weeks

Statistics and data analysis

Statistical basics (approx. 6 days)

Measurement theory basics (population, sample, sample types, measurement, scale levels)

Univariate descriptive statistics (frequency distributions, central measures, measures of dispersion, standardization, histograms, bar charts, pie charts, line charts, box plots)

Bivariate descriptive statistics (measures of correlation, correlation coefficients, crosstabs, scatter plots, grouped bar charts)

Basics of inductive inferential statistics (probability distributions, normal distribution, sampling distribution of the mean, significance test, null hypothesis test, significance level, effect size, parameter estimation, confidence intervals, error bar charts, power analysis, sample size)

Data preparation and data cleansing with suitable software

Descriptive analysis

Visualization of statistical results

AI-supported analysis and interpretation of statistical results
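
The descriptive measures listed above can be computed with Python's standard library alone. A minimal sketch, using invented measurement values purely for illustration:

```python
# Univariate descriptive statistics with the standard library;
# the sample values below are made up for illustration.
import statistics

measurements = [4.1, 4.5, 3.9, 4.8, 4.2, 4.6, 4.0, 4.4]

mean = statistics.mean(measurements)      # central measure
median = statistics.median(measurements)  # robust central measure
stdev = statistics.stdev(measurements)    # measure of dispersion (sample SD)

print(f"mean={mean:.2f}, median={median:.2f}, sd={stdev:.2f}")
```

In the course itself, dedicated statistics software is used; the snippet only shows the concepts behind the buttons.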


Methods for comparing two groups (approx. 5 days)

z-test, t-test for one sample

t-test for independent and related samples

Pretest-posttest designs with two groups

Supporting significance tests (Anderson-Darling test, Ryan-Joiner test, Levene test, Bonett test, significance test for correlations)

Nonparametric methods (Wilcoxon test, sign test, Mann-Whitney test)

Contingency analyses (binomial test, Fisher's exact test, chi-square test, cross-tabulations, measures of association)

Interpretation of test results

AI-supported interpretation of results
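
As an illustration of the group-comparison methods above, the following sketch runs a t-test for two independent samples with SciPy. The data are simulated solely to show the workflow; Welch's variant is used, which drops the equal-variance assumption that the Levene test would otherwise check:

```python
# Illustrative two-sample t-test; the samples are simulated, not course data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
group_a = rng.normal(loc=100, scale=15, size=30)  # e.g. control group
group_b = rng.normal(loc=110, scale=15, size=30)  # e.g. treatment group

# Welch's t-test for independent samples (no equal-variance assumption)
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

The p-value is then compared against the chosen significance level, exactly as in the null hypothesis testing framework introduced earlier.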


Basics of regression analysis (approx. 2 days)

Linear regression

Model interpretation

AI-supported model interpretation

Correlation analysis
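
A linear regression of the kind covered here can be sketched in a few lines with SciPy; the x/y values below are invented so that the points lie almost on a straight line:

```python
# Minimal linear regression sketch; data points are invented.
import numpy as np
from scipy import stats

x = np.array([1, 2, 3, 4, 5, 6], dtype=float)
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

# linregress returns slope, intercept, correlation coefficient r,
# p-value and standard error in one call
res = stats.linregress(x, y)
print(f"y = {res.slope:.2f}*x + {res.intercept:.2f}, r^2 = {res.rvalue**2:.3f}")
```

Model interpretation then means reading the slope as the change in y per unit of x and r² as the share of variance explained.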


Methods for comparing the means of several groups (approx. 3 days)

One-factorial and two-factorial analysis of variance (ANOVA)

Post-hoc analyses

Interpretation of group differences

Multi-factorial analysis of variance (general linear model)

Fixed, random, crossed and nested factors

Multiple comparison methods (Tukey-HSD, Dunnett, Games-Howell)

Interaction analysis

Power analysis for variance analyses
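
A one-factorial ANOVA, the simplest of the methods in this block, can be sketched as follows; the three groups are simulated only to demonstrate the call:

```python
# One-way ANOVA across three simulated groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
g1 = rng.normal(50, 5, 20)
g2 = rng.normal(55, 5, 20)
g3 = rng.normal(60, 5, 20)

# f_oneway tests the null hypothesis that all group means are equal
f_stat, p_value = stats.f_oneway(g1, g2, g3)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A significant F-statistic only says that *some* means differ; which pairs differ is what the post-hoc analyses (Tukey-HSD, Dunnett, Games-Howell) then determine.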


Introduction to Design of Experiments (DoE) (approx. 1 day)

Full factorial and partial factorial experimental designs
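
A full factorial design simply enumerates every combination of factor levels, which `itertools.product` from the standard library does directly. The factors and levels here are hypothetical:

```python
# Full factorial design: every combination of factor levels becomes a run.
from itertools import product

factors = {
    "temperature": [150, 180],  # two levels
    "pressure": [1.0, 1.5],     # two levels
    "catalyst": ["A", "B"],     # two levels
}

runs = list(product(*factors.values()))  # 2 * 2 * 2 = 8 experimental runs
for run in runs:
    print(dict(zip(factors, run)))
```

A partial (fractional) factorial design would deliberately run only a chosen subset of these combinations to save experiments.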


Project work (approx. 3 days)

To consolidate the content learned

Presentation of the project results

Data Engineer

Basics of Business Intelligence (approx. 2 days)

Fields of application, dimensions of a BI architecture

Basics of business intelligence, OLAP, OLTP, tasks of data engineers

Data Warehousing (DWH): handling and processing of structured, semi-structured and unstructured data


Requirements management (approx. 2 days)

Tasks, objectives and procedures in requirements analysis

Data modeling, introduction/modeling with ERM

Introduction/modeling in UML

- Class diagrams

- Use case analysis

- Activity diagrams


Artificial intelligence (AI) in the work process

Presentation of specific AI technologies and possible applications in the professional environment


Databases (approx. 3 days)

Basics of database systems

Architecture of database management systems

Application of RDBMS

Implementation of a data model in an RDBMS, normal forms

Practical and theoretical introduction to SQL

Limits of relational databases, csv, json
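
The practical introduction to SQL can be previewed with SQLite from Python's standard library; the table and data below are invented for illustration:

```python
# A short SQL round trip with SQLite (standard library, in-memory DB).
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
cur.executemany("INSERT INTO orders (customer, amount) VALUES (?, ?)",
                [("Meier", 120.0), ("Schulz", 80.5), ("Meier", 45.0)])

# Aggregate query: total order value per customer
rows = cur.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('Meier', 165.0), ('Schulz', 80.5)]
con.close()
```

The same SQL statements work, with minor dialect differences, on the full RDBMS products used in the course.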


Data Warehouse (approx. 4 days)

Star Schema

Data modeling

Creation of Star Schema in RDBMS

Snowflake Schema, basics, data modeling

Creation of Snowflake Schema in RDBMS

Galaxy Schema: Basics, data modeling

Slowly Changing Dimension tables, Types 1 to 5: restating, stacking, reorganizing, mini dimension and Type 5

Introduction to normal, causal, mini, monster, heterogeneous and sub-dimensions

Comparison of state-oriented and transaction-oriented fact tables

Fact tables, density and storage in the DWH


ETL (approx. 4 days)

Data Cleansing

- Null Values

- Preparation of data

- Harmonization of data

- Application of regular expressions

Data Understanding

- Data validation

- Statistical data analysis

Data protection, data security

Practical structure of ETL routes

Data Vault 2.0 basics: hubs, links, satellites, hash key, hash diff

Data Vault data modeling

Practical structure of a Data Vault model - Raw Vault, practical implementation of hash procedures
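
The data-cleansing steps at the start of this ETL block (null values, harmonization, regular expressions) can be sketched in plain Python; the records and phone formats are invented:

```python
# Tiny data-cleansing sketch: null handling plus a regular expression
# to harmonize phone numbers. Records and formats are invented.
import re

raw = [
    {"name": "Meier", "phone": "+49 721 123456"},
    {"name": "Schulz", "phone": "0721/654321"},
    {"name": None, "phone": "0721-999 888"},
]

def clean(record):
    # Null value handling: substitute a marker for missing names
    name = record["name"] or "UNKNOWN"
    # Harmonization: strip everything except digits and a leading '+'
    phone = re.sub(r"[^\d+]", "", record["phone"])
    return {"name": name, "phone": phone}

cleaned = [clean(r) for r in raw]
print(cleaned[2])  # {'name': 'UNKNOWN', 'phone': '0721999888'}
```

In a real ETL route these transformations sit between extraction from the source systems and loading into the warehouse or Data Vault.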


Project work (approx. 5 days)

To consolidate the content learned

Presentation of the project results

Data analytics

Introduction to data analysis (approx. 1 day)

CRISP-DM reference model

Data analytics workflows

Definition of artificial intelligence, machine learning, deep learning

Requirements and role in the company of data engineers, data scientists and data analysts


Review of Python basics (approx. 1 day)

Data types

Functions


Data analysis (approx. 3 days)

Central Python modules in the context of data analytics (NumPy, Pandas)

Process of data preparation

Data mining algorithms in Python
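
A minimal Pandas sketch of the data-preparation process named above: build a DataFrame, impute a missing value, aggregate. The data are invented:

```python
# Data preparation and aggregation with Pandas; data invented.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "region": ["North", "South", "North", "South"],
    "sales":  [100.0, np.nan, 150.0, 200.0],
})

# Simple imputation: replace the missing value with the column mean
df["sales"] = df["sales"].fillna(df["sales"].mean())

# Aggregate: total sales per region
totals = df.groupby("region")["sales"].sum()
print(totals)
```

NumPy supplies the fast array arithmetic underneath; Pandas adds the labeled, tabular layer that most analysis workflows operate on.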


Artificial intelligence (AI) in the work process

Presentation of specific AI technologies and possible applications in the professional environment


Data visualization (approx. 3 days)

Explorative data analysis

Insights

Data quality

Benefit analysis

Visualization with Python: Matplotlib, Seaborn, Plotly Express

Data storytelling
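
A minimal Matplotlib sketch for the visualization topics above, rendering a bar chart headlessly to a file; the figures are fictitious and the filename is arbitrary:

```python
# Bar chart with Matplotlib, rendered to a PNG file (no display needed).
from pathlib import Path

import matplotlib
matplotlib.use("Agg")  # headless backend: render to file only
import matplotlib.pyplot as plt

values = [12, 7, 15, 9]
labels = ["Q1", "Q2", "Q3", "Q4"]

fig, ax = plt.subplots()
ax.bar(labels, values)
ax.set_title("Revenue per quarter (fictitious data)")
ax.set_ylabel("Revenue (kEUR)")

out = Path("revenue.png")
fig.savefig(out)
plt.close(fig)
```

Seaborn builds statistical chart types on top of Matplotlib; Plotly Express produces interactive versions of the same charts.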


Data management (approx. 2 days)

Big data architectures

Relational databases with SQL

Comparison of SQL and NoSQL databases

Business Intelligence

Data protection in the context of data analysis


Data analysis in a big data context (approx. 1 day)

MapReduce approach

Spark

NoSQL


Dashboards (approx. 3 days)

Library: Dash

Structure and customizing of dashboards

Callbacks


Text Mining (approx. 1 day)

Data preprocessing, visualization

Library: SpaCy
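
The preprocessing steps that spaCy automates (tokenization, normalization, stop-word filtering) can be illustrated in plain Python; this stand-in only shows the idea, with an invented stop-word list:

```python
# Basic text-mining preprocessing: lowercase, tokenize, remove stop words.
# spaCy does all of this (and much more) out of the box.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "of", "and", "is"}

text = "The analysis of big data is the analysis of large volumes of data."

tokens = re.findall(r"[a-zäöüß]+", text.lower())     # tokenize
tokens = [t for t in tokens if t not in STOP_WORDS]  # remove stop words

freq = Counter(tokens)
print(freq.most_common(2))
```

Word frequencies like these are the raw material for word clouds and other text visualizations.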


Project work (approx. 5 days)

To consolidate the content learned

Presentation of the project results

Big Data Specialist

What is Big Data? (approx. 1 day)

Volume, Velocity, Variety, Value, Veracity

Opportunities and risks of large amounts of data

Differentiation: business intelligence, data analytics, data science

What is data mining?


Introduction to Apache Frameworks (approx. 2 days)

Big data solutions in the cloud

Data access patterns

Data storage


MapReduce (approx. 3 days)

MapReduce philosophy

Hadoop Cluster

Chaining of MapReduce jobs
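
The MapReduce philosophy can be shown in miniature in pure Python: a map phase emits (word, 1) pairs, a shuffle groups them by key, and a reduce phase sums per key. A real Hadoop job distributes exactly these phases across a cluster:

```python
# Word count in MapReduce style: map -> shuffle (sort/group) -> reduce.
from itertools import groupby
from operator import itemgetter

lines = ["big data big cluster", "data data cluster"]

# Map: emit (key, value) pairs
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle: group the pairs by key (groupby requires sorted input)
mapped.sort(key=itemgetter(0))

# Reduce: aggregate the values per key
counts = {key: sum(v for _, v in group)
          for key, group in groupby(mapped, key=itemgetter(0))}
print(counts)  # {'big': 2, 'cluster': 2, 'data': 3}
```

Chaining MapReduce jobs means feeding the reduced output of one job into the map phase of the next.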


Artificial intelligence (AI) in the work process

Presentation of specific AI technologies and possible applications in the professional environment


Components (approx. 3 days)

Brief presentation of various tools

Data transfer

YARN applications

Hadoop Java API

Apache Spark


NoSQL and HBase (approx. 3 days)

CAP theorem

ACID and BASE

Types of databases

HBase


Big Data Visualization (approx. 3 days)

Theories of visualization

Diagram selection

New types of diagrams

Tools for data visualization


Project work (approx. 5 days)

To consolidate the content learned

Presentation of the project results



Changes are possible, the course content is updated regularly.

Programming skills (ideally Python) and experience with databases (SQL) are required.

You are proficient in the processes involved in merging, preparing, enriching and forwarding data and understand big data analysis using basic Python programming, SQL and NoSQL database concepts. Knowledge of industry-specific software for processing and structuring large, unstructured data and visualizing it rounds off your knowledge.

After this course, you will be able to prepare data, evaluate it statistically and interpret the results professionally. You will be able to carry out statistical analyses independently using suitable software, visualize results and use methods for hypothesis testing and decision support. You will also be able to plan samples and carry out power analyses.

The course is aimed at people with a degree in computer science, business informatics, business administration, mathematics or comparable qualifications.

A systematic evaluation of data volumes is essential for companies in order to generate information about their own products and customer behavior. Against this backdrop, big data analysts are increasingly in demand across all industries.

A sound knowledge of statistics is a valuable additional qualification that is in great demand in industrial research and development, in drug development, in the supervision of medical studies, in finance and insurance, in information technology or in public administration.

Your meaningful certificate provides a detailed insight into the qualifications you have acquired and improves your career prospects.

Didactic concept

Your lecturers are highly qualified both professionally and didactically and will teach you from the first to the last day (no self-study system).

You will learn in effective small groups. The courses usually consist of 6 to 25 participants. The general lessons are supplemented by numerous practical exercises in all course modules. The practice phase is an important part of the course, as it is during this time that you process what you have just learned and gain confidence and routine in its application. The final section of the course involves a project, a case study or a final exam.

 

Virtual classroom alfaview®

Lessons take place using modern alfaview® video technology, either from the comfort of your own home or at our premises at the Bildungszentrum. All course participants can see each other face-to-face via alfaview®, communicate with each other in lip-sync voice quality and work on joint projects. Of course, you can also see and talk to your connected trainers live at any time, and you will be taught by your lecturers in real time for the entire duration of the course. The lessons are not e-learning, but real live face-to-face teaching via video technology.

 

The courses at alfatraining are funded by the Agentur für Arbeit and are certified in accordance with the AZAV accreditation regulation. When you submit a Bildungsgutschein or Aktivierungs- und Vermittlungsgutschein, the entire course costs are usually covered by your funding body.
Funding is also possible via the Europäischer Sozialfonds (ESF), the Deutsche Rentenversicherung (DRV) or regional funding programs. As a regular soldier, you have the option of attending further training courses via the Berufsförderungsdienst (BFD). Companies can also have their employees qualified with funding from the Agentur für Arbeit (Qualifizierungschancengesetz).

We will gladly advise you free of charge.

0800 3456-500 Mon. - Fri. from 8 am to 5 pm
free of charge from all German networks.
