    Linux Terminal Mastery: Commands, Shells, and File Systems

    This text is a transcript of a Linux crash course aimed at beginners. The course, offered by Amigo’s Code, covers the fundamentals of the Linux operating system, including its history, features, and various distributions. It guides users through setting up a Linux environment on Windows and macOS using tools like UTM and VirtualBox. The curriculum further explores essential Linux concepts like file systems, user management, and commands, including the use of the terminal. The course then introduces Bash scripting, covering variables, conditionals, loops, functions, and the creation of automated scripts. The goal of the course is to equip learners with the skills necessary to effectively use Linux for software development, DevOps, or system administration roles.

    Linux Crash Course Study Guide

    Quiz

    1. What is Linux and who developed it?

    Linux is a powerful and flexible operating system developed by Linus Torvalds in 1991. Unlike operating systems such as Windows and macOS, Linux is open source and allows developers around the world to contribute and customize.

    2. What are the key features of Linux that make it a preferred choice for servers?

    The key features are stability, security, the ability to be customized to specific needs, and performance. Due to these factors, servers worldwide often prefer Linux.

    3. What is a Linux distribution? Name three popular distributions.

    A Linux distribution is a specific version or flavor of the Linux operating system tailored for different purposes. Three popular distributions are Ubuntu, Fedora, and Debian.

    4. Explain what UTM is and why it’s used in the context of the course.

    UTM is an application that allows users to securely run operating systems, including Linux distributions like Ubuntu, on macOS. It’s used in the course to demonstrate how to set up and run Linux on a Mac.

    5. What is VirtualBox and how is it used for Windows users in the course?

    VirtualBox is a virtualization software that allows Windows users to install and run other operating systems, including Linux distributions like Ubuntu, within a virtual environment.

    6. What is the difference between a terminal and a shell?

    A terminal is a text-based interface where users type commands and view output. A shell is a program that interprets and executes those commands, acting as an intermediary between the user and the operating system.

    7. What is Zsh, and why is it used in this course?

    Zsh (Z shell) is an extended version of the Bourne shell, known for its advanced features like auto-completion, spelling correction, and plugin support. It is used in the course to provide a more customizable and efficient command-line experience.

    8. What is Oh My Zsh, and what does it offer?

    Oh My Zsh is an open-source framework for managing Zsh configuration. It includes numerous helpful functions, helpers, plugins, and themes to enhance the Zsh experience.

    9. Explain the command sudo apt update. What does it do?

    sudo apt update updates the package index files on the system. These files contain information about available packages and their versions. The sudo ensures the command is executed with administrative privileges.

    10. What is a Linux command and what are its three main parts?

    A Linux command is a text instruction that tells the operating system what action to perform. The three main parts are the command itself, options (or flags) which modify the command’s behavior, and arguments, which specify the target or input for the command.
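The three parts are easiest to see in a concrete invocation. A minimal sketch (the scratch directory and file names are hypothetical):

```shell
# Anatomy of a Linux command: COMMAND [OPTIONS] [ARGUMENTS]
cd "$(mktemp -d)"        # work in a scratch directory so nothing real is touched
touch .hidden visible    # create one hidden file and one regular file

ls visible               # command + argument: lists only 'visible'
ls -a .                  # command + option (-a: include hidden files) + argument (. = current directory)
```

Here `ls` is the command, `-a` is an option that modifies its behavior, and `.` is the argument naming the target directory.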

    Quiz Answer Key

    1. What is Linux and who developed it?

    Linux is a powerful and flexible operating system developed by Linus Torvalds in 1991. It’s open-source and allows for worldwide contributions.

    2. What are the key features of Linux that make it a preferred choice for servers?

    Key features include stability, security, customizability, and performance, making it ideal for servers.

    3. What is a Linux distribution? Name three popular distributions.

    A Linux distribution is a specific version of Linux. Ubuntu, Fedora, and Debian are examples.

    4. Explain what UTM is and why it’s used in the context of the course.

    UTM lets macOS users run other operating systems, including Ubuntu. The course uses it to set up Linux on a Mac.

    5. What is VirtualBox and how is it used for Windows users in the course?

    VirtualBox is a virtualization software. It allows Windows users to run Linux within a virtual environment.

    6. What is the difference between a terminal and a shell?

    A terminal is the interface for typing commands. The shell interprets and executes these commands.

    7. What is Zsh, and why is it used in this course?

    Zsh is an improved shell with features like auto-completion. The course uses it for a better command-line experience.

    8. What is Oh My Zsh, and what does it offer?

    Oh My Zsh is a framework for managing Zsh configuration. It provides themes and plugins to customize the shell.

    9. Explain the command sudo apt update. What does it do?

    sudo apt update updates package lists, requiring administrative privileges through sudo.

    10. What is a Linux command and what are its three main parts?

    A Linux command is a text instruction to the OS. It consists of the command, options, and arguments.

    Essay Questions

    1. Discuss the advantages of using Linux as a server operating system compared to Windows Server. Consider factors such as cost, security, and customization.
    2. Explain the significance of open-source development in the context of Linux. How does the collaborative nature of its development benefit the Linux community and users?
    3. Compare and contrast the roles of the terminal and the shell in a Linux environment. How do they interact to enable users to control the operating system?
    4. Describe the process of installing Ubuntu on both macOS (using UTM) and Windows (using VirtualBox). What are the key differences and considerations for each platform?
    5. Discuss the importance of Linux file permissions and user management in maintaining a secure and stable system. Provide examples of how incorrect permissions can lead to security vulnerabilities.

    Glossary of Key Terms

    • Linux: A powerful and flexible open-source operating system kernel.
    • Distribution (Distro): A specific version of Linux that includes the kernel and other software.
    • Open Source: Software with source code that is publicly available and can be modified and distributed.
    • Terminal: A text-based interface used to interact with the operating system.
    • Shell: A command-line interpreter that executes commands entered in the terminal.
    • Zsh (Z Shell): An extended version of the Bourne shell with advanced features and plugin support.
    • Oh My Zsh: An open-source framework for managing Zsh configuration.
    • Command: An instruction given to the operating system to perform a specific task.
    • Option (Flag): A modifier that changes the behavior of a command.
    • Argument: Input provided to a command that specifies the target or data to be processed.
    • Sudo: A command that allows users to run programs with the security privileges of another user, typically the superuser (root).
    • UTM: An application that allows you to run operating systems on macOS devices.
    • VirtualBox: Virtualization software that allows you to run different operating systems on your computer.
    • Operating System: The software that manages computer hardware and software resources.
    • Server: A computer or system that provides resources, data, services, or programs to other computers, known as clients, over a network.
    • Root Directory: The top-level directory in a file system, from which all other directories branch.
    • File System: A method of organizing and storing files on a storage device.
    • Directory (Folder): A container in a file system that stores files and other directories.
    • GUI: Graphical User Interface. A user interface that lets users interact with a device through graphical icons and visual indicators rather than typed text commands.

    Linux Crash Course: A Beginner’s Guide


    Briefing Document: Linux Crash Course Review

    Overall Theme: This document is a transcript of a video presentation promoting a “Linux Crash Course.” The course aims to take complete beginners to a point of understanding and mastering Linux, particularly in the context of software engineering, DevOps, and related fields. The presenter emphasizes that Linux is a fundamental skill in these areas.

    Key Ideas and Facts:

    • Linux Overview: Linux is described as a “powerful and flexible operating system” developed by Linus Torvalds in 1991.
    • A key feature of Linux is that it’s “open source,” with developers worldwide contributing to its improvement and customization.
    • Linux boasts “stability, security, the ability of changing it to your needs, and performance.” This makes it a preferred choice for servers globally.
    • Linux is used by internet giants, scientific research companies, financial institutions, government agencies, educational institutions, and most companies in general.
    • Amigo’s Code itself is deployed on a Linux server.
    • Linux is versatile, running on devices from smartphones to servers and the Raspberry Pi.
    • Linux Distributions: Linux comes in different “flavors” called distributions.
    • Ubuntu is highlighted as the “most popular flavor of Linux.” It comes in server and desktop versions (the latter with a graphical user interface).
    • Other distributions mentioned include Fedora, Debian, and Linux Mint.
    • Companies often customize Linux distributions to meet their specific needs.
    • Course Promotion: The presenter encourages viewers to subscribe to the channel and like the video.
    • The full 10-hour course is available on their website, with a coupon offered.
    • The course aims to “make sure that you become the best engineer that you can be.”
    • The course includes sections for both Windows users and Mac users.
    • Setting up Linux (Ubuntu) on Different Operating Systems: on a Mac, the presentation shows how to install Ubuntu using a virtualization application called UTM; on Windows, Ubuntu is installed through VirtualBox.
    • Understanding the Terminal: The terminal allows users to interact with the operating system by entering commands.
    • Understanding the Shell: The shell is the program that interprets and executes those commands against the operating system.
    • Z Shell (Zsh): Zsh, also called the Z shell, is an extended version of the Bourne shell with many new features and support for plugins and themes.
    • Linux Commands: Commands are case-sensitive.
    • Linux File System: The hierarchical structure used to organize and manage files and directories in a Linux operating system.
    • Files and Permissions: Linux is a multi-user environment that keeps each user’s files separate from other users’.
    • Shell Scripting: Writing sequences of commands that a command-line interpreter such as Bash executes, in order to automate tasks.

    Quotes:

    • “If you don’t know Linux and also if you are afraid of the terminal or the black screen then you are in big trouble so this course will make sure that you master Linux”
    • “Linux is a must and don’t you worry because we’ve got you covered”
    • “Linux it’s a powerful and flexible operating system”
    • “Linux is open source developers around the world contribute to improve and customize the operating system”
    • “servers around the world prefer Linux due to its performance”
    • “Linux is open source but it’s also used on a wide range of devices from smartphones to servers and also Raspberry Pi”
    • “Ubuntu is the most popular flavor out there”
    • “At Amigo’s code we want to make sure that you become the best engineer that you can be”
    • “So many additional features were added, so let’s install zsh together, and as you saw the default shell for macOS now is zsh or the Z shell”
    • “We’ve got Bash as well as csh, dash, ksh, tcsh, and then zsh”

    Potential Audience:

    • Beginners with little to no Linux experience.
    • Software engineers, DevOps engineers, backend/frontend developers.
    • Individuals seeking to enhance their skills and career prospects in the tech industry.

    In summary: The document outlines a Linux crash course that aims to provide individuals with the necessary skills to confidently navigate and utilize the Linux operating system in various professional tech roles. It covers core concepts, practical setup, and promotes the course as a means to become a proficient engineer.

    Linux and Shell Scripting: A Quick FAQ

    FAQ on Linux

    Here is an 8-question FAQ about Linux and shell scripting, based on the provided source material.

    1. What is Linux and why is it important for aspiring engineers?

    Linux is a powerful and flexible operating system developed by Linus Torvalds in 1991. Its open-source nature allows developers worldwide to contribute to its improvement and customization. Its stability, security, and performance make it a preferred choice for servers and various devices, ranging from smartphones to Raspberry Pi. For aspiring software, DevOps, or backend engineers, understanding Linux is crucial because most companies deploy their software on Linux servers, making it an essential skill.

    2. What are Linux distributions and how do they differ?

    Linux distributions (distros) are different “flavors” of the Linux operating system, each customized to suit specific needs. Popular distributions include Ubuntu, Fedora, Debian, and Linux Mint. Ubuntu, particularly its server and desktop versions, is a popular choice for many, while other distributions cater to specific requirements in different companies. The source material mentions Ubuntu will be used in the course.

    3. How can I install Linux (Ubuntu) on my Mac?

    On a Mac, Ubuntu can be installed using virtualization software like UTM. First, download and install UTM from the Mac App Store. Then, download the Ubuntu server ISO image from the Ubuntu website. Within UTM, create a new virtual machine, selecting the downloaded ISO image as the boot source. Configure memory and disk space as needed, and start the virtual machine to begin the Ubuntu installation process. The source material also highlights the Ubuntu gallery in UTM.

    4. How can I install Linux (Ubuntu) on my Windows machine?

    On Windows, you can use VirtualBox. First, download and install VirtualBox, then download the Ubuntu desktop ISO image from the Ubuntu website. Create a new virtual machine in VirtualBox, selecting the downloaded ISO image, configure memory and disk space, and install Ubuntu on the VM.

    5. What is the difference between the Terminal and the Shell?

    The terminal is a text-based interface that allows you to interact with the operating system by entering commands. It provides the prompt where commands are entered and outputs the results. The shell, on the other hand, is the program that interprets the commands entered in the terminal and executes them against the operating system. Shells include Bash, Zsh, Fish, and others.

    6. What is Zsh and how do I switch from Bash to Zsh?

    Zsh (Z shell) is an extended version of the Bourne shell, known for its advanced features like auto-completion, spelling correction, and a powerful plugin system. To switch from Bash to Zsh, first install Zsh using the command sudo apt install zsh. Then, change the default shell using the command chsh -s /usr/bin/zsh. After rebooting the system, Zsh will be the default shell. Oh My Zsh can be used to configure Zsh.

    7. What are Linux commands, options, and arguments?

    Linux commands are text instructions that tell the operating system what to do. They are case-sensitive. A command can include options and arguments that modify its behavior. For example, in the command ls -a ., ls is the command, -a is an option (for showing hidden files), and . is the argument (specifying the current directory).

    8. What are user types and how do permissions work?

    Linux is a multi-user environment with two main types of users: normal users and the superuser (root). Normal users can modify their own files but cannot make system-wide changes. The superuser (root) can modify any file on the system. Permissions control access to files and directories. The ls -l command displays file permissions, divided into three sets: user, group, and others. Each set includes read (r), write (w), and execute (x) permissions, dictating what actions each user type can perform on the file.

    Understanding Linux: Features, Usage, and Commands

    Linux is a powerful and flexible open-source operating system that was developed by Linus Torvalds in 1991 and has since become a robust platform used worldwide. Here’s an overview of some key aspects of Linux:

    • Open Source: Linux is open source, meaning developers can contribute to improving and customizing it.
    • Key Features: Stability, security, customizability, and performance are key features. Its flexibility and security make it a preferred choice for companies.
    • Usage: Linux is used by internet giants, scientific research companies, financial institutions, government agencies, and educational institutions. Many companies deploy their software on Linux.
    • Distributions: Linux has different versions called distributions, with Ubuntu being the most popular. Other distributions include Fedora, Debian, and Linux Mint.
    • Terminal: In Linux, the terminal (also known as the command line interface, or CLI) is a text-based interface that allows interaction with the computer’s operating system by entering commands. It provides a way to execute commands, navigate the file system, and manage applications without a graphical user interface.
    • Shell: A shell is a program that interacts with the operating system. The terminal allows users to input commands to the shell and receive text-based output; the shell is responsible for taking the commands and executing them against the operating system.
    • File System: The Linux file system is a hierarchical structure that organizes and manages files and directories. It follows a tree structure with the root directory at the top, and all other directories organized below it.
    • Commands: Linux commands are case-sensitive text instructions that tell the operating system what to do.
    • Shell Scripting: Shell scripting automates tasks and performs complex operations by creating a sequence of commands. A shell script is saved with the extension .sh.

    Shell Scripting Fundamentals in Linux

    Shell scripting is a way to automate tasks and perform complex operations in Linux by creating a sequence of commands. It involves writing scripts, typically saved with a .sh extension, that contain a series of commands to be executed.

    Key aspects of shell scripting include:

    • Bash: Bash (Bourne Again Shell) is a command-line interpreter used to communicate with a computer using text-based commands.
    • Editor: A text editor is needed to write scripts; this could be a simple editor like Vim or a more feature-rich option like Visual Studio Code.
    • Shebang: The first line of a shell script typically starts with a “shebang” (#!) followed by the path to the interpreter (e.g., #!/bin/bash). This line tells the operating system which interpreter to use to execute the script.
    • Variables: Containers for storing and manipulating data within a script. In Bash, variables can hold various data types like strings, numbers, or arrays.
    • Conditionals: Allow scripts to make decisions based on specific conditions, executing different blocks of code depending on whether a condition is true or false.
    • Loops: Enable the repetition of instructions. for and while loops can iterate over lists or directories, or continue a task until a condition is met.
    • Functions: Group a set of commands into a reusable block, promoting code modularity and organization.
    • Comments: Adding comments to scripts is considered a best practice, as it helps in understanding the script’s purpose, functionality, and logic. Comments are lines in a script that are not executed as code but serve as informative text.
    • Passing Parameters: Bash scripts can receive input values, known as parameters or arguments, from the command line, allowing customization of script behavior. These parameters can be accessed within the script using special variables like $1, $2, $3, etc. The special variable $@ can be used to access all parameters passed to the script.
    • Executable Permissions: Scripts must be given executable permissions using chmod before they can be run.
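The concepts above can be tied together in one short script. A minimal sketch (the script name and greeting text are hypothetical):

```shell
#!/bin/bash
# demo.sh — a small script exercising variables, conditionals, loops,
# functions, and parameters (all names here are illustrative)

NAME="${1:-world}"            # variable from parameter $1, with a default value

greet() {                     # function: a reusable block of commands
  echo "Hello, $1!"
}

if [ "$#" -gt 0 ]; then       # conditional: were any parameters passed?
  echo "got $# parameter(s): $@"
fi

for n in 1 2 3; do            # loop: repeat an instruction
  echo "iteration $n"
done

greet "$NAME"
```

Run without arguments it greets "world"; run as `./demo.sh Linux` it greets "Linux" and reports the parameter count.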

    To run a shell script:

    1. Save the script with a .sh extension.
    2. Give the script executable permissions using the chmod +x scriptname.sh command.
    3. Execute the script by using its path. If the script is placed in a directory included in the PATH environment variable, it can be run by simply typing its name.
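The three steps above can be sketched end to end; the script name `hello.sh` and its output text are hypothetical:

```shell
cd "$(mktemp -d)"                 # use a scratch directory so nothing is overwritten

cat > hello.sh <<'EOF'            # step 1: save the script with a .sh extension
#!/bin/bash
echo "hello from my first script"
EOF

chmod +x hello.sh                 # step 2: give it executable permissions
./hello.sh                        # step 3: execute it via its path
```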

    Linux File Management: A Command-Line Guide

    File management in Linux involves organizing, creating, modifying, and deleting files and directories. This is primarily done through the command-line interface (CLI) using various commands.

    Key aspects of file management include:

    • Linux File System: The file system is a hierarchical structure with a root directory (/) at the top, under which all other directories are organized.
    • Essential Directories:
    • /bin: Contains essential user commands.
    • /etc: Stores system configuration files.
    • /home: The home directory for users, storing personal files and settings.
    • /tmp: A location for storing temporary data.
    • /usr: Contains read-only application support data and binaries.
    • /var: Stores variable data like logs and caches.
    • Basic Commands:
    • ls: Lists files and directories. Options include -a to show hidden files and -l for a long listing format that includes permissions, size, and modification date.
    • cd: Changes the current directory. Using cd .. moves up one directory level. Using cd - flips between the previous and current directory.
    • mkdir: Creates a new directory. The -p option creates nested directories.
    • touch: Creates a new file.
    • rm: Removes files.
    • rmdir: Removes empty directories.
    • cp: Copies files.
    • File Permissions: Linux uses a permission system to control access to files and directories. Permissions are divided into three categories: user, group, and others. Each category has read (r), write (w), and execute (x) permissions. The ls -l command displays file permissions in a long listing format.
    • Working with Files:
    • To create an empty file, use the touch command.
    • To create a file with content, use the echo command to redirect a string into a file.
    • To view the contents of a file, you can use a text editor or command-line tools like cat.
    • Working with Directories:
    • To create directories, use the mkdir command.
    • To remove empty directories, use the rmdir command.
    • To remove directories and their contents, use the rm -rf command.
    • Navigating the File System: To navigate, use the cd command followed by the directory path.

    It is important to note that commands are case-sensitive.
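A short session exercising the commands above (the directory and file names are hypothetical):

```shell
cd "$(mktemp -d)"                             # scratch directory for the demo
mkdir -p projects/demo                        # -p creates the nested path in one go
touch projects/demo/empty.txt                 # create an empty file
echo "hello linux" > projects/demo/note.txt   # create a file with content via redirection
cat projects/demo/note.txt                    # view the contents
cp projects/demo/note.txt backup.txt          # copy a file
rm projects/demo/empty.txt                    # remove a file
rmdir projects/demo 2>/dev/null || true       # fails quietly: the directory still holds note.txt
```

Note the last line: `rmdir` refuses non-empty directories, which is why `rm -rf` exists for recursive removal.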

    Linux User and File Permissions Management

    User permissions in Linux control access to files and directories in a multi-user environment. Here’s an overview:

    • Types of Users: There are normal users and superusers (root).
    • Normal users can modify their own files but cannot make system-wide changes or alter other users’ files.
    • Superusers (root) can modify any file on the system and make system-wide changes.
    • Commands for User Management:
    • sudo: Executes a command with elevated privileges.
    • useradd -m username: Adds a new user and creates a home directory.
    • passwd username: Sets the password for a user.
    • su username: Substitutes or switches to another user.
    • userdel username: Deletes a user.
    • File Permissions: Permissions determine who can read, write, or execute a file.
    • The ls -l command displays file permissions in a long listing format. The output includes the file type, permissions, number of hard links, owner, group, size, and modification date.
    • The file type is the first character. A d indicates a directory, and a - indicates a regular file.
    • Permissions are divided into three sets of three characters each, representing the permissions for the user (owner), group, and others.
    • r means read, w means write, and x means execute. A - indicates that the permission is not granted.
    • The first three characters belong to the user, the second three to the group, and the last three to everyone else.
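Setting permissions and reading the `ls -l` output can be sketched in a few commands (the file name is hypothetical; symbolic `chmod` mode shown):

```shell
cd "$(mktemp -d)"                # scratch directory
touch report.sh                  # create a regular file
chmod u=rwx,g=rx,o=r report.sh   # user: rwx, group: r-x, others: r--
ls -l report.sh                  # first column reads: -rwxr-xr--
```

In the listing, the leading `-` marks a regular file, then the three permission sets follow: `rwx` for the owner, `r-x` for the group, `r--` for everyone else.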

    Essential Linux Terminal Commands

    Linux terminal commands are case-sensitive text instructions that tell the operating system what to do. These commands are entered in the terminal (also known as the command line interface or CLI), allowing you to interact with the operating system. The terminal provides a way to execute commands, navigate the file system, and manage applications without a graphical user interface.

    Here are some basic and essential commands:

    • ls: Lists files and directories.
    • ls -a: Includes hidden files.
    • ls -l: Uses a long listing format, displaying permissions, size, and modification date.
    • cd: Changes the current directory.
    • cd ..: Moves up one directory level.
    • cd -: Flips between the previous and current directory.
    • mkdir: Creates a new directory. The -p option creates nested directories.
    • touch: Creates a new file.
    • rm: Removes files.
    • rmdir: Removes empty directories.
    • cp: Copies files.
    • sudo: Executes a command with elevated privileges.

    Each command may have options and arguments to modify its behavior. To understand how to use a command effectively, you can refer to its manual for instructions.

    Linux For Beginners – Full Course [NEW]

    The Original Text

    What’s going on guys, assalamualaikum, welcome to this Linux crash course where I’m going to take you from complete beginner to understanding Linux. This is a course that Abs and I put together, and it’s currently 10 hours with a bunch of exercises. If you don’t know Linux, and also if you are afraid of the terminal or the black screen, then you are in big trouble, so this course will make sure that you master Linux. Whether you want to become a software engineer, DevOps engineer, backend, frontend, it doesn’t really matter, Linux is a must, and don’t you worry because we’ve got you covered. If you’re new to this channel, literally just take 2 seconds and subscribe, and also smash the like button so we can keep on providing you content like this. Without further ado, let’s kick off this video. Okie dokie, let’s go ahead and kick off this course with this presentation which I want to go through so that you have a bit of background about Linux. So Linux, it’s a powerful and flexible operating system that was developed by Linus Torvalds in 1991, so the name Linux comes from the creator, Linus, and since 1991 Linux has grown into a robust and reliable platform used by millions worldwide, as you’ll see in a second. The cool thing about Linux, unlike operating systems such as Windows and macOS, is that Linux is open source; developers around the world contribute to improve and customize the operating system, and it has a vibrant community of contributors, and I’ll talk to you in a second about distributions as well, because they play a big part since Linux is open source. The key features of Linux are stability, security, the ability of changing it to your needs, and performance, so servers around the world prefer Linux due to its performance. So who uses Linux? Well, internet giants, scientific research companies, financial institutions, government agencies, educational institutions, and the platform that you are using right now, so Amigo’s Code is actually deployed on a Linux
  server. So you look at Google, Meta, AWS, NASA, and obviously Amigo’s Code, and pretty much every single company out there, the majority of them, I can guarantee you that their software is being deployed on Linux. It might be a different flavor of Linux, but it will be Linux, and the reason really is because of the flexibility and because it’s secure, so this is why companies opt to choose Linux. And the cool thing about Linux is that it’s open source, as I’ve mentioned, but it’s also used on a wide range of devices, from smartphones to servers and also the Raspberry Pi, so if you’ve ever used a Raspberry Pi, the operating system on this tiny computer is Linux. Linux has something called distributions, and these are different flavors. The most popular flavor of Linux is Ubuntu, and you have the Ubuntu server or the desktop, which comes with a graphical user interface, and this distribution is what we’re going to use and is the most popular out there, but obviously depending on the company that you work for, the software will be deployed on a different flavor of Linux to customize their needs. But there are also other distributions such as Fedora, Debian, Linux Mint, and plenty of others, and this is a quick overview about Linux. Cool, before we actually proceed I just want to let you know that the actual 10-hour course is available on our brand new website, and I’m going to leave a coupon and a link as well where you can basically go and check for yourself, because many of our students have already engaged with the course, they’ve been learning a lot, and to be honest the feedback has been really, really great so far. So we are coming up with something really huge, and we decided that Linux had to be part of this something, and here at Amigo’s Code we want to make sure that you become the best engineer that you can be. Details will be under the description of this video. Okie dokie, for the next two sections we’re going to focus on Windows users as well as
Mac users, and just pick the operating system that you are using and go straight to that section, because the setup will be the exact same thing. So I’m going to show you how to get Linux and Ubuntu up and running on your operating system. If you want to watch both sections, feel free to do so, but in this course I just want to make sure that there are no issues when it comes to Windows or Mac, because there’s a huge debate about which is better, and also, after those two sections, you’ll see how to rent a server on the cloud. Okay, so if you don’t want to use Ubuntu or Linux on your local machine but you prefer to rent it from the cloud, I’m also going to show you how to do so. Cool, this is pretty much it, let’s get started. In order for us to install Ubuntu on a Mac operating system, we’re going to use this application called UTM, which allows you to securely run operating systems on your Mac, whether it’s Windows, Windows XP, which I really doubt that you’re going to do, Windows, I think this is Windows 10 maybe, you can also run Ubuntu, which is the one that we’re going to run, and also old operating systems in here, also Mac as well, so you can virtualize Mac, and I’ll basically walk you through how to use it and install Ubuntu right here, which is what we need in order to get up and running with Linux. Cool, so in here what we’re going to do is click on download, and you can download from the Mac App Store, then pretty much save this anywhere, so in my case I’m going to save it on my desktop and just give it a second to download. Cool, then on my desktop I’m going to open up this UTM DMG, there we go, and all I’m going to do is drag this to Applications, and job done. Now let me close this, and also I’m going to eject UTM, and also I’m going to get rid of this UTM file in here, and now I’m going to press Command and then Space and we can search for UTM, and then I’m going to open, and I’m going to continue, and there we go, we successfully
installed UTM. The next thing we need is to install Ubuntu. Navigate to ubuntu.com, and on this page we can download Ubuntu by clicking on download. What I want you to do is download Ubuntu Server together with me, and I'll show you how to get the desktop from Ubuntu Server. Here you can choose your architecture: I've got an ARM machine, so I'm just going to choose ARM here; if you're on a regular Intel Mac or on Windows, just download the Ubuntu Server image for the corresponding architecture and operating system. So here I'm going to click on ARM, and you can read more about it here — I think this is the long-term support release, 22.04.2. Right here you can see that you can download the long-term support version or the latest version; the latest might have bugs, who knows, and in my case it doesn't really matter which version I download, so I'm just going to download the long-term support. Let's download, and I'm going to store this on my desktop. Now just give it a minute or so — my internet is quite slow and you can see the download is still in progress, but once it finishes I'll come back to you.

Awesome, this is done. Also, what I want to show you is that within UTM you can click on 'Browse UTM Gallery', or you can get to it via the official UTM website: click on Gallery, and this gives you a gallery of the operating systems which are currently supported. You can see Arch Linux, Debian, Fedora, Kali Linux (which is quite nice actually), then macOS, Ubuntu — I think this is an older version, 20.04, which is a long-term support release — Windows 10, 11, 7, and XP. So if you want to go back to the olden days, feel free to do so, but we just downloaded Ubuntu from the official website, which is good. Also have a look at the architectures here — ARM64, x64 — right,
so make sure to pick the one corresponding to your machine. If you want to have, for example, Windows as well, feel free to download and experiment, or try a different version of Linux — Kali Linux, which is quite nice actually. But in my case I'm going to stick with traditional Ubuntu, and next what we're going to do is create a virtual machine and have Linux up and running.

Right, we have UTM as well as the ISO image for Ubuntu. Let's create a brand new virtual machine, and here we want to virtualize, never emulate — emulation is slower but can run other CPU architectures; in our case we want to virtualize. The operating system is going to be Linux; leave these boxes as they are, and under 'Boot ISO image' open the ISO we've just downloaded — so browse, and I've just selected Ubuntu 22.04.2. The next step is continue, and here for memory you should usually give at most half of your available memory. In my case I'm just going to leave four gigs — I've seen that it works quite well, and although I do actually have 32, I'm not giving it 16, so four it is. For CPU cores I'm going to leave the default, and here, if you want to enable hardware OpenGL acceleration you can, but there are known issues at the moment, so I'm not going to choose it. Continue: 64 gigs — this is the size of the disk. Continue; there's no shared directory path, continue; and for the name I'm going to say Ubuntu and then the version, so 22.04.2. Awesome — you can double-check all of these settings, and I think they're looking good, so click on Save. There we go: at this point you can see that we have the VM here, and we're going to start it in a second. The status is stopped, the architecture is ARM, the memory is 4 gigs, the size will increase in a second, and there's no shared directory, but the CD/DVD is the Ubuntu image, which is this one here. So
one more thing that I want to do before we play: I want to go to settings. Here you can change the name if you want to, but all I really want is, under Display, to choose retina mode — this will give me the best results. Cool, so save, and I'm good to go. Next, let's go ahead and install Ubuntu within our virtual machine.

Okie dokie, the next step for us is to click on play, and this should open a window — I'm just going to leave the screen like this. Currently you can see that it says to press a key combination to release the cursor, so my cursor is not visible right now, and the way I can interact with this UI is by using my keyboard: the down arrow and the up arrow. If you want your cursor back, you just press Control and then Option — there we go, you can see that now I have the mouse back. Cool, let me close this for a second and center the window, and now we're going to choose 'Try or Install Ubuntu Server', so I'm going to press enter on my keyboard. 'Display output is not active' — just wait a second and we should get a second screen. There we go: now we have this other screen, and we can configure the installation process. In my case I'm going to use English (UK) — for you, whatever country you're in, just use the correct language — and press enter. Then the keyboard layout and variant: I want to leave these as default, and at the bottom you can see that I can flick between 'done' and 'back', so I'm just going to select done. I want the default installation of Ubuntu Server, not the minimized one, so just press enter, and here there's no need to configure the network connections —
continue. No need to configure a proxy, and leave the mirror address as default — enter. Here, configure a guided storage layout: I'm going to use the entire disk and leave everything as default, so with my arrow keys I go all the way down to done, and enter. Now we have the summary, and you can see the configuration — 60 gigs, and I think the available free space is 30 gigs. At this point it asks whether you're sure you want to continue, and I'm going to say continue. Now for the name I'm going to say Amigos code, the server name amigoscode, the username amigoscode, and for the password I'm going to have something very short and easy. Then go to done, continue. I'm not going to enable Ubuntu Pro — continue, continue. There's no need to install the OpenSSH server, because we don't need remote access to the server we're installing, so select done. We also have a list of available software that we can install — for example MicroK8s, Nextcloud, Docker, the AWS CLI, the Google Cloud SDK, and PostgreSQL, which is right here. We don't need any of these, but if you want to install them, by all means feel free to tick them; for me, I'm going to leave everything as default and select done. At this point you can see that it's installing the system, so just wait — I'm going to fast-forward this step.

Okie dokie, you can see that the installation is now complete, and at this point it's downloading and installing security updates. It's really up to you whether you want to wait for this, but I think it's best practice to have everything patched and updated, so I'm just going to wait and then I'll tell you the next steps. This might take a while, so just sit back and relax. All right, this is now done — you can see that installation is complete, and we could even click 'reboot now', but don't click reboot
now. What we need to do instead is close this window — closing the window will kill the virtual machine, which is fine. Okay, now open UTM once more, and you can see that we have the Ubuntu virtual machine here. If I open it up again, it will take us to the exact same screen to install Ubuntu Server — we don't want that, so close it. What we have to do is, under CD/DVD, click clear to remove the ISO image, and at this point feel free to delete the ISO file itself — so I'm going to delete it, and that's it. Cool, this is pretty much the installation for Ubuntu. Next, let's get our Ubuntu Server up and running.

Cool, so make sure the CD/DVD is empty before you press play. Let's play this — there we go — and I can close this panel and center things. At this point we should see something different: there we go, have a look — Ubuntu, the version that we installed, and then it says 'amigoscode login'. Here what we need to do is enter the username, which is amigoscode, press enter, followed by the password. My password — I'm not going to tell you, but it's a very simple one. You don't see the password as you type, so just make sure you type the username and the password correctly, press enter, and check this out: now we are inside, and we've managed to log in.

Cool, so at this point, this is actually Ubuntu Server, right? There's no graphical user interface, and that's it. Later you'll see that when you SSH into servers, this is what you get — this black screen, and that's it. Now obviously, for us, I want to install a graphical user interface so that you see the Ubuntu UI and the applications — I'll show you the terminal and whatnot — but in a shell, this is Ubuntu Server. At this point you can type commands.
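Here's a minimal sketch of what the first few commands do, run in a scratch directory so the paths are illustrative (the VM's actual home directory would be /home/amigoscode):

```shell
# A minimal sketch of pwd / ls / cd, using a scratch directory.
mkdir -p /tmp/demo/amigoscode
cd /tmp/demo/amigoscode
pwd     # prints the current directory: /tmp/demo/amigoscode
ls      # lists the directory's contents -- empty here, so it prints nothing
cd ..   # move up one level
pwd     # now prints /tmp/demo
```

The same three commands behave identically on the server's black screen and in a desktop terminal.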
For example, if you type ls — just l and then s — and press enter, nothing happens, because the directory is empty. If you type pwd — these are commands you'll learn about later — and press enter, you can see that I'm within /home/amigoscode. If I type cd, space, dot dot (two dots) and press enter, and then type pwd again, you can see that I'm now within /home. Okay, so this is pretty much Ubuntu Server — a Linux box we can interact with — but as I said, we want to install a graphical user interface to simplify things for now, and that's what we're going to do next.

Within the official page for UTM, navigate to Support — I'll give you this link so you can follow along — and they give you the installation process and things you should be aware of when working with UTM. If we click on Guides and expand, you can see the different operating systems: Debian 11, Fedora, Kali, Ubuntu, Windows 10 and 11. We installed 22.04, so let's open that — and it doesn't really matter which Ubuntu version you click on; whatever version you installed, the steps will be exactly the same. If I scroll down, they have a section — we've done all of this, creating a new virtual machine — on installing Ubuntu Desktop: if you install Ubuntu Server, then at the end of the installation you will not have the graphical user interface, and to install it we need to run these commands — sudo apt update, then install, then reboot. Awesome, let's go ahead and run them.

Cool, so here within Ubuntu Server — let me see if I can increase the font size with Control and plus; it doesn't look like I can, but I'll show you how to increase the font size in a second. Here, let's type together: sudo apt update, press enter, and then it's asking me for the password for
amigoscode — so make sure you enter the password for your username, press enter, and you can see that it's performing the update. Now, apt update refreshes the package index files on the system, which contain information about the available packages and their versions. You can see that 43 packages can be upgraded; if you want to upgrade the packages, we can run sudo apt upgrade — let me just move the mouse out of the way — so apt, space, upgrade. We could actually use flags here, like -y, but for now let's keep it simple; you'll learn about these commands later. Press enter, and you can see that it says the following packages will be upgraded — you see the list of all the packages — do you want to continue? I say y for yes. Now just give it a second, and it's upgrading the packages to their latest versions. It's almost done, and now it asks which services should be restarted — leave everything as default and say OK, so I'm just going to press Tab and then OK. Cool, that's it.

Now the last thing we need to do is install the Ubuntu desktop, so at this point type sudo apt install ubuntu-desktop and press enter. It gives us a prompt — I'm going to say y — and it asks, do you want to continue? I'm going to say y for yes, and now we just have to wait until it installs the Ubuntu desktop. This is the graphical user interface that will allow us to interact with our operating system the way we do with, for example, macOS — but equally we still have the terminal. If I open up the terminal quickly, have a look: this is the terminal. So what we're doing is installing the same kind of desktop experience we have within macOS.
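Putting the whole sequence together, the desktop install from the UTM guide boils down to these commands, run inside the Ubuntu Server VM. The -y flag, which auto-confirms the prompts, is my addition — in the video the prompts are answered by hand:

```shell
# Refresh the package index (lists of available packages and versions).
sudo apt update

# Upgrade the installed packages to their latest versions.
sudo apt upgrade -y

# Install the graphical desktop environment on top of Ubuntu Server.
sudo apt install -y ubuntu-desktop

# Reboot so the machine comes back up into the desktop.
sudo reboot
```

These need root privileges and a Debian-based system, which is why they're run inside the VM rather than on the host.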
On a Mac, for instance, I can click on the Apple logo and use all of the functionality the operating system has to offer — About This Mac, then More Info, and so on. So let me just cancel this and wait for a second until the install finishes. Okie dokie, cool — this is done, and if you encounter any errors whatsoever, just restart and run the exact same command; but here you can see there were no errors. At this point there are no services to be restarted, no containers, and all we have to do is type reboot. Just wait for it, and hopefully at this point you should go straight into the desktop. Okie dokie — you can see that we managed to get the desktop up and running. At this point just click on your user, type the password — this is my password — and press enter. Hooray, we've done it! Cool — if you managed to get this far, congratulations, you have successfully installed Ubuntu; otherwise, if you have any questions, drop me a message. Next, let's go ahead and set up Ubuntu.

Okie dokie, so we have Ubuntu Desktop up and running. From this point onwards — let me just put this in full screen; for some reason I have to log in again, that's fine — you can see that the UI looks nice and sharp. Here let's just say next; we don't want to install Ubuntu Pro; as for whether you want to send information to the developers, I don't really mind, to be honest; location I'm just going to turn off; and here it says you're ready to go, so let's just say done. Cool, we have the home screen, so I can put this window to one side, and what we're going to do here is some customization. This is an actual operating system, so you've got a few things here: you've got a mail client, you've got Files — if I click on Files, you know, this is a file system, the same as I have on my Mac. So let me just close this, and what I want to
do is go to Show All Applications — or I could just right-click here and get a few options. One is display settings, which is what I'm mostly interested in: here I'm going to make things a little bigger, so under fractional scaling I'm going to increase this to 175% and apply, so that things are clearly visible to you. Keep changes, and you can see that now it's nice and big. Cool — let me just move this over here... it doesn't look like it lets me right now, but eventually it will. I can also click on Show All Applications and go to Settings through there — same thing. Cool, so you can go to Keyboard and change it according to your layout (I'm going to leave mine as English UK, which is fine); you can go to Displays and change the resolution if you want; Power, to configure whether you want power-saver mode or not; Online Accounts — here I'm not going to connect to anything; Privacy — let me go back, I'm just showing you around. Background: if you want to change the background, you're more than welcome to — there are a couple of different ones here, but I'm going to stick with the default. Appearance as well, you can change this — if you want a blue theme, for example, which I kind of like to be honest, or this purple right here. Let's actually choose the blue for myself, and for the dock icon size I'm going to say Large, so that things are visible to you. I can scroll down — icon size as big as this — and you can also auto-hide the dock: does that work? I think it hides when a window collides with the dock — yes, at that point it just hides. Let me just remove that, I don't think I need it. Notifications — you can go through and customize all of this the way you want, but for us I think this is looking good. One other thing: if you have a window
open, you can pull it all the way to the left and it will snap into place. So if I open a new window, for example, I can put it right here, and you can see that it auto-arranges for me, which is kind of nice. And, to be honest, I think I'm going to stick with the red — not that one; I think it was Appearance — yes, actually I think this orange looks nice. I don't know, it's very difficult; cool, I think I'll just stick with the default for now. Let me just close this and this, and that's pretty much it. I don't know why this icon isn't moving — let me just click on arrange icons, maybe that will do it... no, I think it's because it's too big, it doesn't want to move, but if I restart, it should sort itself out. The other thing is, here I can remove things from favorites: so I'm going to remove this, same with this — stop and quit, and remove from favorites — the same with this, and no need for Help. I think this is pretty much it. Awesome, this is my setup. Also, I think the clock is one hour behind, so feel free to fix that, but to me it doesn't really matter. This is pretty much it — if you have any questions, drop me a message, but this is the configuration required for Ubuntu.

In order for us to install Ubuntu Desktop on Windows, let's use VirtualBox, which allows you to install and run a large number of guest operating systems. Here you can see Windows XP (which I doubt you'll ever use), Vista, Windows, and then Linux distributions here, and Solaris — but basically we need VirtualBox in order to install another guest operating system on top of Windows. So navigate to Downloads and download the Windows host installer. Just give it a second, then open the file — there we go — and I'm going to say next, so the installation
process should be really straightforward. Don't worry about this warning here — just say yes — and then it says there are missing Python dependencies, do you want to install them? Yes, why not, and then install. Cool, this should take a while, and you can see that the shortcut is being created for us here, and we are good to go. Let me just untick this, because we're going to continue in the next video; say finish, and we have successfully installed VirtualBox. Catch me in the next one.

We do have VirtualBox installed, and before I open the application, the next thing we need is the Ubuntu Desktop image itself, so that we can mount it in VirtualBox. If I open up my web browser, search for Ubuntu on Google, and go to ubuntu.com — if I accept here — you should see that we can download. We have Ubuntu Server here, and this is the current version as I speak — sorry, this is not the latest, this is the long-term support — but whatever long-term support version you see, just download that. So go to the desktop downloads and you should see Ubuntu Desktop; click on that, and we can download Ubuntu Desktop. You can watch their video if you want, but I'm not going to; if I scroll down, you can see it says that Ubuntu comes with everything you need to run your organization, school, home, or enterprise, and you can see the UI here, and so on. So let's go ahead and download Ubuntu. Now, if I scroll down, you can see we have this version — long-term support; just download whatever long-term support you see available. Download, and it should start very soon — there we go — and it should take about five minutes or so to complete, or even less. And there we go. Now let me open this on my desktop, and you can see that it's right here. Awesome — now that we have Ubuntu Desktop, next let's go ahead and use
VirtualBox to install this ISO image. This is pretty much it — catch me in the next one.

Cool, let's open up VirtualBox, and I'll walk you through the steps required to get Ubuntu Desktop up and running. If this is the first time you're using VirtualBox, this list should be completely empty. What we're going to do is create New, and name this Ubuntu — there we go; you can put the version if you want, but I'm going to leave it as is. Then for the ISO image, it's the one we downloaded, so select Other, navigate to the desktop, and I've got my Ubuntu ISO image — open. Cool, and here you can see that we can't really select anything else, so let's just click next, and we can actually set the username and password: in my case I'm going to say amigos and then code, there we go, and then choose the password. If I click on this eye icon, you can see that it says 'changeme' — you can change this or leave it as default; in my case I'm going to leave it as changeme, but obviously I would never do this in real life. I want to leave the hostname and the domain as they are right now, and then next. Then we need to specify the base memory here as well as the CPUs: for CPU let's just choose two cores, and for memory, if you have more on your machine, feel free to crank this up to maybe four gigs, but for me I'm going to leave the default. Next, and here it asks whether to create a virtual hard disk, use an existing one, or not add one: in my case I'm going to have 20 gigs — here I'm really saving space, I don't think there's much on this Windows machine, so 20 gigs should be fine. Create a virtual hard disk, say next, and now we have a summary; you can read through it, and let's finish. Cool, this is pretty much it — now you can see that it says powering VM up, so just wait for a second until
this is up and running — and you can see that I think it's done, right, it's actually running now. Obviously, if I click on it, we have this window, and you can see that it's loading Ubuntu; it says mouse integration — click here, and then there as well. All right, just give it a second or so, and this should install successfully. There we go — that was really quick — and here you can see that it's installing a few things, so I'm going to let this complete and then come back to it so that we can proceed with our setup. Cool, now it's installing the system, and I can click on this button and see what it's doing in the terminal. Let's just wait until this finishes — this may take a while for you, so for me I'm going to speed up the video, but you should have Ubuntu up and running in a moment. In fact, we could have skipped this altogether, but I'm going to let it finish... and after a long time of waiting, it seems it's almost there — let's just wait — and finally we are done. Cool — so if you get to this point where you have your user, you can click on it, and here the password was 'changeme' — I didn't change the password, so let me just show you: changeme, and if I press enter, we should be able to log in if the password is correct. There we go. Cool, this is pretty much it — I'm going to leave it here, and we'll continue in the next video.

Okie dokie, so we are almost done with the configuration. One thing I want to do is click on VirtualBox here and click on Settings, then go to Advanced — so under General, click the Advanced tab — and set Shared Clipboard to bidirectional, and the same for Drag and Drop, so that we can drag files from Windows to our Ubuntu desktop, and share the clipboard both ways. Say okay — or
actually, let's go to System first and see whether we have to change something — I don't think we have to change anything else — and under Storage, just make sure this is empty, that it doesn't contain the ISO image. Audio — everything should be fine; if you want to enable audio input, feel free to do so. Serial Ports, nothing; USB, Shared Folders, and User Interface — we're going to leave everything as is. Okay, I can close this, and if I try to put this full screen, you can see what happens. To fix this, what we have to do is install the VirtualBox Guest Additions. Here we're not going to connect to any online accounts — let me just skip — and I'm also going to skip Ubuntu Pro; next. If you want to send data, feel free, but I'm not going to; click next, and I'm going to turn off location, and there you go — it says you're ready to go, you can use the software center to install apps like these, so press done. What I want to do now is open up the terminal: click on this button that shows all applications, and open the Terminal. This is the terminal, and with it open, let me just put it full screen like that, and now we're going to type some commands. At this point I don't expect you to know any of this, because we're going to learn in detail how all of it works. Cool, so the first thing that we want to do: type with me sudo apt update — and now it's asking me for the password, so changeme. I'm typing, but you don't see that I'm typing, because by default the terminal hides sensitive input like passwords. Press enter, and if it says that amigoscode is not in the sudoers file and this incident will be
reported — that's fine. If this happens to you, all we have to do is type su and then a dash (su -), and type the password again — changeme, or whatever password you set — and there we go. Now we can type adduser — make sure adduser is all one word — then amigoscode, the user in question (that's amigoscode for me), and then sudo, which adds amigoscode to the sudo group. Press enter, and you can see that it has added amigoscode to sudo. Now, if I say su amigoscode — and by the way, su just allows you to change users; you'll also learn about this command later — press enter, and you can see that I'm back as amigoscode. If we type the previous command again — I'm just going to press the up arrow — so sudo apt update, and add the password once more, changeme, enter, you can see that this time the command works. I'm going to leave the commands I've just run under the description of this video so that you can follow along as well.

Cool, the next command we need to run is sudo apt install -y build-essential linux-headers-$(uname -r) — so dash y, then build-essential, space, then linux-headers, dash, and then a dollar sign, open parenthesis, uname, space, dash r, close parenthesis, just like that. Cool — you'll also find this command under the description of this video. Press enter and just give it a second or so — there we go. Now navigate to Devices and then 'Insert Guest Additions CD image', and you can see that we have this disc here. Let's just click on it, and now what we want to do is take this file, autorun.sh, and drag it to the terminal, and let's see whether this works: right at the end, if I press enter, this
doesn't work, and that's because I need to remove the quotes — there we go, the quote at the beginning and also the one at the end — and then press enter... and it looks like it still doesn't work. So let's just click on the disc again, right-click, and choose 'Open in Terminal'. We have a new terminal, so let me close the old one, and we can close this as well. Now, with this in full screen, all I want you to do is type ./autorun.sh and press enter. Now it's asking for the password for amigoscode, and the password was changeme — let me show you: changeme, authenticate. It's installing some kernel modules now; we have to wait until this is complete, and the last step will be to restart our machine. It says 'press Return to close this window' and it seems to be done, so I'm just pressing Return — there we go, finished. Now let's do this together: click on the battery icon, then Power Off, then restart — restart. Now let me restore the window down, because what I want to do is put it full screen. If I maximize now... you saw that the screen went black, and what we have to do is make this smaller, open up VirtualBox, and on this running Ubuntu VM click on Settings and then Display, because we want to change the video memory. Now, this is greyed out, because first we need to right-click on the VM itself and stop it — so stop, then power off. Cool, with it switched off, we can go to Settings and then Display, and now you can see that we can change it: I'm going to put this at 64 MB, somewhere in the middle. Okay, now we can start it again — you can click here, or start it through this button. Just give it a second — let me just close this, we don't need it — and it should start very soon. There we go.
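To recap this whole section, the command-line steps were roughly the following, run inside the Ubuntu guest. Note that the mount-point path under /media is an assumption on my part — its exact name varies with the Guest Additions version, which is why right-clicking the disc and choosing 'Open in Terminal' is the easy route:

```shell
# If sudo complains that your user "is not in the sudoers file":
# become root, add the user to the sudo group, and switch back.
su -                        # prompts for a password
adduser amigoscode sudo     # Debian/Ubuntu: add user to the sudo group
su amigoscode

# Install the compiler toolchain and the kernel headers that the
# Guest Additions installer needs to build its kernel modules.
sudo apt update
sudo apt install -y build-essential linux-headers-$(uname -r)

# With Devices > "Insert Guest Additions CD image..." done, run the
# installer from the mounted disc, then reboot.
cd /media/amigoscode/VBox_GAs_*   # assumed mount point; name is version-dependent
./autorun.sh                      # prompts for the password
sudo reboot
```

After the reboot, features like full-screen resizing, shared clipboard, and drag-and-drop start working.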
If we try to put this full screen — it actually did that for me, but what I want is everything in full screen — you can see that this time it works. There we go; and then if I click on amigoscode, the password was changeme, enter, and we are inside. Cool, so now what we can do is go to View and choose full screen mode, and you can see that now all of this is in full screen — we've done it. Awesome, we successfully installed Ubuntu. If you want to exit full screen, you can go down to View — you have the keyboard shortcuts there as well — and if I press it, you can see that I came out of full screen and I have access to my Windows machine here again. Cool, this is pretty much it. Also, if you get a software updater prompt, just go and install as always, but in my case I'm going to be naughty and say remind me later. This is pretty much it — catch me in the next one.

Cool, in this video what I want to walk you through is how we're going to customize our desktop. So here, let's together just put this full screen — I'm going to switch, there we go — and if you want to customize the look and feel of Ubuntu Desktop, go to Show Applications at the bottom and then click on Settings. Cool, now that we have Settings here, we are able to change a couple of things. You can change network-specific information, Bluetooth, Background — so if you don't like this background, just choose this one, for example; you can see that it changes, but in my case I'm going to stick with the default. Appearance — you can change the color theme here; maybe you like this color — if I click on it, you can see that the icons have changed, have a look — but I'm going to leave the default so everything is consistent throughout. And you can change the icon size if you want as well — I think we have one icon in
here — so if I increment this... yeah, the icon size is basically this one right here on the left, so I’m going to leave that at 64. You can change it according to whatever you prefer; for me it’s more about making sure everything is visible. Notifications — nothing here. Search, Multitasking — you can configure these, but I’m not going to touch them. Applications — there’s no configuration on any of these, so let me just go back. Privacy — same, nothing I’m going to change here. Online Accounts — you can connect your Google account, Microsoft, and whatnot. Sharing — nothing here; you can change the computer name if you want. Sound as well. Power — you can enable power saver or not, and set the screen to blank after however many minutes. Screen Display — in here let’s scale this: say we want 200%, apply, and you can see that things are now so big. I’m going to keep these changes at 200%, and let’s have a look — if I put this full screen, what do I get? Yeah, this looks nice, right? So 200%, and then let me go back to — I think it was Background, or sorry, Appearance — and make the icon size a little bit smaller, just like this. You can leave it like that, but as I said, you can do whatever you want. Okay, so Screen Display again: you could make this 100%, I’m just making things big so you can see sharply. Then Mouse and Touchpad — you can change the speed if you want. Keyboard — mine is English (UK), so let me just add United Kingdom, add, and then delete the other one — remove, there we go. Printers — nothing there. Removable Media — nothing. Device Color Profiles — and obviously I can scroll down and see a bunch more. Language and Region — here you can change the region and the language. Accessibility, Date and Time, and so on and so forth. All right, so also
the same with Users — here we only have one user, and you can change the password in here, if I’m not mistaken. Cool. My password is changeme; I could change it to something better, but I’ll show you how to do all of this through the terminal, which is what we are here to learn — how to use Linux and the terminal. Let me cancel and close this. Let’s just get rid of this from favorites — I’m going to get rid of this as well, LibreOffice as well, Ubuntu Software as well, Help as well; I want to keep it clean. Eject that, and I think this is it. Awesome, this is pretty much it — if you have any questions drop me a message. From now on, if you followed the Mac installation, this is the exact same point, and vice versa; for both Mac and Windows users everything should be the same, because we are using the Ubuntu desktop. This is pretty much it — I’ll see you on the next one.

Okey dokey, so with Linux it’s all about the terminal, and really the reason I installed the desktop is so that you have an operating system to work in, but what we’re going to focus on throughout this course is the terminal. As you’ll see later, when we SSH into a remote server we never have the graphical user interface, so it’s all about the terminal. Cool — so what is a terminal, really? The terminal is also known as the command line interface, or CLI. If I go to Show Applications, we have Terminal — so let me put it back in here, right click, Add to Favorites. Cool, the terminal is now within the favorites, and I can just click on it and open it. So what is this terminal? In here we have amigoscode and then @amigoscode. The terminal is a text-based interface that allows you to interact with a computer’s operating system by entering
commands. In here, let me just type one command and you’ll see how it works: if I type date, for example, and press Enter, this gives me the current date. That was a command — we’ll learn more about commands in a second — an instruction that lets me interact with the operating system. Similarly, if I want to create a folder on my desktop, I type mkdir, then a tilde, so ~/Desktop, then a forward slash, and here I’ll say foo, for example. Press Enter, and have a look — there’s a new folder that was created for us: foo. So the terminal allows us to interact with the computer’s operating system by entering commands. It provides a way to execute commands, navigate the file system, and manage applications without the need for the graphical user interface. To be honest, we don’t even need this UI: usually you would right click and then Move to Trash, for example — that deletes the folder with the graphical user interface — but in reality we don’t need it. Here, if I type rm — we’ll go through these commands in a second — then ~/Desktop/foo... actually, here I need to say rm -r and then foo; you’ll learn this in a second. If I press Enter, you can see that the folder has disappeared. Okay, so this is the terminal: it allows us to interact with the operating system, and it provides a prompt — this is the prompt — where we can enter commands and receive the output. When we run date we get an output; for some commands we don’t get an output, but I’ll show you other things we can do. With this we can perform a wide range of tasks, such as navigating directories, creating and modifying files, running programs, and accessing system resources.
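The mkdir/rm demo above can be sketched like this — a minimal version that uses a scratch directory as a stand-in for ~/Desktop, so it’s safe to run anywhere:

```shell
# Create and remove a folder from the terminal, as in the demo above.
base="$(mktemp -d)"        # scratch area (stand-in for ~/Desktop)
mkdir "$base/foo"          # create the folder
ls "$base"                 # → foo
rm -r "$base/foo"          # -r removes the directory and anything inside it
ls "$base"                 # → (nothing)
rm -r "$base"              # clean up the scratch area
```

Plain rm only removes files; for directories you need the -r (recursive) flag, which is why the demo switches to rm -r.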
The terminal is commonly used by developers and system administrators to perform a bunch of tasks, including software development, server administration, and automation. It’s a very powerful and efficient way to work with a computer’s operating system, and an essential tool for everyone working in programming and development — so knowing how to use the terminal is a must for you. That’s the reason I’ve made this course for you: you should be doing pretty much everything through your terminal. I don’t want to see you creating a folder by right clicking in your graphical user interface, New Folder, typing the folder name, blah blah blah — that’s bad. Okay — oh, you see, I’m actually deleting the folder using the UI; this is wrong, but let me just do it, there we go. By the end of this course you will be so familiar with the terminal that you’ll have all the skills required to use it, and as you’ll see, a bunch of tools such as Git, Docker, and Kubernetes actually have to be used through the CLI, or the terminal. Cool, this is pretty much it — catch me on the next one.

Within my macOS, what I want to show you is that I also have a terminal available. If I search for “terminal”, this right here is the default terminal that comes with macOS. I can type any command and it will be executed against my operating system. If I type ls in here — you’re going to learn about ls later, but just type this command and press Enter — it lists all the contents I have within my home folder. If I type clear — another command — it clears the terminal. Here I can type, for example, pwd — you’ve seen this one — and this is /Users/amigoscode. Similarly, there’s also another terminal available, and this one is not
part of the macOS operating system, but it’s the one I actually use on my machine, and that is iTerm. So this is iTerm — yet another terminal, way fancier than the other one. You can see the look is all black, and it has lots of customizations. For example, if I want to split the screen into two panes, I can just do it like that, and maybe three — you can see I’ve got one, two, three — and in these I can type ls, pwd, and cal, for example, and you can see I’m executing commands in three different shells. I’m going to talk about shells in a second, but basically this terminal is way more powerful than the default that comes with macOS. Let me just close this — I just wanted to show you the different terminals available. For Windows, what you have is the command line, or simply CMD, and it looks like this — you’ve probably seen it if you’re on Windows. Again, this is a terminal: you can run commands and they will be executed against your operating system to perform whatever tasks you tell it to do. That’s pretty much it for this video — catch me on the next one.

Also, what I want to show you is that within text editors and IDEs there will always be an integrated terminal, so you don’t necessarily have to use the terminal that ships with your operating system, and you don’t have to install one either. Here I’ve got VS Code — Visual Studio Code — open, and within it, if I click on Terminal and then New Terminal, you see that I have a terminal. Here I can type the exact same commands you saw — if you have Visual Studio Code or any other text editor, let me just type whoami.
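The handful of commands shown across these terminals behave the same in any of them — a quick sketch you can try in whichever terminal you have open:

```shell
pwd       # print the current working directory
ls        # list the contents of that directory
whoami    # print the name of the logged-in user
```

Same commands, same results, whether you run them in GNOME Terminal, iTerm, or an IDE’s integrated terminal — the terminal is just the window; the shell does the work.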
Don’t worry about these commands — we’ll cover all of them — for now I’m just showing you the other terminals. If I press Enter, you can see that it gives me amigoscode. Also — I think this one is quite cool — within the VS Code terminal I can split the terminal, have a look, the same way you saw with iTerm, which is quite nice, and here you actually have two different shells. This is zsh — we’ll cover shells in a second — but I can delete this one, and this one as well, and they’re gone. Also, one of my favorite IDEs is IntelliJ: IntelliJ has an integrated terminal, and if I open it you can see we have the terminal in here, and I can type the same command again — whoami, press Enter — and it gives the exact same output. Awesome, this is pretty much it about terminals. If you have any questions drop me a message; otherwise, catch me on the next one.

All right, so you know what the terminal is — now let’s focus on understanding exactly what the shell is, because people often use these two words, terminal and shell, as if they’re the same thing. If you do that, that’s fine, but it’s very important to understand the actual difference between a terminal and a shell, and that’s what we’re going to focus on in this section. You’ll also see how we switch from bash to zsh, and the different shells available for the Linux environment. So without further ado, let’s kick off.

In this section let’s focus on understanding what the shell is, and we’ll also change our default shell to a better one. In a nutshell, a shell is a program for interacting with the operating system. You’ve seen that we have the terminal, and the terminal is just an application that allows users to input commands to the shell and receive text-based output from the shell’s operations. The shell is what actually takes the commands themselves and
then executes them against the operating system. Let me give you a quick demo: let me just open the terminal. The terminal is responsible for taking the input — it lets you create multiple tabs, expand, open a new tab — that’s the terminal. But whenever I type a command — if I type, for example, touch, the command you’ve seen before on the slide, so touch ~/Desktop/foobar.txt — and press Enter (don’t worry too much about this command, you’ll learn how it works), I’m passing this command through the terminal. The terminal is responsible for taking the commands and outputting the results of commands executed by the shell, and the shell is responsible for interacting with the operating system. If I press Enter, you can see we have the file in here: foobar.txt. The same goes for rm — basically the same command; if I run rm and press Enter, you can see that the file is gone. Again, don’t worry too much about this, you’ll learn all of these commands later, but this is pretty much the concept of terminals and shells. Now, I said shells, because there’s a bunch of different shells you can use with Linux. You have bash — the Bourne Again SHell — one of the most widely used shells and the default on many Linux distributions. You have zsh, the one we’re going to switch to in a second, which is highly customizable and offers advanced features like autocompletion, spelling correction, and a powerful plugin system. And then you have fish and many others. This is pretty much the gist of shells. Next, let’s go ahead and understand, customize, change, and basically learn what you need to know about shells. Cool.
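One way to see the terminal/shell split concretely: a shell is just a program that interprets command text, so you can hand the same command string to different interpreters explicitly. A small sketch (assuming bash is installed, as it is on stock Ubuntu):

```shell
# The -c flag asks a shell binary to interpret one command string,
# exactly as it would if you had typed it at that shell's prompt.
sh -c 'echo interpreted by sh'
bash -c 'echo interpreted by bash'
# Different shells, same job: read command text, execute it against the OS.
```

This is also what a terminal application does for you behind the scenes: it collects the text you type and hands it to whichever shell it launched.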
So you know that the shell is basically what takes the commands and executes them, and the terminal is just the graphical user interface — you saw iTerm, the macOS Terminal, and the command line for Windows. The shell itself is the command line interpreter; shell and command line interpreter are the exact same thing. What I want to show you here is how to view the available shells you have installed, and also how to change the current shell. So let’s type together — if you have ~/Desktop in your prompt, or something else (I think we ran the cd command before), just type cd so that you have the exact same screen as mine. You’ll have something like the server name plus the user — mine is just amigoscode@amigoscode. Cool. At this point, let’s type cat — you’re going to learn about the cat command later — then /etc/, and remember Tab: I’m going to press Tab, have a look, if I go back to “e” and press Tab I get autocompletion. Now type “sh” and Tab — if I press Tab again, you see we have shadow, shadow-, and shells — so “shells”, like that. Let me just press Ctrl+L and then run this command. All right, cool — these are the available shells we can use; I think these are the defaults that come with Ubuntu. If I take this exact same command — cat /etc/shells — and run it on my macOS, it’s the same command, it just looks a little bit different. Press Enter, and have a look: we have bash, csh, dash, ksh, tcsh, and zsh. I’ll show you how to use that last one later, but if you want to know the current shell that you are
using, here we can just type this command: echo, then a dollar sign, then SHELL in all caps. We’ll go over the echo command and the dollar sign later, but for now this is the command to run, and it will tell you the current shell you are using. In my case I’m using zsh. If we take this exact same command and run it within Ubuntu — echo $SHELL — you can see that the default shell for Ubuntu is bash. Cool, next let’s go ahead and install zsh.

Zsh, also called the Z shell, is an extended version of the Bourne shell with plenty of new features and support for plugins and themes, and since it’s based on the same shell as bash, zsh has many of the same features, so switching over is a no-brainer. You can see here that they say many original features were added. So let’s install zsh together — as you saw, the default shell for macOS is now zsh, the Z shell — and let’s install it in our Ubuntu desktop as well. You saw the list of available shells, and you saw that bash is the default — /bin/bash — and when you run echo $SHELL, /bin/bash is what you get. We want to change this to zsh, because it’s an improvement on top of bash. First, Ctrl+L to clear the screen, and to install zsh we say sudo apt install zsh — we’ll come back to apt, which is basically the package manager that allows us to install software. Okay, let’s press Enter, and we need to enter the password for amigoscode — in your case, your password. I’m going to type it, and you might think I’m not typing anything, but I actually am: the input doesn’t show the password, for security reasons. Press Enter, and you can see it went off and it’s installing, and it’s waiting for us to confirm.
Say Y and wait for a second — you can see the progress, and boom, it’s done. Now, to make sure this was installed correctly, just type zsh --version and press Enter; if you get this output, it means you have installed zsh. If I clear the screen with Ctrl+L, press the up arrow a couple of times, and list the available shells with cat /etc/shells again, you should now see that we have /usr/bin/zsh as well as /bin/zsh — and we’ll cover the difference between bin and usr (or USR) later on, when we discuss the Linux file structure. Cool. At this point we’ve just installed zsh, but what about using it? Let’s continue in the next video.

Okey dokey, now for us to use zsh, all we need to do is type z, s, h on the terminal and press Enter. Now you can see the output is a little different: instead of this colon and amigoscode@amigoscode, we just have amigoscode, which is just the user. At this point nothing else changes, because as I said, zsh is built on top of bash, so all the commands we execute — for example, the ls command we ran before — will work; the output here isn’t colored as before, but I’ll show you what to install later to improve the experience when using zsh. And to be honest, that’s it. Now, if you want to switch back to bash, just type bash, and we’re back in bash. In fact, let’s run zsh once more, and now if I search — I’m going to say dollar and then s... actually, sorry, that won’t even work, because we’re now within a different shell; I was trying to search my history for echo $SHELL. Let’s just type it out and not be lazy: echo, dollar sign, SHELL. I was expecting it to say zsh, but the reason it doesn’t is that zsh currently is
not the default one — which means that if I open a new tab, make this smaller, actually bigger, and type echo $SHELL, you can see that this is bash. Cool, so let me just close this. You’ve just seen that if you want to go back to bash, you just type bash; and if I run cat /etc/shells, we have all the shells — we have dash and sh, so let’s just type sh, for example, and now we’ve switched from bash to sh, boom — yet another shell. If I want to use dash, that’s another shell; rbash — r and then bash — there we go; bash; and zsh. This is pretty much how you switch between shells. But what I really want to do is switch my default shell to zsh, and the way to do it is with this command: chsh, then a dash, then s — and we point it at the shell itself, this one, /usr/bin/zsh. So type /usr — not user, my bad, USR — /bin/zsh, press Enter, and let’s add the password. Cool. Now, if I show you something — if I open a new shell with Ctrl+Shift+T, have a look, this is still bash, and I know because if I type echo $SHELL and make this smaller, press Enter, it still says bash. So let me just come out of this with Ctrl+D, and now let’s reboot — type reboot and press Enter. Now let me log in, Enter, and if I open the terminal, you can see that the first thing we are prompted with is configuring zsh. Let me press Ctrl and minus so you can see everything — there we go — and it says this is the Z Shell configuration function for new users, and you are seeing this message because you have no zsh startup files.
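For reference, chsh simply rewrites the shell field — the seventh colon-separated field — of your /etc/passwd entry, which is why the change only takes effect at the next login. A minimal sketch; the username and paths below are just illustrative:

```shell
# chsh -s /usr/bin/zsh   # the real command: prompts for your password,
#                        # then updates the passwd entry for your user

# What changes is the last field of that entry, e.g.:
entry='amigoscode:x:1000:1000::/home/amigoscode:/usr/bin/zsh'
echo "${entry##*:}"      # strip everything up to the last ':' → /usr/bin/zsh
```

On a live system you could inspect the recorded login shell with `getent passwd "$USER"`.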
it says  you can quit or do nothing exit create in the   file continue to the main menu or populate your  zsh with the configuration recommended so this   is exactly what we’re going to do okay so type  one of the keys in parenthesis so we want two   and there we go so basically this has now created  a file called zshrc and U I’ll show you this in   a second right so from this point onwards we  have successfully installed zsh and now it’s   a default shell so if I clear the screen control  and then Z zero to increase the font and now if I   open a new shell control shift T have a look so  this is no longer bash so here let’s just type   Echo and then dollar sign shell press enter  have a look zsh in our previous one as well   and then type the same command Echo Dash and  then shell and you can see that now now it’s   zsh awesome we have successfully switched  to zsh and we have a better shell from now on cool now let’s switch our default shell  to zsh and the way to do it is by using this   command in here so CH H sh and then Dash and  then s and what we’re going to do is point   to the Shell itself so this one user bin zsh  so say for slash USR not user my bad USR for   slash bin for slash zsh press enter let’s add  the password cool now if I show you something   if I open a new shell so control shift T have  a look this still bash and I know because if   I type Echo so let’s just type Echo and  then dollar sign and then shell and if I   put this smaller press enter you can see  that it still says b bash so let me just   come out of this controll and then D and now  let’s just reboot so re and then boot press enter now let me loog in enter and if I open the terminal you can see that the first thing  that we are prompted with is to configure   zsh so in here let me just press control and  then minus so you see everything in here there   we go and you can see that this is the zshell  configuration function for new users you are   seeing this message because you have 
no zsh  startup files so basically this is the files   that it needs for configuring zsh and it says  you can quit or do nothing exit creating the   file continue to the main menu or populate your  Zs AG with the configuration recommended so this   is exactly what we’re going to do okay so type  one of the keys in parenthesis so we want two   and there we go so basically this has now created  a file called zshrc and um I’ll show you this in   a second right so from this point onwards we  have successfully installed zsh and now it’s   the default shell so if I clear the screen  control and then zero to increase the font   and now if I open a new shell control shift T  have a look so this is no longer bash so here   let’s just type Echo and then dollar sign shell  press enter have a look zsh in our previous one   as well and then type the same command Echo Dash  and then shell and you can see that now now it’s   zsh awesome we have successfully switched  to zsh and we have a better shell from now on the last thing that I want to do in this  section is to unleash the terminal like never   before with oh my zsh which is a delightful  open- Source Community Driven framework for   managing your zsh configuration it comes bundled  with thousands of helpful function helpers plugins   themes and basically you’ve got all the batteries  included and you can see here on the left you can   customize your theme and make it so powerful and  beautiful and uh yeah just a bunch of things that   will make you look like a professional so if I  scroll down you can read more about it in here   and they’ve got many plugins and you can see  that on GitHub so this is where this is hosted   and in fact if I click on this link in here so  let’s just give you a star I think I have done   it before but if not this is the right time  because it’s awesome so here we can see all   the staggers and let’s give it a star in here so  if you don’t have GitHub don’t worry so there we   go one more star 
and let’s click on this repo — or you can click on the Code tab. If I scroll down you can see a description of what it is, how to get started, and the installation process — have a look at this method: sh, so this is a shell, remember you saw sh, and you pass it a command with curl; we’ll look into curl as well. Scrolling down, they talk about how to configure it — .zshrc is where the configuration file lives — and about plugins (git, macOS, and so on); you can install a bunch of plugins, and there are themes too. They have a whole section on themes, so you can choose one — I’ll show you in a second how to configure .zshrc — and it might look like this if you choose, I think, this theme here. You can do a lot with this, and you can even pick a random theme, which is nice. Awesome — so let’s install Oh My Zsh. I can go back to the previous website, where they have an “install oh my zsh” section. Let’s take this command and copy it, go to the Ubuntu desktop — here I’m logged out, so let me just add the password, there we go — and paste the command with Ctrl+Shift+V. Let me make this smaller so you can see what it looks like — whoops, there we go — you can see the entire command on one line. If I press Enter, have a look, it does a few things: it clones oh-my-zsh and runs a script, and ta-da, it’s now installed. “Before you scream Oh My Zsh!” — I actually screamed — “look over the ~/.zshrc file to select plugins, themes, and options.” Also, if you look closely — if I press Ctrl+0 — the prompt has changed: you have this arrow here, and you have a tilde, which basically means you’re in the home folder; we’ll talk about home later. But to be honest, this is
pretty much it. Nice — if I open a new tab, you can see that this is already configured and zsh is installed. Next, let’s look at how to configure zsh.

Cool — you saw that they said “before you scream Oh My Zsh, look over the ~/.zshrc file to select plugins and themes” and other options, so here’s how to do that. I’m going to clear my screen with Ctrl+L and type cd to make sure we’re in the home folder. What cd does, basically: if I say cd Desktop and press Enter, maybe you end up inside a different folder — Desktop — and if you then just type cd, it takes you back to the home folder. Again, we’ll come back to all of these commands. So type cd, clear the screen with Ctrl+L, and type vi .zshrc — you can use Tab here: if I type .zsh and then Tab, have a look, we’ve got .zsh_history, .zshrc, and .zshrc.pre-oh-my-zsh; the one I want is .zshrc. Press Enter, and there we go — this is the configuration for zsh. If we scroll down, you can see that a few things are commented out, and further down you can see plugins — at the moment there’s only one plugin, git — but I’ll leave it up to you to configure this the way you want; you can explore themes, configure aliases, and a bunch of other things. Basically, this is pretty much it. If I go back to the GitHub repository — remember, if I scroll down, they have a section on themes — have a look: “selecting a theme”. Once you find a theme you like, you’ll see an environment variable, all caps, looking like this, and to use a different theme you just change it to agnoster, for example. So let’s try this — and actually, I think the themes are listed somewhere — I think there’s a link, right? So
yes — “in case you did not find a suitable theme, please have a look at the wiki”. If I click on this link, it takes me to the external themes, and have a look — this one looks actually quite good, oh, even this one, wow. You can see — oh, I’m getting excited here — there’s a bunch of themes you can use; just follow the instructions on how to install them. But let’s go back to the terminal, and here’s what we’re going to do: if I scroll all the way up to the top, have a look, the ZSH_THEME is robbyrussell. Now, we need to be really careful and follow along exactly, because this is vi, and we will learn about this text editor. Type j — just type j, making sure the terminal is selected — and you can see the cursor moving down; stop right here. Now type the letter y twice — yy — followed by the letter p. There we go: this duplicates the line for us. Now type the letter i, and you can see it says INSERT, which means we can type in the text editor itself. Use the up arrow, and we’re going to comment this line out with the pound sign (#), then go down — I’m just using the arrows here, but I’ll show you a better way later — delete everything within the double quotes, and type agnoster. Now press the Escape key — you can see INSERT is no longer shown — and then type a colon, then w, then q — write and quit — which lets us come out of the editor. Press Enter, and that’s it. Awesome. Now if we open a new tab, you can see the theme looks slightly different — though it’s actually missing some fonts.
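The vi edit above — duplicate the ZSH_THEME line, comment one copy, change the other — can also be done non-interactively. A sketch using sed on a throwaway file standing in for your real ~/.zshrc:

```shell
# Work on a temp file standing in for ~/.zshrc
rc="$(mktemp)"
printf 'ZSH_THEME="robbyrussell"\n' > "$rc"

# Same effect as the yy/p + edit sequence in vi: keep the old line
# as a comment ('&' in sed means "the matched text"), add the new one.
sed -i 's/^ZSH_THEME=.*/# &/' "$rc"       # comment out the old theme line
printf 'ZSH_THEME="agnoster"\n' >> "$rc"  # set the new theme
cat "$rc"
rm "$rc"                                  # clean up the temp file
```

To apply this for real, you’d run the sed and printf lines against ~/.zshrc instead of the temp file, then open a new shell to pick up the change.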
have to  install but I’m going to leave this up to you in   terms of how you’re going to customize your IDE so  I’m not spending time on this okay so usually my   one is just black so here let me just close this  and let me go back to VI so I’m going to press the   app eror so here crl L and you can see the command  once more enter and what we’re going to do is the   following so here I’m going to press D and then D  so twice and make sure that the cursor is in this   line so DD twice so that is gone so basically that  deletes the line I’m going to press I for insert   and let’s just get rid of that and esape colon WQ  esape colon WQ I’ll leave instructions on how to   work with Vim but I’ll teach you Vim later on so  here press enter and now if I open a new tab you   can see that we have the default theme cool so  here control 0 to have the default font size crl   L and this is pretty much it I’ll leave some  links under the description of this video so   you can go and explore and Adventure yourself on  how to customize your ID e but if I show you my   one quickly on Mac OS it just looks like this so  it’s plain black and there’s no themes whatsoever   so let me just say uh in here VI and then zsh  zshrc so this is the exact same configuration if   I put this full screen in here have a look so the  exact same thing I didn’t change nothing and you   can add plugins and whatnot so I’ll leave this  up to you cool so here Escape W colon and then   Q this time I didn’t change this file and press  enter this is pretty much it catch me on the next video let us now focus on Linux commands because  moving forward we’re going to learn a bunch of   commands which essentially is what Linux is all  about right so it’s about learning a bunch of   commands that allows us to interact with the  operating system so a Linux command is a text   instruction that tells the operating system  what to do these commands can be entered in   the terminal or command Lin or basically CLI and  
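The .zshrc theme edit above can also be done non-interactively, which is handy for scripting. This is just a sketch that works on a throwaway copy of the file, so your real ~/.zshrc is untouched; it assumes GNU sed, and the theme names are the ones used in the video:

```shell
# Work on a throwaway copy so the real ~/.zshrc is untouched.
workdir=$(mktemp -d)
printf 'ZSH_THEME="robbyrussell"\n' > "$workdir/zshrc"

# Duplicate the theme line (like yy + p in Vim), comment out the first
# copy (like i + #), and point the second copy at agnoster.
sed -i 's/^ZSH_THEME="robbyrussell"$/#&\nZSH_THEME="agnoster"/' "$workdir/zshrc"

cat "$workdir/zshrc"
```

After a change like this you would open a new tab, or source the file, for the theme to take effect.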
by now you should be familiar with the terminal. We basically pass those commands, and an instruction is sent to the operating system: maybe you want to create a file, delete a file, check the time, or connect to a remote server. There's a whole set of commands that lets us interact with the underlying operating system. The commands themselves are case sensitive, so for example ls and LS in capitals are two different commands. Linux commands often take options and arguments that modify their behavior, allowing for a wide range of functionality. In this example we have the command, an option, and an argument: ls is the command, then we can pass an option, dash and then a, so -a, and then an argument, in this case dot, which means the current directory. Here are some basic commands: ls for listing files, cd for changing directories, mkdir for creating a new directory, rm for removing files, cp for copying files, and many more. Each of these commands has instructions in the manual, so if you don't know how to use a command, you can look up a guide on how to use it effectively. Let's go ahead and learn about commands.

In here I have the terminal open, and I want to give you a quick introduction to commands. Throughout this course you've already seen some of the commands I've been using, for example ls, which I typed a couple of times in the terminal; you've also seen mkdir, and I think sleep. All of these are commands that let us interact with the underlying operating system. Let me quickly show you the ls command, and then we'll go over the command itself, its options and arguments, and I'll also show you the list of all available commands, as well as aliases and the manual, the man pages.
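The command, option, argument anatomy just described can be tried in a scratch directory; the file names here are made up for the demo:

```shell
# command + option + argument, demonstrated with ls.
demo=$(mktemp -d)           # scratch directory so the output is predictable
touch "$demo/visible.txt"   # a regular file
touch "$demo/.hidden.txt"   # a "hidden" dotfile

ls "$demo"        # command + argument: lists only visible.txt
ls -a "$demo"     # command + option (-a) + argument: also shows .hidden.txt
```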
if I type ls, you can see that this is a command, and you can literally type ls anywhere. But in case you are within a different folder, let's make sure we're in the same folder together: type cd, just cd on its own, this is a command too, and it will take you to the home folder. cd stands for change directory, and changing directory means changing the directory that subsequent commands will run from. So let's type ls; the ls command will now run inside this home folder (and we'll come back to the tilde and the home folder as well). Press Enter, and you can see that we have Desktop, Music, Templates, Documents, Pictures, Videos, Downloads and Public; these are currently folders. Now, if we type pwd, this is another command and we'll come back to it in a second, but it stands for present working directory; press Enter, and it tells me I'm under /home/amigoscode, the folder I'm currently in. So we just ran ls under this folder and got these contents. If I say echo, yet another command, echo takes an argument, and here I want to pass $HOME as the argument: this is actually an environment variable, and we'll cover environment variables later, but echo is the command and this is the argument. In our case, for pwd we executed the command without any arguments or options, and the same with ls and cd. If I press Enter, this gives me the home location, which is basically /home and then the user itself. Cool. Now let me clear the screen with Ctrl+L, and if I type ls you can see that we have Desktop, Music, Templates, blah blah blah. Now, I know for a fact that there's more content inside the home folder, so let's type ls and then dash and then a, so -a; this is an option. If I press Enter, have a look: we have more stuff. If I scroll up, we typed ls -a, and these are all the files: .bash_history, .cache, .config, and then we see Desktop, Documents, Downloads as before, but here we're also including hidden files. What do I mean by hidden? If I open the Files app, this is home, and under home I see Documents, Downloads, Music, Pictures, Public, Templates and Videos; that's all I see. So let me put the file manager on one side and the terminal on the other, make the font a bit smaller, Ctrl+L, and if I type ls without the -a option, what we see is Desktop, Music, Templates and so on, basically everything you see in the file manager. But through the terminal, if I say ls -a, we get a bunch more. In the file manager I can click on the three lines at the top and choose "show hidden files", and now can you see .bash_history, .profile, .viminfo, .zshrc? Remember this file? These are hidden files; by default they don't show up, but we can toggle hidden files on. So that's the -a: -a means show hidden files. Now, before we move on to the next video, one more thing I want to show you: we could say ls and then dot, and dot means the current directory; this is actually optional with the ls command, because, if I type pwd again, present working directory, we are within amigoscode, this is the home folder, the one that you see here.
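To recap the pwd and $HOME ideas above, here's a tiny sequence you can run as any user; the exact listing will differ from machine to machine:

```shell
cd "$HOME"        # cd with no argument goes to the same place
pwd               # prints the present working directory, e.g. /home/amigoscode
echo "$HOME"      # the same path, read from the HOME environment variable
ls -a | head -5   # first few entries, including the . and .. entries
```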
right, so if I type ls and then dot, it's the exact same thing as ls. Ctrl+C there, let me just type ls and then dot, and you can see it's the exact same output. So what we did before was ls, then dash a, then dot: ls is the command itself, -a the option, and dot the argument. If I press Enter, you can see we get the exact same output. Now, about that argument: with ls -a . we are listing the contents of the present working directory, which is home. But say that within Documents, so let's go to Documents and right-click, new folder, I'll call it foo, press Enter, then create another folder called bar. Now within Documents we have foo and bar, two directories. Cool. Press Ctrl+C, and I'll type ls, and you can see Desktop and also Documents. So what we can do is say ls and then an argument: we want to list the contents of the folder called Documents, and make sure the D is capital, the exact same name; I can press Tab to autocomplete. Press Enter and have a look, we see foo and bar, the two folders within Documents. So this is the command and this is the argument. We could also say ls, go back, add dash and then a, space, Enter, and we just see bar and then foo; there are no hidden files within the Documents folder. So this is pretty much what a command is. Obviously there are plenty of other commands, which I'll go through with you in this course, but you should know what a command is and what options and arguments are. Here there's just one argument, but some commands may take multiple arguments. Also, I actually forgot: remember I said that commands are case sensitive? If I type LS in capitals, I get "command not found"; this command is not the same as ls in lowercase. Cool. Now that you know about commands, options and arguments, next let me walk you through the manual pages, or simply the man pages.

In this section, let's focus our attention on the Linux file system, which is the hierarchical structure used to organize and manage files and directories in a Linux operating system. It follows a tree structure with the root directory at the very top, and all other directories organized below it. Here we have /bin, a directory, /etc, another directory, /sbin, /usr, /var, /dev, then /home, /lib, /mnt, /opt, /proc, /root, and so on and so forth. In this section I basically want to give you the overview, so that we can start exploring and using the cd command, pwd, and all that good stuff. So you have the root, /, and then under root you have all the directories. /bin is used for essential user command binaries: bash, cat, cp, date, echo, ls, less, kill; these commands are available before /usr is mounted, because within /usr you can also see a /bin. This is where you find the programs, and binaries are programs. Then you have /etc, which is mainly used to store configuration files for the system; you can see fonts, crontabs, init, profile, shells, timezone and whatnot, so these are mainly configuration files. Then we have /sbin; sbin is similar to bin but only for the superuser. Then we have /usr, or "user", which is read-only application support data and binaries; you can see binaries in here, /usr/include, /usr/lib, so some code and packages, and you can also see local software, which is stored under /usr/local; you also have /usr/share, which is static data shared across all architectures. Then we have /var; this was originally named "variable" because it was used to store data that changes frequently. You've got cache, for application cache data; lib, data modified as programs run; lock, for lock files that track resources in use; log, where log files are stored; opt, variable data for installed packages; and spool, for tasks waiting to be processed, so under spool you've got cron, cups and mail. And tmp is where you store temporary data; once the system reboots, that data is gone. You have /dev for device files; then /home, the home directory, and we'll come back to this in a second; /lib, for libraries and kernel modules; /mnt, for mounting temporary file systems such as USB drives; /opt, for optional software applications; /proc, for process and kernel information files; and /root, which is the home directory for the root user. And that's pretty much it. Obviously I've just skimmed through it, and as you go through your journey of learning Linux, using the commands, navigating your way around, and even building software, you'll start to familiarize yourself with all of these folders I've just talked about. This is actually from the Linux Foundation; I'll leave a link where you can go and read in more detail about what each of these folders and subfolders does, but this is pretty much the Linux file system in a nutshell. Next, let me show you how to navigate around the Linux file system.

All right, so here I am within my terminal, and let's explore the Linux file system together.
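You can spot-check the directories just described on your own machine. Not every distribution ships all of them (for example /opt or /srv may be missing on minimal installs), so this sketch only reports what it finds:

```shell
# Check which of the standard top-level directories exist on this system.
for d in /bin /etc /sbin /usr /var /dev /home /lib /mnt /opt /proc /root /tmp; do
  if [ -d "$d" ]; then
    echo "$d: present"
  else
    echo "$d: not on this system"
  fi
done
```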
I want you to type this command: cd, then a space, then a forward slash. Wherever you are, just type cd /. Now, cd means change directory, and it allows us to change from one location to another. Currently I'm within the home directory, and I want to change directory into /, and / is the root. Press Enter, and now we are within root; if I type pwd, it just gives us /, which means root. Nice. If I type ls, so list the contents within this directory, press Enter: any time you see something that looks like bin, lib, proc, srv, boot, dev, etc, mnt, opt, tmp, usr, this is basically the Linux file system from root. Remember what I showed you: we have root, and then we have bin, etc, sbin, usr, var, dev, home, lib; if I go back, have a look: bin, lib, var, tmp, usr, and so on and so forth, media, mnt, opt, and home as well, have a look, home. So this is pretty much it. Now, let me clear the screen with Ctrl+L, and let's type this command together: sudo, and we'll come back to sudo, and also apt, we'll come back to this in a second, then install and then tree. This is basically a way for us to install extra binaries into our operating system. Press Enter, then add your password; I am typing, trust me, but you can't see it. Press Enter, and this installs the tree binary, and there we go. Now if I clear the screen and type which and then tree, press Enter, you can see that it's under /usr/bin/tree, which means we have this binary available to use. With this tree binary I'm going to pass an option: the option will be dash and then capital L, and I have to pass an argument into this option, plus the directory to list. Press Enter, and what this basically gives me is a nicely formatted ls: we are listing the directories within the root folder, but nicely formatted. You can see we have bin, boot, dev, etc, home, lib, and also these arrows, which I'll come back to in a second, and tmp as well, and var. So this is pretty much the Linux file structure from root, and you can see that currently there are 20 directories and one file; the file is this swap.img. Awesome. Next, let's go ahead and learn how to use the cd command.

All right, in order to navigate around the Linux file system we need to use the cd command. Let me put this full screen and clear the screen, Ctrl+L. We need to use cd, and if I press Tab, just press Tab, this now gives me the list of all available options. If I want to move into the directory called, let's say, tmp, where temporary files are stored, I can just finish typing tmp and press Enter, and you can see that the prompt reflects it; this is all zsh, and it's telling me that I'm within the tmp folder. Now if I press ls, you can see there are some files in here, temporary files, and if I say ls -a you can also see the files starting with dot. If you want more information, use the long listing, and you can see that anything starting with d is a directory and the others are files; I'll come back to what all of this means in a second. That means we could navigate into snap, private tmp or systemd, but I'm not going to do it; this is pretty much it. I'm now inside the tmp folder, so let me just press Ctrl+Z and clear the screen. I'm inside the tmp folder right here, and I want to go back to the root again. How do I do it? Well, I've got a couple of options: I can say cd and then /, the root, or I can go back one folder. If I say cd dot dot, cd .., this goes back one folder, and you can see that from tmp it went back to root. If I type cd tmp again, press Enter, I can then type cd /, which goes directly to the location instead of going back a folder; press Enter, and we get the same thing. What about if you want to switch between these two folders, without saying cd tmp or cd ..? Well, you can just say cd and then dash, cd -, and this flips between the previous and the current folder; it goes back to the previous folder, whatever it is within the file system. If I press Enter I go back to tmp; if I press cd - again, I should go back to root; have a look, I went back to root. Cool. I'll show you more examples of this in a second, and this is pretty much how you navigate around the Linux file system. If I type ls once more, clear the screen, Enter, you should see a bunch of folders; if you want to navigate into a particular folder, you just say cd, let's go into bin for example, cd bin, press Enter, and now I'm within bin. If you want to go, for example, into media, you don't necessarily have to go back a folder first; you can just say cd /media, because media is within the root. Press Enter, and you can see I went into media. If I type cd -, this should take me back to, think for a second, this goes back to the previous location, which was bin; press Enter, and I'm within bin. If I press the up arrow and cd - again, I should go back to media; Enter, and I'm within media. This is pretty much it. Catch me on the next one.

All right, in this section let's focus our attention on working with files, and in the next section I will show you file permissions.
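The cd moves from this section, cd into a directory, cd .., and cd -, can be tried safely in a scratch area; the folder names below are stand-ins, not the real /bin and /media. As a bonus, if the tree binary isn't installed, find can give a rough equivalent of a shallow tree listing:

```shell
base=$(mktemp -d)
mkdir "$base/bin_demo" "$base/media_demo"

cd "$base/bin_demo"     # like cd /bin
cd "$base/media_demo"   # like cd /media: jump straight there, no cd .. needed
cd -                    # flip back to the previous directory (bin_demo)
here=$(pwd)
cd ..                   # and cd .. goes up one level, to the parent

# Rough stand-in for tree when it isn't installed: list one level deep.
find "$base" -maxdepth 1
```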
this section let’s focus our   attention in terms of working with files and  later I’ll show you um also directories and   then we’ll learn about permissions so in Linux  in here so you’ve seen that if I open the so   this folder files so in here remember if I click  on these three lines I can basically show hidden   files right how do we create files manually  with Linux so one we have two options so we   could use the UI and we could open so let’s  just open any of these so zshrc and this will   bring the text editor so here what we could do  is we could create a new document and here I   could say hello and then Amigos and then I can  save this I can give the destination so let’s   let’s just save it into documents I’m going  to say hello.txt and there we go close the   file close this let’s navigate to documents  so CD and then documents so you’ve learned   about the CD command LS and you can see that we  have our file in here right so obviously that   is the wrong way of doing that so hopefully  in this section we’ll go through in terms of   how to work with files creating files deleting  them and moving them to a different directory   I’ll show you how to print them as well and also  how to zip any file cool let’s begin in the next video cool in order for us to create a file with  Linux we have this command called touch so this   command allows us to create a file so here let’s  just say buy so buy. txt now obviously here if I   don’t specify the location and just give the  name so this is the name of the file so this   will be saved under the current working directory  which currently is documents right so home and   then documents so if I press enter now let’s type  LS and you can see that we have the file in here   called by. 
txt so this is how to create files  now obviously this file in here is empty let me   show you so before I actually show you the Linux  command in order for us to print this file if I   open up files in here and then let’s navigate  to documents we have buy. txt let’s open that   up and you can see that it’s absolutely empty so  this is to much how to create an empty file now   obviously it’s not useful because you know most  of the times what we want to do is to create a   file with some content so there are a couple ways  that we can do that and one way is for us to use   the echo command so here I’m going to say Echo  and here we have to pass a string and here I’m   going to say for example by by and then Maria for  example right so this is just a random string now   if I press enter this command just prints back  by Maria now what I can do is I can take this   command and then redirect to the file so here  I can give an existing file name or I can give   it a brand new file so let’s just overwrite the  file that we have which is b.txt so here by. txt   press enter and we get nothing so here we know  that basically if you don’t see anything on the   console and the command just run and executed  you know that it works so let me just clear the   screen crl L LS we have our buy. txt now if we  want to view the contents let’s just again just   open files and here let’s go to documents buy.  txt and sure enough we have buy and then Maria   so this is pretty much how to create a file both  an empty file as well as passing some string or   some output into the file so basically we use use  the echo command in here and then we pass a string   and then we say by. txt or if you want an empty  file you could just say basically so here you can   use the touch command okay so this is touch and  then you can say whatever right so lol and you   don’t even need the extension so here if I say  enter and then LS you can see that we have buy.   txt we have F we have low. 
txt actually these  are folders so I think we did these before we   created these folders before and um yeah so this  is pretty much it so if I open of files again once   more go to documents you can see that we have  three files in here both with extension and uh   without extension cool there’s another way that  we can create files which is so basically let’s   say that you want to type a couple of things  uh before you actually uh submit the content so   here you see that I’m just saying Echo and then  by Maria I’m redirecting the output from this   command into this file but maybe you want to type  a document or a piece of code right so this is not   feasible and I’ll show you later with Vim how to  do it but for now these are the basics of creating files cool in this section let’s learn how to work  with folders or directories so you saw that we   can basically create files we can delete files  through the terminal using commands and so far   I’ve been creating folders by right clicking and  then new folder and also the same with deleting   folders right click and then basically I think  it’s moved to trash in here right so there’s   better ways of doing it and through the terminal  we can use the mkdir so this right here allows   us to create folders or directories so in here  let’s just CD into and then add documents and let   me put my font a little bit smaller just like  that clear the screen now if I want to create   a folder in here I can say mkd I bar and then  hello bar for example so this now is the name   of my folder press enter and you can see that  we have a folder in here called hello and then   bar if I want to delete a folder which is empty I  can say rmd iir so this actually will remove the   folder only if it’s empty right so here if I press  enter you can see that the folder is no longer in   here if you want to create a folder or basically  nested folders you say mkd I Dash and then P so   Dash p in here and then you can say f for slash  in 
here, and then bar, so foo/bar; press Enter. Actually, I think we already had a folder called foo, so it didn't create a new folder, but inside foo you can now see a new folder called bar. Let me go back and run the exact same command, but with, for example, test and then bar: now we have a folder called test, and inside test a folder called bar. If we try to delete it, rmdir test, press Enter, you can see that it failed to remove test, because it's not empty; rmdir only deletes a folder if it's empty. Remember how we delete files: we can use the rm command, rm -r, and I'm going to add f, to force delete and accept the prompt, so r for recursive and f for force, and then the name of the folder, which is test. Now here's the key: if I add a forward slash and a star, so test/*, that pretty much deletes anything under test, and if I open test you can see we have bar. If I press Enter, this is still going to prompt me yes or no, so let's press n for a second; what we want to do is add the trailing forward slash, which will remove every folder and subfolder without prompting. If I press Enter, you can see the folders inside are gone, and we kept the parent folder. Now let's say you want to keep the parent: let's again create folders, mkdir -p test/bar, and also bar2 and bar3, it doesn't really matter; inside test we now have three folders. If you want to delete them all, you say rm -rf, the name of the folder, then slash star; you can also use a pattern here, so let's say you want to delete anything that ends with three, for example, so star three; press Enter, and you can see that only the folder ending with three is gone. The star means pretty much any folder. Also, with bar2, for example, let's create a new folder inside bar2 called foo, so now within bar2 we have foo; I just used -p to create the subfolders. If I rerun the command, it will delete all subfolders, including folders within folders; if I press Enter you can see it's gone, and we can't go back, because the parent, bar2, doesn't exist anymore. So let me just go to Documents and test, and you can see that all the folders are gone. If you want to delete everything including the parent, let's create a new folder again, bar2 with foo inside; the command to delete everything including the parent folder, test, is, actually sorry, rm -rf, and -rf or -fr is the same thing, I've just switched the options, and then the name of the folder, test. Press Enter, and you can see the folder is now gone. And this is pretty much how you create folders, and also delete the contents within your folders.

Linux is a multi-user environment, which allows us to keep each user's files separate from other users'. We can add a new user using sudo and then useradd -m, where -m is a flag that creates the home directory, and then we pass the name. We can also set the password for a particular user using passwd and then the user in question, and if we want to switch between users we use the su command, i.e. substitute user. If you also want to delete a user, you can say userdel, where we can pass some flags, and then the user in question.
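Before moving on to users, the file and folder commands from the last few sections can be recapped in one runnable sequence, again in a scratch directory so nothing real is touched:

```shell
recap=$(mktemp -d)
cd "$recap"

touch bye.txt                          # create an empty file
echo "bye Maria" > bye.txt             # redirect echo's output into it
cat bye.txt                            # prints: bye Maria

mkdir -p test/bar test/bar2 test/bar3  # -p creates nested folders in one go
rmdir test/bar                         # rmdir only removes empty directories
rm -rf test/*3                         # glob: remove entries ending in 3
ls test                                # only bar2 is left
rm -rf test                            # remove the parent and everything in it
```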
but I’ll show you the flags in question as well  so and with this we have two types of user we   have a normal user that can modify their own  files this user cannot make systemwide changes   install software update system files and also  cannot change other users file so You’ seen the   pseudo command I think throughout this course but  I didn’t cover it but I’ll show you in a second   when we try to install for example a package then  we are not allowed to do that unless we use the   pseudo command and then we have the super user  in here I.E root and this user can modify any   file on the system it can make systemwide changes  such as install software and even change files on   the operating system so in this section let’s  understand basically how all of this works and   then we also touch on files and permissions  which is very important and this is actually   very important that you must understand how it  works because it’s key towards your journey in   terms of becoming a Linux administrator  if you want to follow that path but for   a regular software engineer still crucial for  you to know how this works because it’s key to Linux cool you’ve seen that if I was to type this  command and we’ll learn about package managers   later but here let’s say that we want to install  a piece of software in our machine so basically a   binary so here I think you’ve seen this APK or  I think no it’s apt and then install and here   let’s just say tree so here just say tree and we  did this before but let’s just understand exactly   what we had to do to install this software  if I press enter in here it says could not   open lock file permission denied have a look  permission denied and then it says enable to   acquire the front end log blah blah blah are  you roote well so in order for us to execute   this command successfully we need to execute  this as root I.E with the pseudo command so   the way we do it is we can type in here pseudo  so PSE sudo this is the command 
that we need to   use and then here I’m going to say exclamation  mark exclamation mark twice and then press Tab   and basically this just adds in the previous  command that I had in my terminal and now if   I press enter now it’s actually asking me for the  password you’ve seen this before so here I’m going   to add the password so it looks like I’m not  typing but trust me it’s hiding the password   for security reasons so just type your password  and then press enter and in fact if I basically   have a wrong password press enter it will say that  the password was incorrect so now I need to type   the password again so here I’m going to type the  correct password press enter and you can see that   basically it tries to install but this is already  installed so this dependency is already installed   as we did before similarly if we try so in here  if I try to navigate to CD and then the root so   for slash LS so in here we have couple of folders  in here but one is root so let’s just try and say   LS and then root for example press enter you  can see that it says LS cannot open directory   root permission denied so if we want to execute  this command on this particular folder we need to   execute the exact same command as root so here  we can say pseudo LS and then root or if I add   exclamation mark exclamation mark twice tab it  just brings in the previous command so this is   a nice trick that I use all the time so if I press  enter now you can see that now we are able to list   the contents inside of the root directory so for  you this might say snap or it might say something   different or maybe nothing right but you can see  that with this command we have super powers now   obviously this here you have to be really careful  how you use this command so remember I said never   do this so sud sudo rm-rf in here and then root  right because if you do this basically you are   getting rid of your entire operating system and  you don’t want to do this so obviously 
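The root-only commands from this and the previous section are summarized below as comments, since running them verbatim would change your system; jamal is just the example user from the course, and the -r flag on userdel (remove the home directory too) is my addition, not something shown in the video. The last two lines are a harmless check any user can run:

```shell
# Root-only commands -- shown for reference, not executed:
#   sudo useradd -m jamal    # create a user; -m also creates the home dir
#   sudo passwd jamal        # set (or reset) that user's password
#   su jamal                 # substitute user: switch to jamal
#   sudo userdel -r jamal    # delete the user; -r removes the home dir too

# Harmless check: the superuser always exists, with user ID 0.
grep '^root:' /etc/passwd
id -u root    # prints 0
```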
you need to be careful who you choose to give those superpowers to, and I'll show you this in a second. But this is the main idea: in a nutshell, `sudo` executes a given command with elevated privileges, and those privileges are required to perform certain tasks that admins need to do. Let's explore `sudo` even further in the next video.

If you remember correctly, when we type `ls -al`, the output contains a bunch of information followed by the actual file or directory name itself. In this section we're going to focus on understanding files and permissions, and I'll also explain the output of the `ls` command. If you run `ls --help` and scroll up, you'll see that `-l` means "use a long listing format", which is why you see all of that information, and `-a` means "do not ignore entries starting with .", which covers the hidden dot files. So the `-a` is for the dot files and the `-l` is for the long listing. Cool, let's go ahead and start learning about Linux files and permissions.

I have an awesome diagram here that explains the output of `ls -l`, and more specifically the permissions, which are the core of this section. But let me start from the other end and explain the rest of the `ls -l` output first. You can see the name of the file or folder listed within the directory: in our case `food.txt`, as well as `dir`, which is the name of a folder; this could be literally anything. Then we have the date and time, more specifically the last-modified date and time: when you create a file this is the creation time, and when you change or modify the file it updates to reflect that. Then we have the size; note that for a directory the number shown (typically 4,096 bytes) is the size of the directory entry itself, not the total of whatever is inside the folder. In the last section you learned about groups, and this is the group name the file belongs to, in this case `amigoscode` for both entries. Then we have the owner, which is the user, in my case `amigoscode`; earlier you saw that we created a user called Jamal, and that would be reflected here too. Then we have the number of hard links, 2 and 1, and we'll come back to hard links later. Finally we have the permissions. The very first character of this sequence is always the file type, and then comes the set of permissions. For the file type, `d` stands for directory and `-` stands for a regular file. The permissions are divided into three sections: after the file type you see groups of `r`, `w`, and `x` characters, where `r` is read, `w` is write, and `x` is execute. The first three characters belong to the user, meaning the user can perform that set of actions on the file. The next three are the group permissions, and the last three are for everyone else, the "others": if you're neither the owner nor in the group, these are the permissions that apply to you. Here's an example. For this file called
f.txt, you can see the file type is `-`, so it's a regular file, followed by `rw-`: the `amigoscode` user can read and write but not execute (we'll come back to execute in a second, once we create a Bash script; for folders, execute works a little differently). The next three characters are `rw-`: when there's a dash, that permission is absent, so the `amigoscode` group can read and write. And the last three are `r--`, meaning everyone else can only read. We'll go into detail on the actual permissions in a second, but this is the general overview of Linux file permissions, and specifically the output you get when you run `ls -l`. As I said, in this section we want to focus on the permissions themselves.

We're done with Linux, and that's no mean feat. You've learned some key commands and you're already on your way to becoming an expert. But how do we group those commands together? Well, that's where shell scripting comes in. Is shell scripting like a programming language such as Python or Golang? Exactly: you'll learn about conditions and loops, you'll learn about functions and how to do effective error handling, and that's not all. We have challenging exercises waiting for you to put your skills to the test. I'm looking forward to it. So am I.

Scripting is where Linux comes in really handy, so let's dive into shell scripting, a game changer for anyone who wants to automate tasks in Linux. First things first: what is Bash? Bash stands for Bourne Again Shell, a bit of a fun name, isn't it? It's essentially a command-line interpreter, which in simple terms means it's your gateway to communicating with your computer using text-based commands. With Bash you boost efficiency, skyrocket productivity, and can effortlessly streamline tasks that might otherwise be tedious. Think of Bash as a way to create a
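Before we get deep into scripting, the permission string just described can also be pulled apart programmatically. This is my own sketch (not from the course) using Bash substring slicing on the `f.txt` example:

```shell
# Slice an `ls -l` permission field into its four parts.
perm='-rw-rw-r--'        # example field for a file like f.txt
filetype=${perm:0:1}     # '-' = regular file, 'd' = directory
userperm=${perm:1:3}     # owner (user) permissions
groupperm=${perm:4:3}    # group permissions
otherperm=${perm:7:3}    # everyone else
echo "type=$filetype user=$userperm group=$groupperm other=$otherperm"
```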
sequence of commands, automating tasks and performing complex operations without breaking a sweat. Now, how do you write these magical scripts? All you need to start is an editor. It could be as simple as Vim, which we've covered in this course, or a feature-rich editor like Visual Studio Code; your choice. Once you've penned down your script, save it with the extension `.sh`, which tells your system: hey, this is a Bash script.

Now let's explore some fundamental elements of Bash scripting. As we talk, remember that true understanding comes from practical application, which we'll delve into shortly. First up: variables. Think of them as containers that store and manipulate data for you. Typically referenced with something like `$variable_name`, variables can hold a variety of data, be it strings, numbers, or even arrays. Moving on to conditionals: they are your script's decision-makers, allowing it to make choices based on specific conditions. Depending on whether something is true or false, different blocks of code will run, making your scripts dynamic and responsive. Next up: loops. Loops let you repeat instructions over and over as needed. With `for` loops and `while` loops you can iterate over lists, sift through directories, or continue a task until a specific condition is met. Last but not least: functions. Imagine grouping a set of commands into one block that you can call multiple times throughout your script. Functions are the essence of code reusability, modularity, and organization, which is a very key component of script writing. And there you have it: an introduction to shell scripting with Bash. While this is just the tip of the iceberg, armed with these fundamentals you're well on your way to mastering the art of scripting Linux. So without further ado, let's get started.

Let's write a simple script to get a taste of Bash scripting. In this example we'll create a simple script to greet the user. First, let's create our script. We
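The four building blocks just listed (variables, conditionals, loops, and functions) can be seen together in one tiny script. This is my own recap sketch, not an example from the course:

```shell
#!/usr/bin/env bash
greeting="hello"                  # variable

greet() {                         # function: a reusable block of commands
  echo "$greeting, $1"
}

for name in alice bob; do         # loop: iterate over a list
  if [ "$name" = "alice" ]; then  # conditional: branch on a test
    greet "$name (first)"
  else
    greet "$name"
  fi
done
```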
can use the `touch` command followed by the name of our script; let's call it `my_first_script.sh`, making sure to use the `.sh` extension, which indicates that this is a Bash script. Now let's open our file with `vim my_first_script.sh`. The first line of every script is a shebang line; don't worry too much about that line at the moment, we'll cover it in a future video. Next we `echo "hello world"` in our script, then press Escape and save the file with `:wq!`. Now, because scripts are executables, we also have to `chmod` our script: `chmod` with the symbolic notation `+x`, followed by the script name. Then we can run our script using the `./` prefix, `./my_first_script.sh`, and there you have it: hello world. This is just a basic example, but Bash scripting allows you to do so much more. You can manipulate files, process data, automate backups, and perform complex operations, all through the power of scripting. To become proficient in Bash scripting it's essential to understand the fundamental concepts that form the building blocks of scripting; we'll briefly cover those concepts next. That's all for this video, and I'll see you in the next one.

In this video we'll delve into an important concept called the shebang. The shebang, also known as a hashbang or interpreter directive, plays a crucial role in the execution of Bash scripts. Let's first create a file; we can just `touch greet.sh`, for example, and press Enter. We briefly touched on the shebang line in the previous video: it's the first line you find in any Bash script, `#!/bin/bash`. This line is a shebang, and it serves as a directive to the operating system on how to interpret the script. In this case we're asking the system to interpret the script using the Bash binary, so the path after the exclamation mark is essentially pointing to the specific interpreter or shell that
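The touch / vim / chmod / run workflow just described can be reproduced non-interactively, with `printf` standing in for the typing done in Vim. Paths here are illustrative:

```shell
# Create, mark executable, and run a first script without an editor.
workdir=$(mktemp -d)                  # throwaway directory
script="$workdir/my_first_script.sh"

printf '#!/bin/bash\necho "hello world"\n' > "$script"
chmod +x "$script"                    # same as the symbolic +x in the video
"$script"                             # prints: hello world
```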
should handle the script. The shebang line provides flexibility by allowing you to specify different interpreters for different types of scripts. For example, if you're writing a Python script you can use a shebang line that instead points at `/usr/bin/python`, and you can decide whether you want Python 3 or Python 2, to ensure the script is executed using the Python interpreter. Similarly, for scripts written in Ruby you'd just change this binary to Ruby, and so on.

Now let's see the impact of the shebang in action. Suppose in this Bash script we want to print hello world, a greeting message. First we write our shebang, `#!/bin/bash`, then `echo "hello world"`, then press Escape and type `:wq` to save. Remember, Bash scripts are executables, which means we need to give the file the executable permission. To do this we use the `chmod` command with the symbolic notation `+x`, followed by the name of the file, `greet.sh`, and press Enter. Now this file is an executable. We can check that this is the case by running `ls` with the long-format option, and as you can see our `greet.sh` file now has executable permission, which is the `x` here. Okay, let's clear our screen. To run it we use the `./` prefix followed by the script name and press Enter, and as you can see it prints hello world.

Now, that's only one way to run this Bash script. We can also use the `sh` command to run it, `sh greet.sh`, which gives you the same thing, and we can also run `bash greet.sh` and press Enter. These are for when you don't specify the interpreter within the script: if we remove the `#!/bin/bash` shebang line from the script, we can still use these two commands to interpret it as Bash. Great. Now, the shebang is not limited to just the Bash shell; you can use different interpreters depending on
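The three invocation styles mentioned here (direct execution via the shebang, `sh`, and `bash`) can be compared side by side. A sketch using a throwaway copy of the greeting script:

```shell
dir=$(mktemp -d)
printf '#!/bin/bash\necho "hello world"\n' > "$dir/greet.sh"
chmod +x "$dir/greet.sh"

"$dir/greet.sh"         # 1. direct: the kernel reads the shebang line
sh   "$dir/greet.sh"    # 2. explicit sh interpreter (shebang is just a comment)
bash "$dir/greet.sh"    # 3. explicit bash interpreter (shebang is just a comment)
```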
your needs, and by specifying the correct interpreter in the shebang you ensure that your scripts are executed consistently, regardless of which shell or environment they're running in. So, a quick summary: the shebang line starts with a hash followed by an exclamation mark; it specifies the interpreter or shell that should handle the script; it enables consistent execution of scripts across different environments, regardless of which shell you're using (even though we're using the zsh shell here, it was still able to interpret the `greet.sh` file as a Bash script); you can specify different interpreters for different types of scripts; and the shebang line should be placed as the first line of the script, without any leading spaces or characters before it. And that's it. Thanks for watching, and I'll see you in the next one.

In a previous video we learned how to run scripts using `./` and `bash`. Let's recall our simple script `greet.sh`: if I do `cat greet.sh`, we can see that this script prints hello world. Now, we can run it from its current directory using `./greet.sh`, but what if we want to run it from anywhere, without specifying its path? Well, the trick is to place our script in one of the directories that's in our PATH environment variable. PATH is an environment variable that tells the shell which directories to search for executable files in response to commands. Let's clear our screen, and if I do an `echo $PATH`, we can see that there are several directories separated by colons. Any executable file placed in one of these directories can be run from anywhere in the terminal. Now, a common directory to place user scripts is `/usr/local/bin`, that's this directory over here, so let's move our `greet.sh` file into this directory and give it executable permission. For this we are going to use `sudo`, because it requires super-user permission to move scripts into this directory. So we run
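The `sudo mv` steps that follow need super-user rights; the same PATH mechanism can be tried safely with a temporary directory instead of `/usr/local/bin`. A sketch (the directory name is generated, not a real system path):

```shell
bindir=$(mktemp -d)                  # stand-in for /usr/local/bin
printf '#!/bin/bash\necho "hello world"\n' > "$bindir/greet"
chmod +x "$bindir/greet"

PATH="$bindir:$PATH"    # prepend our directory to the search path
greet                   # found via PATH lookup: no ./ prefix needed
```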
`sudo`, then `mv`, then our script `greet.sh`, and we're going to move it to `/usr/local/bin`. We're also going to change the name to just `greet`, so it becomes easy to run later on. Now press Enter; it will ask for your password, so enter it, and that's moved `greet.sh` into the `/usr/local/bin` directory. Remember, you also have to `chmod` it, since this is a script: once again, `sudo chmod +x /usr/local/bin/greet`, and press Enter. The reason we changed the name to `greet` is simplicity; this is how we will call it. Let's clear our screen. Now if I run the command `greet`, just like this, it gives me hello world, without using `sh` and without using the current-directory prefix. And if I change directory and call `greet`, it will still work: if I change directory to, say, Desktop, I can also run it from there as `greet`. So you can see we were able to run our script using just its name, without needing to specify any path or use `./`, `sh`, or `bash`. To recap: by adding our script to one of the directories in the PATH environment variable, we can conveniently run it from anywhere in our terminal. This can be incredibly useful as you build up a library of custom scripts. Just be cautious, though, and ensure the scripts you add to global paths are safe and intended for global use. That's all for this video. Happy scripting, and I'll see you in the next one.

In this video we'll explore the concept of comments and how they can enhance the clarity and understandability of your scripts. Comments are lines in a script that are not executed as part of the code; instead, they serve as informative text for us, the people reading the script. Adding comments to your scripts is considered a best practice because it helps you and others understand the purpose, functionality, and logic of the script. So let's take a look at how comments are written
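Both comment styles about to be demonstrated fit in one runnable snippet. My own sketch; note that the multi-line form is really the `:` no-op builtin being handed a quoted string it ignores:

```shell
demo_comments() {
  # single-line comment: everything after the hash is ignored
  : '
  multi-line comment trick: the colon builtin does nothing, and the
  quoted block is just its unused argument, so none of this runs
  '
  echo "only this line produces output"
}
demo_comments
```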
in a Bash script. In Bash there are two types of comments: single-line comments and multi-line comments. First, let's go into our `greet.sh` file: `vim greet.sh`, then press Enter. We know what this script does; it prints hello world to the console when we run it. First, press `i` to insert. To write a single-line comment, simply start the line with the hash symbol; anything following the hash on that line will be treated as a comment. So: `# print greeting to the console`. For multi-line comments, you can enclose the comment text between a colon followed by single quotes, with your comment lines inside the quotation marks; anything enclosed between them is considered a comment. This is a multi-line comment, and we can just get rid of this extra line as well. Press Escape. Great. Now if I exit, save the file, and rerun `greet.sh`, you'll notice that it still only prints hello world: even though our `greet.sh` file contains those extra lines, they are taken as comments and are therefore not executed.

Now let's see the practical benefits of comments in action. Consider a Bash script that renames all `.txt` files in a directory to have a `.bak` extension. We open it with `vi extensions.sh` and press Enter, and here we have a for loop. Without comments, the script may appear cryptic, especially to someone unfamiliar with the purpose and inner workings of this for loop, and especially to someone new to Bash scripting who doesn't really know how to write for loops. In our case it's very important to write comments here to improve the understandability. So we start with our hash and add a comment: what we're doing in this for loop is renaming all `.txt` files to `.bak`; we're changing the extension of the file. Then we can use a multi-line comment to add more detail about what the script is doing. To do this we start with a colon and a single quotation mark, and we close it with another single quotation mark, so anything enclosed between these two quotation marks is considered a comment. We can write the explanation: we're looping through all `.txt` files in the current directory; we're using the move command (`mv`) to rename each `.txt` file to `.bak`; and finally, this part of the code is the syntax that removes the `.txt` extension and appends `.bak`. Okay, let's save the script with `:wq` and exit, `cat` the file, and zoom out a little bit. Notice that by adding comments throughout the script we provide explanations and context about what the script is actually doing, making it easier for others, and ourselves, to understand the script's intention.

Comments not only help with script comprehension; they also enable you to temporarily disable or exclude specific sections of code without deleting them. Let's say we didn't want the script to run these three lines: we can prevent those lines from running by turning them into comments. So let's go back into our file, press `i`, and add a hash in front of the first line so it turns into a comment, and the same for the remaining two lines. Now you can see this script essentially won't run anything, because we've turned all our commands into comments. We exit and save. If we try running the script with `./extensions.sh` (oops, wrong file the first time), we get permission denied because it's not executable, so let's make it executable with `chmod +x` and rerun. As you can see, nothing happens, because all our commands have now been commented out. Okay, and that's all you need to know about comments and how useful they are within our scripts. By adding comments to your scripts you improve readability, you foster collaboration within your team, and you ensure the script's purpose remains evident throughout its life cycle. Other people can read what the script is doing, so if changes need to be made later, you know where those changes need to happen. Okay, thanks for watching, and I'll see you in the next one.

In this video we'll delve into the world of variables. Variables are an essential component of Bash scripting, as they allow you to store and manipulate data; they make your scripts dynamic and flexible. In Bash, variables are like containers that hold data such as text strings, numbers, or even arrays. They provide a way to store values that can be accessed and modified throughout the script. So let's look at how variables are created and used in a Bash script. To create a variable, you simply assign a value to it using the assignment operator. First, let's create a file and call it `v.sh`. In this file, let's begin with our shebang, `#!/bin/bash`, and then assign a variable called `greeting`; we can assign it the string "Hello World". To access the value of this variable, you prepend the variable name with a dollar sign, so it becomes `$greeting`. Say we want to use the `echo` command to output the value stored in `greeting`: we just write `echo $greeting`, then press Escape and save the file. Make sure you `chmod` your script to give it executable permission, then run it with the `./` prefix, and there you have it: Hello World.

Now, variables in Bash are not constrained to a specific data type; they can hold different types of data, such as strings, numbers, and arrays. Let's create a variable and assign it a number. Reopen `v.sh`, move to the next line, and assign a `count` variable to the number 42, for example. As you can see, I'm not enclosing the number 42 in quotes, because I want the script to interpret it as a number and not a string. Then we can echo this variable using the same format, `echo $count`. Exit, save the file, and run `./v.sh`; it prints the number 42 as well as our Hello World. Variables can also hold an array. Let's create another variable called `fruits` and assign it the values apple, banana, and orange. We do that using parentheses: the first element is `apple`, the second `banana`, and the third `orange`, then you close the parentheses, and there you have it, the `fruits` variable assigned to this array.

Now, you can also use variables within strings to create dynamic output. This is known as variable interpolation; let's see how we can do that. Assign a variable `name`, say "Armid", for example. We can now echo, and within our string we can use variable
interpolation to say hello to this `name` variable. Press Escape and `:wq` to save, run `./v.sh`, and there you have it: hello Armid. It has taken the `name` variable and inserted it within our string; essentially we're doing variable interpolation, where the value stored in `name` is inserted into the string using the `$name` syntax. Great, let's summarize what we've learned: variables are created using the assignment operator `=`; to access the value of a variable, we prepend the name of the variable with a dollar sign; variables can hold different types of data, such as strings, numbers, and arrays; and variable interpolation allows you to use variables within strings to create dynamic output. That's all for variables. I'll see you in the next one.

In this video we'll dive into the topic of passing parameters to Bash scripts. By allowing inputs from the command line, you can make your scripts more versatile and interactive. Bash scripts can receive input values, known as parameters or arguments, from the command line when they are executed. These parameters allow you to customize the behavior of your script and make it more flexible. Let's look at how to pass parameters to a Bash script. When running a script, you provide the parameters after the script name, separated by spaces. For example, say we had a `script.sh` file; we'd pass parameters like this: `./script.sh parameter1 parameter2`, and so on. In this example we're executing a script called `script.sh` and passing two parameters, `parameter1` and `parameter2`. Inside the Bash script you can access these parameters using the special variables `$1`, `$2`, `$3`, and so on. Let's look at an example. Create the `script.sh` file and start with our shebang, `#!/bin/bash`. Let's say we want to echo three parameters: first,
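The whole variables video condenses into a few lines. My own recap sketch; the name "Armid" is simply the example name used in the recording:

```shell
#!/usr/bin/env bash
greeting="Hello World"          # string
count=42                        # number (Bash still stores it as text)
fruits=(apple banana orange)    # array
name="Armid"

echo "$greeting"
echo "$count"
echo "first fruit: ${fruits[0]}"
echo "hello $name"              # variable interpolation
```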
parameter one, where we use the special variable `$1`, which grabs the value of the first parameter passed into the script when we run it. Say we have this line two more times: copy the line and paste it, so that for parameter two we have `$2`, and for parameter three, `$3`. In this snippet we're using the `echo` command to display the values of these three parameters. Press Escape and save the file. Now when I call `./script.sh` I can pass in parameters: say the first parameter is "hello" and the second is "hi", then press Enter. As you can see, because we've only passed in two parameters, it only prints the first two, hello and hi; one is taken as `$1` and the other as `$2`. If I pass in a third parameter, call it "hey", and press Enter, now we have a third parameter and it is printed on the third line. Excellent. So when executing the script with parameters, the values passed on the command line are substituted into the script's parameter variables `$1`, `$2`, and `$3`.

Now, what if we wanted to access all the parameters passed into a script? We can do this using another special variable. Let's go into `script.sh`, add another line, and echo "all parameters" followed by a dollar sign and the at symbol, in quotation marks: `"$@"`. Save the script, and now when we run it we get "all parameters" followed by everything we passed into the script; in other words, the `echo` command on that line outputs all the parameters passed to the script. Great, let's summarize what we've learned: parameters are provided after the script name when executing a script; inside the script, parameters can be accessed using `$1`, `$2`, `$3`, and so on, based on their position; and the special variable `$@` can be used to access all the
parameters passed to the script. So by allowing inputs through parameters, you can make your scripts more interactive and versatile. Great, that's all for this video, and I'll see you in the next one.

Phew, well done for reaching the end of this course! But your journey doesn't stop here. Whether you're taking the DevOps path or the software engineering path, well, it's only the beginning, and we have courses to help you on this journey. It was a pleasure teaching you, and we'll see you in the next one. Assalamualaikum
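To recap the final scripting video, the positional-parameter example condenses into one runnable sketch; `set --` is used here to simulate calling the script as `./script.sh hello hi hey`:

```shell
#!/usr/bin/env bash
set -- hello hi hey             # simulate command-line arguments

echo "first parameter: $1"
echo "second parameter: $2"
echo "third parameter: $3"
echo "all parameters: $@"
```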

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Full Stack Learning Management Application Development

    Full Stack Learning Management Application Development

    The text details the creation of a full-stack learning management application using Next.js, Node.js, and AWS. It covers the development process step-by-step, including front-end UI construction with ShadCN components and Tailwind CSS, back-end API design with database interaction, authentication via Clerk, and payment integration with Stripe. The tutorial extensively explains the use of Redux Toolkit Query for efficient data fetching and management. Finally, it addresses features like course creation, editing, and user progress tracking.

    Learning Management Application Study Guide

    Quiz

    1. What is the primary function of Node.js in the context of this application? Node.js is a server-side JavaScript runtime that allows JavaScript code to be executed on the server. In this application, it enables the creation of a backend that can handle requests and data management.
    2. Explain the purpose of npx create-next-app in the project setup. npx create-next-app is used to create a new Next.js application with a default configuration. This provides a quick start for building the frontend of the application.
    3. What are two essential VS Code extensions recommended in the source, and what are their purposes? The two essential extensions are ES7+ React/Redux/React-Native snippets, which helps create React components, and Prettier, which formats code automatically upon saving, ensuring consistent formatting.
    4. Describe the role of Clerk in the application. Clerk is an authentication service that is used to handle user sign-up, sign-in, and profile management within the learning management system. It simplifies the process of managing user accounts.
    5. What is Tailwind CSS, and how is it used in the project? Tailwind CSS is a utility-first CSS framework. In this application, it is used to style components by applying predefined classes, which are imported in a global CSS file, avoiding the need to write custom CSS from scratch.
    6. Why is DynamoDB chosen as the database for this application, and what type of database is it? DynamoDB is chosen for its scalability, performance, and suitability for applications with fewer tables and relationships. It is a NoSQL database and allows you to store data, in this application, such as courses, transactions, and user progress.
    7. What is the significance of the “non-dashboard” layout in the application? The “non-dashboard” layout is used for pages that do not require user authentication or are not part of a user dashboard. This includes the landing page, course search, and authentication pages.
    8. Explain the difference between the course object and the user course progress. The course object stores core course information such as the title, description, and teacher ID. The user course progress tracks how far a single user has advanced through a specific course, including what they have completed. It is kept as a separate object so that a course record does not balloon into one massive object when many users enroll.
    9. What is the purpose of Shadcn UI libraries in this application? Shadcn UI libraries provide pre-built, accessible, and customizable React components. In this project, they are used to quickly build UI elements such as buttons, forms, and dropdowns with consistent styling.
    10. What is a payment intent in the context of Stripe, and how does it relate to the backend? A payment intent is a Stripe object that represents a customer’s intent to pay. The backend of the application creates a payment intent, and then the frontend uses this to process payments.

    Essay Questions

    1. Discuss the architectural choices made in the development of this full-stack learning management system, considering the trade-offs between different technologies and approaches. How do these decisions contribute to the scalability and maintainability of the application?
    2. Analyze the data modeling approach used in this application. Why were separate data structures used for course information and user course progress, and how do these choices impact the performance and complexity of the system?
    3. Evaluate the use of serverless functions (AWS Lambda) in this project. What are the benefits and challenges of using this technology, and how does it align with the overall goals of the learning management application?
    4. Explain the role of third-party services, such as Clerk and Stripe, in this learning management application. How do these services simplify development, and what are the potential drawbacks of relying on external providers?
    5. Describe the process of deploying and managing this application on AWS and Vercel. What steps were taken to ensure the security, performance, and reliability of the deployed system?

    Glossary of Key Terms

    AWS (Amazon Web Services): A cloud computing platform providing various services, including storage, computing power, and databases.

    API Gateway: An AWS service that acts as a front door for applications to access backend services. It helps in securing and managing APIs.

    CORS (Cross-Origin Resource Sharing): A browser security mechanism that restricts web applications from making requests to a domain different from the one that served the web application.

    Clerk: A third-party service for managing user authentication and authorization in web applications.

    CloudFront: AWS’s content delivery network (CDN) service. It stores content and delivers it from edge locations closer to the user, improving loading times.

    Container: A lightweight, standalone executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries, and settings. Docker is an example of container technology.

    Docker: A platform for developing, shipping, and running applications in containers.

    DynamoDB: A fully managed, serverless NoSQL database offered by AWS. It is designed for scalability and high performance.

    ECR (Elastic Container Registry): A managed Docker container registry that allows developers to store and retrieve Docker container images.

    Framer Motion: A library used for adding animations to React components.

    IAM (Identity and Access Management): A service provided by AWS that helps in managing access to resources. It is used to create roles and manage permissions.

    Lambda: A serverless compute service by AWS that allows running code without provisioning or managing servers.

    Middleware: Software that provides services and capabilities that can be applied across different parts of an application.

    Multer: Middleware for handling multipart/form-data, which is primarily used for uploading files.

    Next.js: A React framework for building full-stack web applications. It provides features such as server-side rendering and routing.

    Node.js: A JavaScript runtime that allows JavaScript code to run on the server.

    NoSQL: A type of database that is not based on the traditional relational model (SQL). It is suitable for handling unstructured or semi-structured data.

    npm: The package manager for Node.js. It is used for installing and managing packages needed in a project.

    npx: A tool that executes npm packages.

    Redux Toolkit Query: A data fetching and caching solution built on top of Redux.

    Shadcn UI: A library of pre-built and customizable UI components for React applications.

    SQL: Structured Query Language is a language for managing and querying data in relational databases.

    S3 (Simple Storage Service): A scalable object storage service by AWS.

    Serverless: A cloud computing execution model where the cloud provider manages the infrastructure, and developers only focus on writing and deploying code.

    Stripe: A third-party service that provides payment processing infrastructure for applications.

    Tailwind CSS: A utility-first CSS framework that provides low-level utility classes to style HTML elements.

    TypeScript: A strongly typed superset of JavaScript that adds static typing to the language.

    Vercel: A platform for deploying and hosting frontend web applications, with a focus on performance and ease of use.

    VPC (Virtual Private Cloud): A virtual network in AWS that allows users to launch AWS resources in a logically isolated network environment.

    Full-Stack LMS Application Development

    Briefing Document: Full Stack Learning Management Application

    Overall Theme: This document details the step-by-step construction of a complete, production-ready Learning Management Application (LMS). The application utilizes a Next.js frontend, a Node.js backend, and AWS for deployment, aiming to be a comprehensive portfolio piece. It stresses beginner accessibility by explaining concepts in simple terms, and the creator encourages the audience to use parts or the whole project in their own portfolios.

    Key Ideas and Facts:

    • Application Overview: The LMS features a landing page with animations, course listings, user signup/login (Clerk authentication), user profiles, billing, and course creation capabilities.
    • The application is intended to be production-ready with its own set of use cases.
    • “This application is probably one of the most extensive applications I’ve ever built on YouTube and I might even use this for my own purposes of creating a course.”
    • Frontend Technologies (Next.js): Utilizes Next.js for the frontend framework, building components with React.
    • Leverages Tailwind CSS for styling, with many classes pre-defined for rapid development, instead of spending time styling.
    • Imports Google Fonts (DM Sans) for styling.
    • Uses Shadcn for pre-built UI components and theming, enhancing the overall development process.
    • Implements framer-motion for animations, primarily for transitions and loading states.
    • Redux Toolkit Query is used for efficient data fetching and state management.
    • “We’re just going to be using those classes for our styles; they are already using Tailwind classes. We’re going to be using those, and we’re just going to apply those classes directly onto those components.”
    • Backend Technologies (Node.js): Employs Node.js as the server-side JavaScript runtime.
    • Uses Express.js for creating the API endpoints.
    • DynamoDB (NoSQL) is selected as the database with data being persisted locally during development.
    • Dynamoose provides schema validation and database interaction for DynamoDB.
    • “DynamoDB excels more so than MongoDB if you have less tables; ideally you have less tables.”
    • The app utilizes AWS SDK for direct database access.
    • Includes an environment variable configuration system with .env files.
    • Utilizes Multer to handle file uploads on the backend.
    • Database Design (DynamoDB): Data is modeled using three core schemas: course, transaction, and user course progress.
    • A key point is the separation of user course progress from the main course object to manage large amounts of data efficiently and prevent single object bloat.
    • “We do not store the progress of the user [on the course object] because, as the user watches a video, they are progressing through, and we need to save information on how far they have gotten in that course, whether they completed a certain section or a chapter; that is where the user course progress is aligned.”
    • The project uses DynamoDB local for development, with persistence using dbPath config.
    • DynamoDB was chosen as it is well suited to the project’s data needs, as it has relatively few tables and is highly scalable.
    • “DynamoDB is much more fast and performant and scales, but it is worse than MongoDB and DocumentDB at complex filtering and sorting.”
    • Authentication (Clerk): Clerk is used for handling user authentication.
    • Middleware routes are created to protect specific pages (user/teacher dashboards) based on user roles.
    • Uses Clerk Provider to wrap the application for state and data management, and middleware to determine which routes require authentication.
    • User roles are stored in Clerk metadata as either “student” or “teacher”.
    • The project creates a separate “auth” folder for auth related pages.
    • Payment System (Stripe): Integrates with Stripe for handling payments.
    • The backend creates Stripe payment intents that connect directly to the front end.
    • The project integrates a Stripe Provider to wrap around payment content pages.
    • The project uses react-stripe-js to handle stripe functionality on the frontend.
    • UI Components & Styling: Emphasizes the usage of existing styles (Tailwind) and pre-built components (Shadcn) to avoid spending too much time on styling.
    • Utilizes the Sonner library for creating toast notifications for user feedback.
    • Custom UI components are built to reuse common functionalities.
    • Loading screens and animations enhance the UI and user experience.
    • Course Creation and Management:
    • Allows users with teacher roles to create courses by populating a course form.
    • Chapters and Sections can be added to courses via modals.
    • The application supports editing and deleting of courses, sections and chapters, and the ability for teachers to edit and upload videos, though this is implemented later in the series with S3 bucket.
    • State Management (Redux Toolkit): Uses Redux Toolkit and RTK Query for handling client state and making API requests.
    • Custom base query is configured to show toast notifications from the API response.
    • Redux is used to manage the application’s state, like whether modals are open.
    • The project uses Redux toolkit query to handle API requests.
    • AWS Deployment: The application is deployed to AWS using Lambda, API Gateway, and S3.
    • Docker is used to containerize the backend application and deploy it to ECR.
    • IAM roles are configured to grant necessary permissions for Lambda to access other AWS services.
    • CloudFront is used for CDN to make loading video fast for users.
    • Vercel is used to host the front end application because the creator faced issues using AWS Amplify.
    • Setting up a basic budget in AWS Billing is recommended so that developers are not charged unexpectedly.
    • “The Lambda is going to take some time to load; it’s not that much, but there is a little bit of delay if you don’t have consistent users constantly using your application.”
    • Code Organization and Setup: The project is broken into several major directories, including client, server, and source.
    • The server has different folders for utils, seed, models, routes and controllers.
    • The client has different folders for components, app, lib, state and types.
    • The project uses typescript for both the client and the server and has necessary types installed for various libraries.
    • The project uses a custom base query to have consistent error handling across different API requests.
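    The custom base query pattern mentioned above can be sketched, dependency-free, as a wrapper that unwraps API responses and surfaces their messages (the real app does this inside Redux Toolkit Query and shows the messages as toasts; all names here are illustrative, not the course's actual code):

```typescript
// Sketch of the "custom base query" pattern: wrap a base query function so
// that every API response is unwrapped consistently and success/error
// messages are surfaced (in the real app, via toast notifications).
type ApiResult<T> = { data?: { message?: string; data: T }; error?: { status: number } };

async function customBaseQuery<T>(
  baseQuery: () => Promise<ApiResult<T>>,
  notify: (msg: string) => void,
): Promise<{ data?: T; error?: { status: number } }> {
  const result = await baseQuery();
  if (result.error) {
    notify(`Request failed with status ${result.error.status}`);
    return { error: result.error };
  }
  if (result.data?.message) notify(result.data.message); // e.g. "Course created"
  return { data: result.data?.data };
}
```

    Because every endpoint goes through this one wrapper, success toasts and error handling stay consistent without per-endpoint code.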

    Important Quotes (reiterated for emphasis):

    • “This application is probably one of the most extensive applications I’ve ever built on YouTube and I might even use this for my own purposes of creating a course.” (Emphasis on scope and usability)
    • “We’re just going to be using those classes for our styles; they are already using Tailwind classes. We’re going to be using those, and we’re just going to apply those classes directly onto those components.” (Emphasis on rapid development with pre-defined styling.)
    • “DynamoDB excels more so than MongoDB if you have less tables; ideally you have less tables.” (Reasoning behind database choice.)
    • “We do not store the progress of the user [on the course object] because, as the user watches a video, they are progressing through, and we need to save information on how far they have gotten in that course, whether they completed a certain section or a chapter; that is where the user course progress is aligned.” (Emphasis on data modeling best practice)
    • “DynamoDB is much more fast and performant and scales, but it is worse than MongoDB and DocumentDB at complex filtering and sorting.” (Discussion about pros and cons of the chosen database)
    • “The Lambda is going to take some time to load; it’s not that much, but there is a little bit of delay if you don’t have consistent users constantly using your application.” (Emphasis on cold starts)

    Conclusion: This source provides a thorough walkthrough of building a modern web application from start to finish. It covers a broad range of technologies and best practices, making it a valuable resource for developers interested in full-stack development, cloud deployment, and understanding the interplay between various web components and services. The emphasis on production readiness and beginner accessibility makes it suitable for developers of all skill levels.

    Full-Stack LMS Application Development

    Frequently Asked Questions: Full-Stack Learning Management Application

    • What is the purpose of this application being developed? This project aims to create a comprehensive, production-ready Learning Management System (LMS) with a Next.js frontend, Node.js backend, and deployment on AWS. It’s designed to be a full-stack application that could be used for course creation and delivery. The application provides features for user authentication, course browsing, user settings management, and billing. The creator of this project also mentions that it can be used as a reference or portfolio project for other developers.
    • What technologies and tools are used in this project? The project utilizes several key technologies:
    • Frontend: Next.js for the user interface and React components, along with styling using Tailwind CSS and Shadcn UI components. Additional libraries like Framer Motion are used for animations and React Player is used for video playback.
    • Backend: Node.js and Express.js for the server-side logic, with the AWS SDK for interacting with AWS services like DynamoDB. Data validation is done with Dynamoose, and unique IDs are created using uuid.
    • Authentication: Clerk is used to manage user authentication and authorization including sign-up, sign-in, profile management, and session handling.
    • Database: DynamoDB (local for development and cloud-based on AWS for production) is chosen as the NoSQL database.
    • Cloud: AWS is used for hosting the application, including ECR for storing Docker images, Lambda for the backend server, S3 for storage, and CloudFront for content delivery. Vercel is used for hosting the front-end application. Other tools include npx, Prettier, Visual Studio Code, Pesticide, and Redux DevTools.
    • How is the user authentication handled in this application? User authentication is managed by Clerk, a third-party service that provides a comprehensive authentication platform. Clerk handles user registration, email verification, sign-in, and profile management. It also manages sessions and provides components for easy integration with the frontend. The application stores user types (student or teacher) in Clerk metadata and uses middleware to protect routes that require authentication.
    • Why was DynamoDB chosen for the database? What are its advantages and disadvantages in this context? DynamoDB, an AWS NoSQL database, was chosen for its scalability, performance, and cost-effectiveness. Its advantages include:
    • Scalability and performance: DynamoDB is well-suited for handling large amounts of data and high-traffic applications with fast reads and writes.
    • Cost-effectiveness: It provides a generous free tier for developers and is generally cost-effective when scaled.
    • No complex relationships: This project’s schema is simple with only 3 tables, making DynamoDB a viable option. However, DynamoDB has disadvantages:
    • Not ideal for relationships: It is not ideal for complex relational data structures, hence not best practice to store nested data.
    • Filtering and sorting: It is not as strong at complex filtering and sorting of data compared to other NoSQL databases like MongoDB.
    • Data Nesting: DynamoDB isn’t well suited for nested data and can lead to dense data structures if not handled properly.
    • How is the data structured in this application (data modeling)? The data is structured with three main entities:
    • Course: Stores all details of a course, including teacher ID, title, description, price, and category.
    • Transaction: Contains details for each transaction or payment made including information about payment providers.
    • User Course Progress: Stores a user’s progress in a specific course, including completed sections and overall progress. This is separated from the main course object to avoid a large and dense data structure. This design decision prevents excessive data in the main course object when there are multiple users associated with the same course.
    • How is the backend API built and how can you test it? The backend API is built using Node.js, Express.js, and the AWS SDK. It is structured with controllers containing the logic, models for the data schemas, and routes to connect the endpoints. The setup imports the necessary modules, then configures middleware such as express.json(), helmet(), and morgan() to parse requests, add security headers, and log traffic. The routes are then set up to handle the different endpoints.
    • Testing the backend API can be done through tools like curl (directly in the terminal) or through a UI tool like Postman for making API calls and inspecting responses. Locally, the server is run through npm run dev, while building for production runs with npm run build.
    • How are payments integrated into the application? Payments are integrated using Stripe. The application utilizes the Stripe JavaScript library (stripe.js) along with @stripe/react-stripe-js. This setup is used to create payment intents on the backend, and to process payments through the client side. React context is used to manage payment information. The checkout flow involves steps to get course information, handle payment details and the creation of a client secret key, and finally the rendering of payment information before completion. This is all done with a Stripe provider.
    • How is the application deployed and how is a serverless function used? The application is deployed using several AWS services and Vercel. Here’s how it works:
    • Frontend Deployment: The frontend is deployed using Vercel, a platform that simplifies the deployment of front-end applications.
    • Backend Deployment: The backend is packaged into a Docker container and deployed on AWS Lambda. The Docker image is stored in AWS ECR (Elastic Container Registry) and is used by the Lambda function. Lambda provides a serverless compute service that runs the application code.
    • API Gateway: An API Gateway is used as a front-end for the Lambda function, providing a secure endpoint for the frontend to interact with the backend logic and routes.
    • Serverless Logic: The server uses the serverless-http library for compatibility with the serverless environment. The Lambda function has permissions granted using IAM roles that are assigned for different AWS services, allowing access to DynamoDB and S3.
    • S3 and CloudFront: AWS S3 is used to store static assets or files. AWS CloudFront is set up as a CDN (Content Delivery Network) to distribute the content to users for faster loading times.
    • The serverless function is exposed by exporting a Lambda handler, which runs the database seed function when the invocation requests it and otherwise delegates to the serverless-wrapped Express app.
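    The seed-versus-serve dispatch described in the last point might look like the following sketch, with the seeder and the serverless-http-wrapped Express app injected as functions so the logic runs without AWS (the names and event shape are illustrative, not the course's actual code):

```typescript
// Sketch of the Lambda entry point: an invocation carrying action: "seed"
// runs the database seeder; anything else is handed to the serverless-http
// wrapper around the Express app.
type LambdaEvent = { action?: string } & Record<string, unknown>;

function makeHandler(
  seed: () => Promise<string>,
  serveApp: (event: LambdaEvent) => Promise<string>,
) {
  return async (event: LambdaEvent): Promise<string> => {
    if (event.action === "seed") {
      return seed(); // one-off invocation to populate DynamoDB
    }
    return serveApp(event); // normal API traffic
  };
}
```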

    Full-Stack Learning Management Application Architecture

    The sources describe a full-stack learning management application built with a Next.js frontend, Node.js backend, and deployed on AWS. Here’s a breakdown of the key components and technologies used:

    Application Overview

    • The application includes a landing page with animations, a course catalog, user authentication, user profiles, billing information, and course progress tracking.
    • It is designed to be a production-ready application with a focus on scalability and customizability.
    • The application is also responsive, adapting to different screen sizes.

    Frontend Technologies

    • Next.js: Used as the primary framework for building the user interface.
    • Redux Toolkit: Manages the application state.
    • Redux Toolkit Query: Handles API interactions with the backend.
    • Tailwind CSS: Provides styling along with the Shadcn component library.
    • TypeScript: Used for type checking.
    • Framer Motion: Implements animations.
    • React Hook Form: Handles form management and Zod for form validation.
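    As a dependency-free sketch of what the React Hook Form + Zod pairing enforces, a hand-rolled validator for an assumed course-creation form might look like this (the field names are assumptions, not the course's exact schema):

```typescript
// Dependency-free stand-in for a Zod schema such as
//   z.object({ title: z.string().min(1), price: z.number().nonnegative(), category: z.string().min(1) })
type CourseForm = { title: string; price: number; category: string };

function validateCourseForm(input: CourseForm): string[] {
  const errors: string[] = [];
  if (input.title.trim().length === 0) errors.push("title is required");
  if (!Number.isFinite(input.price) || input.price < 0) errors.push("price must be a non-negative number");
  if (input.category.trim().length === 0) errors.push("category is required");
  return errors; // an empty array means the form is valid
}
```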

    Backend Technologies

    • Node.js and Express: Used to create the backend API, separate from the Next.js frontend, to enhance scalability.
    • Docker: The backend is containerized using Docker for consistent environment packaging.
    • AWS Lambda: Hosts the backend, using the Docker image from ECR.
    • API Gateway: Securely routes requests to Lambda functions.
    • DynamoDB: Serves as the NoSQL database.
    • S3: Handles storage of video content.
    • CloudFront: Used as a content delivery network for videos to ensure low latency and high availability.

    Authentication

    • Clerk: A third-party service is used for user authentication, offering pre-built components for sign-in, sign-up, and user management. It is used instead of AWS Cognito due to its easier setup.

    Deployment

    • AWS: The application utilizes a serverless containerized architecture on AWS.
    • Vercel: Hosts the Next.js frontend, integrating with other AWS services.

    Key Features

    • Course Management: Users can browse courses, view course details, and track their progress. Teachers can create and edit courses.
    • User Authentication and Management: The application uses Clerk for user authentication, profiles, and roles.
    • Billing: The application uses Stripe for payment processing.
    • Responsive Design: The application is designed to adapt to different screen sizes.

    Development Process

    • The development process includes setting up Node.js, NPX, and Visual Studio Code.
    • The project utilizes various libraries and extensions for React development and code formatting.
    • The application also uses a custom base query to format API responses, handling data and error messages.
    • The application is deployed on AWS using services such as Lambda, API Gateway, DynamoDB, S3, and CloudFront.
    • The deployment also includes setting up IAM roles to manage permissions for Lambda to access other AWS services.

    Data Modeling

    • The application uses a NoSQL database, DynamoDB, due to the nature of the data and relationships.
    • The data model includes courses, sections, chapters, user progress, and transactions.
    • User progress is stored separately from course data to prevent overly large data objects.
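    The separation of course data from per-user progress can be sketched with illustrative types (field names are assumptions, not the course's exact schema):

```typescript
// Course holds the shared content; UserCourseProgress holds one user's state,
// keyed by user + course, so the course record stays the same size no matter
// how many users enroll.
interface Course {
  courseId: string;
  teacherId: string;
  title: string;
  price: number;
  sections: { sectionId: string; chapterIds: string[] }[];
}

interface UserCourseProgress {
  userId: string;
  courseId: string;
  completedChapterIds: string[];
}

// Overall progress is derived on demand, never stored on the Course object.
function overallProgress(course: Course, progress: UserCourseProgress): number {
  const totalChapters = course.sections.reduce((n, s) => n + s.chapterIds.length, 0);
  return totalChapters === 0 ? 0 : progress.completedChapterIds.length / totalChapters;
}
```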

    Additional Points

    • The application emphasizes learning backend and deployment skills, not just frontend.
    • The use of a custom base query in Redux Toolkit Query provides a way to format API responses and display toast notifications for successful mutations.
    • The application also utilizes custom form fields for managing user settings.
    • The application uses Shaden UI components for styling.

    This detailed overview should give you a solid understanding of this full-stack learning management application.

    Full-Stack Learning Management Application

    The sources detail a full-stack learning management application with a variety of features and technologies. Here’s a breakdown of the key aspects:

    Core Functionality

    • Course Catalog: The application allows users to browse courses, view course details, and enroll in courses.
    • User Authentication: Clerk is used for user authentication, offering features such as sign-in, sign-up, profile management, and user roles. User roles determine access to different parts of the application.
    • Course Creation: Teachers can create and edit courses, including course titles, descriptions, categories, and prices. Courses can be organized into sections and chapters, with video content for each chapter.
    • Billing: Stripe is used for handling payments and transactions.
    • Course Progress: The application tracks user progress through courses, including marking chapters and sections as complete.

    Key Features

    • User Roles: There are distinct roles for users and teachers, each with specific access and functionalities. Teachers can create and manage courses, while users can enroll and track their progress.
    • Responsive Design: The application is designed to be responsive, adapting to different screen sizes.
    • Scalability: The application is built with a focus on scalability, using a separate backend to avoid tight coupling.
    • Data Modeling: The application uses a NoSQL database, DynamoDB, due to the nature of the data and relationships. The data model includes courses, sections, chapters, user progress, and transactions.

    Technology Stack

    • Frontend: Next.js is the primary framework for building the user interface.
    • Redux Toolkit: Used for state management and API interactions.
    • Tailwind CSS and Shadcn: Used for styling and component library.
    • TypeScript: Used for type checking.
    • Framer Motion: Implements animations.
    • React Hook Form and Zod: Handles form management and validation.
    • Backend: Node.js and Express are used to create the backend API.
    • Docker: Used to containerize the backend.
    • AWS Lambda: Hosts the backend using the Docker container.
    • API Gateway: Securely routes requests to Lambda functions.
    • DynamoDB: Serves as the NoSQL database.
    • S3: Handles the storage of video content.
    • CloudFront: A content delivery network (CDN) used to deliver videos with low latency.
    • Authentication: Clerk, a third-party service for user authentication.
    • Deployment: The application uses a serverless containerized architecture on AWS.
    • Vercel: Hosts the Next.js frontend.

    Development Highlights

    • Custom Base Query: The application uses a custom base query in Redux Toolkit Query to format API responses, handle data, and display toast notifications for successful mutations.
    • Form Management: Custom form fields are used for managing user settings, and react hook forms for form management and validation.
    • Backend Security: The backend API endpoints are secured using Clerk middleware, which requires authentication for certain routes.
    • Video Upload: Videos are uploaded to S3 using pre-signed URLs.
    • IAM Roles: IAM roles are created for Lambda to access AWS services such as DynamoDB, S3, and API Gateway.
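    The pre-signed-URL upload flow can be sketched as two requests, with `fetch` injected so the flow runs without real AWS credentials; the endpoint path and response shape here are assumptions, not the course's actual routes:

```typescript
// Sketch of the pre-signed-URL flow: the client asks the backend for a
// one-time S3 URL, then PUTs the video bytes straight to S3, so the file
// never passes through API Gateway (avoiding its payload limit).
type Fetch = (url: string, init?: { method?: string; body?: unknown }) => Promise<{ ok: boolean; json?: () => Promise<any> }>;

async function uploadVideo(fileName: string, bytes: Uint8Array, fetchFn: Fetch): Promise<boolean> {
  // 1. Backend generates the pre-signed URL (server-side, via the S3 SDK).
  const res = await fetchFn(`/api/upload-url?fileName=${encodeURIComponent(fileName)}`);
  const { uploadUrl } = await res.json!();
  // 2. Client uploads directly to S3 with a plain HTTP PUT.
  const put = await fetchFn(uploadUrl, { method: "PUT", body: bytes });
  return put.ok;
}
```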

    Additional Information

    • The application prioritizes backend and deployment skills alongside frontend development.
    • The choice of DynamoDB is due to the data structure, scalability, and performance requirements.
    • User progress is stored separately from course data to prevent overly large data objects and improve performance.

    In summary, this learning management system is a complex full-stack application with a variety of features, utilizing a modern tech stack and cloud infrastructure. It demonstrates a strong emphasis on scalability, customization, and user experience.

    Serverless Learning Management System on AWS

    The sources describe the deployment of a full-stack learning management application on AWS using a serverless, containerized architecture. The application leverages multiple AWS services to achieve scalability, performance, and cost-effectiveness. Here’s a detailed breakdown of the AWS deployment process:

    Core AWS Services

    • ECR (Elastic Container Registry): Docker images of the backend are stored in ECR.
    • Lambda: The backend is hosted using AWS Lambda, running the Docker image stored in ECR. Lambda is configured with a five-minute timeout and environment variables for production use.
    • API Gateway: Serves as a secure entry point for the application, routing requests to the Lambda function. It provides HTTPS endpoints without managing TLS certificates. A proxy resource is used in API Gateway to handle all requests, which are then routed to the Express server in the Lambda function.
    • DynamoDB: A NoSQL database used to store application data. The data model includes courses, sections, chapters, user progress, and transactions.
    • S3 (Simple Storage Service): Handles storage for video content.
    • CloudFront: A content delivery network (CDN) that delivers video content from S3 with low latency and high availability.

    Deployment Steps

    • Dockerization: The Node.js and Express backend is packaged into a Docker container. The Dockerfile includes instructions for building, installing dependencies, and setting up the production environment.
    • ECR Setup: A repository is created in ECR to store the Docker image. The Docker image is then pushed to the ECR repository using the AWS CLI.
    • Lambda Configuration: A Lambda function is created using the Docker image from ECR. The Lambda function is given an IAM role with the necessary permissions to access other AWS services.
    • IAM Roles: IAM (Identity and Access Management) roles are created to manage permissions for AWS services. The Lambda function is granted access to DynamoDB, S3, and API Gateway through a custom role. The IAM role includes a trust policy that allows both Lambda and API Gateway to assume the role.
    • API Gateway Setup: API Gateway is configured to route requests to the Lambda function. A proxy resource is used to forward all requests to the Lambda backend, allowing the Express server to handle routing.
    • S3 Configuration: S3 is set up with blocked public access, using a bucket policy to allow CloudFront read access. CORS (Cross-Origin Resource Sharing) settings are configured to allow different services to access S3.
    • CloudFront Configuration: CloudFront is set up to deliver video content from S3. It uses an origin access control setting, which requires a policy to be set on the S3 bucket. CloudFront is configured to redirect HTTP to HTTPS and allow various HTTP methods.
    • Environment Variables: Lambda environment variables are configured for production, including AWS region, S3 bucket name, CloudFront domain, Stripe keys, and Clerk keys.
    • Seeding the Database: A seed function is included in the Lambda code, triggered by an action parameter, allowing the database to be seeded directly from Lambda.
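    A hedged sketch of how the Lambda code might read the environment variables listed above; the variable names are illustrative guesses, not the course's exact keys:

```typescript
// Illustrative config loader: fail fast on a missing key rather than hitting
// a cryptic runtime error deep inside an AWS call.
function loadConfig(env: Record<string, string | undefined>) {
  const required = ["AWS_REGION", "S3_BUCKET_NAME", "CLOUDFRONT_DOMAIN"];
  for (const key of required) {
    if (!env[key]) throw new Error(`Missing environment variable: ${key}`);
  }
  return {
    region: env.AWS_REGION as string,
    bucket: env.S3_BUCKET_NAME as string,
    cloudfrontDomain: env.CLOUDFRONT_DOMAIN as string,
  };
}

// Inside the Lambda this would be called as loadConfig(process.env).
```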

    Key Deployment Concepts

    • Serverless Architecture: The application uses serverless services like Lambda and DynamoDB, which reduces operational overhead and allows for automatic scaling.
    • Containerization: The backend is containerized using Docker, ensuring a consistent and portable environment for the application.
    • Pre-signed URLs: S3 pre-signed URLs are used to allow the client to upload videos directly to S3, bypassing the 10MB limit on API Gateway.
    • Cold Starts: Lambda functions may experience cold starts, where the first request after a period of inactivity can take longer to process.

    Additional Points

    • The deployment process prioritizes cost-effectiveness by utilizing free tier services on AWS, and budgets are created to monitor usage and prevent unexpected charges.
    • The application is deployed using a combination of the AWS Management Console, AWS CLI, and third-party services, like Vercel for the frontend.
    • The deployment emphasizes understanding the security and access requirements for each service, especially when dealing with data and video content.
    • The application’s architecture on AWS uses managed services to minimize the need for complex networking configurations.

    In summary, the application’s AWS deployment is a comprehensive process involving multiple services working together to create a scalable, secure, and performant learning management system. The deployment utilizes best practices for security, cost management, and efficiency, while leveraging serverless technology and containerization.

    Next.js Learning Management System Frontend

    The sources provide a detailed look at the Next.js frontend of a learning management application, highlighting its features, technologies, and development practices. Here’s a comprehensive overview:

    Core Functionality

    • User Interface: Next.js is the primary framework for building the application’s user interface. It handles routing, page rendering, and overall structure.
    • Dynamic Pages: Next.js is used to create dynamic pages for course listings, search, user profiles, and course editing.
    • Component-Based Architecture: The frontend uses a component-based architecture, making it easier to manage, reuse, and update the user interface.
    • Server-Side Rendering (SSR): Although the application primarily uses client-side data fetching with Redux Toolkit Query, Next.js provides SSR capabilities, which can improve performance and SEO where they are used.

    Key Technologies & Libraries

    • Redux Toolkit: Used for managing application state and handling API interactions. It includes features like Redux Toolkit Query for fetching data.
    • Tailwind CSS and Shadcn: Used for styling and UI components. Tailwind provides utility classes for styling, and Shadcn provides a library of pre-built, customizable components. The application uses a customized Tailwind configuration with its own color palette.
    • TypeScript: Used for static typing, making the code more robust and easier to maintain.
    • Framer Motion: Used for adding animations and transitions to the user interface.
    • React Hook Form and Zod: Used for handling form management and validation.
    • Clerk: Handles user authentication, including sign-in, sign-up, and user profile management. It integrates well with Next.js.
    • Stripe: Used for payment processing.

    Development Practices

    • Custom Hooks: The application uses custom React hooks to encapsulate and reuse logic, for example, useCarousel for image carousels and useCheckoutNavigation for managing navigation steps within the checkout flow.
    • Component Libraries: The use of Shadcn component library allows for consistent UI elements across the application, and components can be installed individually as needed.
    • Code Organization: The project is structured with clear separation of concerns, including components, utilities, styles, and state management. The src directory contains components, lib, state, types, and app directories. The app directory is for Next.js pages and routes.
    • Styling: The application emphasizes functionality and logic over extensive custom styling, using pre-defined Tailwind classes to quickly style components.
    • API Integration: Redux Toolkit Query is used to make API calls to the backend. The application uses a custom base query to handle responses and add authentication tokens to each request.
    • Environment Variables: Environment variables are used to manage configuration settings, API keys, and URLs.
    • Client-Side Data Fetching: The application fetches data on the client-side using Redux Toolkit Query. Although Next.js provides server-side rendering capabilities, this application primarily uses client-side data fetching.
    • State Management: Redux Toolkit is used for state management, providing a central store for application data.
    • Form Management: React Hook Form is used with Zod for form validation and management. The application also makes use of a custom form field for creating forms faster.
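
    The “custom base query” bullet above can be illustrated with a dependency-free sketch: a wrapper that attaches a bearer token to every request and unwraps a `{ data }` response envelope before handing the result to the UI. The token getter and the envelope shape are assumptions for illustration, not the app’s actual code.

```typescript
// Sketch of a custom base query in the spirit of Redux Toolkit Query:
// inject an auth token into every request, then unwrap a `{ data: ... }`
// envelope from the response. Both conventions are assumptions here.
type BaseQuery = (url: string, headers: Record<string, string>) => Promise<unknown>;

function withAuthAndUnwrap(
  baseQuery: BaseQuery,
  getToken: () => Promise<string | null>,
): (url: string) => Promise<unknown> {
  return async (url: string) => {
    const headers: Record<string, string> = {};
    const token = await getToken(); // e.g. a Clerk session token
    if (token) headers["Authorization"] = `Bearer ${token}`;
    const result = (await baseQuery(url, headers)) as { data?: unknown } | null;
    // Unwrap the `{ data }` envelope if the backend returned one.
    if (result !== null && typeof result === "object" && "data" in result) {
      return result.data;
    }
    return result;
  };
}
```

    Centralizing the token and envelope handling in one wrapper is what lets every endpoint definition stay a one-liner.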

    Key Features in the Frontend

    • Landing Page: Includes a loading screen, animated elements, and a course search feature. It features a carousel of course images.
    • Search Page: Displays a list of available courses with filtering options, along with a detailed view of selected courses.
    • User Profile and Settings: Includes settings for notifications, email alerts, SMS alerts, and notification frequency, which are stored in Clerk’s user data.
    • Checkout Process: The checkout process is a multi-step wizard, including details, payment, and completion pages.
    • Course Editor: Provides a WYSIWYG editor for creating and modifying course content, structured into sections and chapters. It supports uploading video content.
    • Billing Page: Allows users to view their transaction history.
    • Navigation: The application has a navigation bar and a sidebar, which adapts to different screen sizes and contexts. The sidebar provides links to various parts of the application.
    • Loading Indicators: A shared loading component is used in various parts of the application.
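
    As an illustration of the checkout wizard mentioned above, the navigation logic behind a hook like useCheckoutNavigation reduces to a few pure functions over a step index. The step names here are assumptions based on the description, not the app’s actual identifiers.

```typescript
// Pure sketch of the step logic behind a multi-step checkout wizard:
// three steps (details, payment, completion) with movement clamped to
// the valid range so out-of-bounds URLs can't break navigation.
const CHECKOUT_STEPS = ["details", "payment", "completion"] as const;

function clampStep(step: number): number {
  return Math.min(Math.max(step, 1), CHECKOUT_STEPS.length);
}

function nextStep(current: number): number {
  return clampStep(current + 1);
}

function previousStep(current: number): number {
  return clampStep(current - 1);
}

function stepName(step: number): string {
  return CHECKOUT_STEPS[clampStep(step) - 1];
}
```

    In the real hook these functions would drive the router (updating a `step` query parameter); keeping them pure makes the clamping behavior trivial to test.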

    Additional Notes

    • The application uses a ‘non-dashboard layout’ for pages that don’t require user authentication and a ‘dashboard layout’ for pages that are behind the authentication wall.
    • The application emphasizes a balance between UI/UX and functionality, with styling applied efficiently using Tailwind CSS and pre-built components.

    In summary, the Next.js frontend of this learning management application is a well-structured and feature-rich component, utilizing a modern tech stack and best practices for frontend development. It’s built for scalability, maintainability, and a smooth user experience.

    Node.js Learning Management System Backend

    The sources describe the backend of a learning management application built with Node.js and the Express framework, emphasizing its scalability, customizability, and independence from the frontend. Here’s a detailed breakdown of the backend:

    Core Technologies & Architecture

    • Node.js and Express: The backend is built using Node.js as the runtime environment and Express as the web framework. This combination allows for handling server-side logic and routing API requests.
    • Separate Backend: The backend is designed to be a separate application, not part of the Next.js frontend. This separation allows the backend to scale independently and prevents tight coupling between the frontend and backend, which enhances maintainability.
    • Docker: The backend is containerized using Docker, ensuring a consistent and portable environment across different stages of development and deployment.
    • Serverless Architecture: The backend is designed to run in a serverless environment using AWS Lambda, allowing for automatic scaling and reduced operational overhead.

    Key Features and Functionality

    • API Endpoints: The backend provides a variety of API endpoints for managing courses, users, transactions, and video uploads.
    • Data Modeling: The backend uses a data modeling approach where the data is structured into courses, sections, chapters, comments, transactions, and user course progress.
    • Database Interaction: The backend uses DynamoDB as its NoSQL database for storing data. It’s chosen for its scalability, speed, and low cost. The backend interacts with DynamoDB using the Dynamoose library, which plays a role similar to Mongoose for MongoDB.
    • Authentication: The backend is integrated with Clerk for user authentication and authorization. Clerk is used to handle user sign-in, sign-up, and user roles.
    • File Uploads: The backend handles video uploads to S3 using pre-signed URLs for better scalability.
    • Payment Processing: The backend integrates with Stripe for payment processing and transaction management.
    • Middleware: The backend uses various middleware to add functionality like parsing request bodies, setting security headers, logging API calls, and managing CORS.
    • Route Management: Express Router is used to handle routes for different resources such as users, courses, and transactions.
    • CORS: The backend is configured to handle cross-origin requests with the CORS middleware so the frontend can communicate with the backend.
    • Environment Variables: The backend uses environment variables for configuration, like port numbers, database connection details, and authentication secrets.
    • Data Validation: The backend uses Dynamoose schemas to validate data types, similar to Mongoose.
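
    The middleware bullets above all follow one pattern: each function sees the request, then either passes control onward or short-circuits with a response. A minimal, dependency-free sketch of that chain, with a hypothetical logging middleware and a requireAuth-style guard:

```typescript
// Minimal sketch of an Express-style middleware chain: each middleware
// may call next() to continue, or return its own response to stop the
// chain (as an auth guard would). Shapes are simplified for illustration.
type Req = { path: string; headers: Record<string, string>; log: string[] };
type Res = { status: number; body: string };
type Middleware = (req: Req, next: () => Res) => Res;

function compose(middlewares: Middleware[], handler: (req: Req) => Res): (req: Req) => Res {
  return (req) => {
    const dispatch = (index: number): Res =>
      index === middlewares.length
        ? handler(req) // end of chain: run the route handler
        : middlewares[index](req, () => dispatch(index + 1));
    return dispatch(0);
  };
}

// A logging middleware and a hypothetical requireAuth guard.
const logCalls: Middleware = (req, next) => {
  req.log.push(`request ${req.path}`);
  return next();
};
const requireAuth: Middleware = (req, next) =>
  req.headers["authorization"] ? next() : { status: 401, body: "unauthorized" };
```

    Real Express middleware is asynchronous and mutates a shared response object, but the control flow (continue via next, or short-circuit) is the same.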

    Backend Development Practices

    • Controllers: The backend uses controllers to manage the core logic, where each controller is focused on a specific resource like courses or user accounts.
    • Routes: Routes are organized to handle API endpoints for different resources.
    • Data Models: Data models are created to define the structure and validation of the data stored in DynamoDB. The data models are stored in a separate folder.
    • Middleware: Middleware functions are used to add extra functionality to the request handling, such as authentication, logging, and parsing.
    • Asynchronous Operations: Asynchronous operations are used for database interactions and other I/O bound operations.
    • Security: The backend uses helmet middleware for securing the application with proper HTTP headers and uses requireAuth middleware to protect API endpoints.
    • Error Handling: The backend uses try-catch blocks for error handling to send appropriate responses to the frontend.
    • Code Organization: The backend uses a structured folder approach to separate controllers, models, and routes.
    • Database Seeding: The backend has a seed script that inserts mock data into the DynamoDB database, which is useful for development.
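
    The controller and error-handling practices above combine into one recurring shape: a lookup wrapped in try-catch that maps outcomes to HTTP responses. The `findCourse` parameter below stands in for a DynamoDB query and is an assumption, not the course’s actual code.

```typescript
// Sketch of the try-catch controller pattern: the database call is
// injected, and each outcome (found, missing, thrown error) maps to an
// HTTP-style response instead of crashing the process.
type HttpResponse = { status: number; body: unknown };

async function getCourseController(
  courseId: string,
  findCourse: (id: string) => Promise<{ id: string; title: string } | null>,
): Promise<HttpResponse> {
  try {
    const course = await findCourse(courseId);
    if (!course) return { status: 404, body: { message: "Course not found" } };
    return { status: 200, body: course };
  } catch {
    // Send an appropriate error response to the frontend rather than
    // letting the exception propagate.
    return { status: 500, body: { message: "Error retrieving course" } };
  }
}
```

    Keeping the data access injected like this also makes the controller testable without a live DynamoDB table.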

    Key Components

    • Course Controller: Manages the API endpoints for courses, including listing, getting, creating, updating, and deleting courses.
    • User Clerk Controller: Manages the API endpoints for user settings, including updating user profile information in Clerk, using Clerk SDK.
    • Transaction Controller: Manages transactions for payment processing.
    • API Routes: The API endpoints are grouped into routes, making it easier to manage different resources.
    • Seed Script: Used for seeding the database with initial data.

    Deployment Preparation

    • Serverless HTTP: The backend is prepared for serverless deployment with serverless-http, which allows it to run as an AWS Lambda function.
    • Dockerization: The backend is packaged into a Docker image, which is stored in a container registry.
    • Lambda Handler: The application uses a custom handler that is compatible with AWS Lambda.
    • Environment Variables: Environment variables are set up in the Lambda environment for authentication and configuration.
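
    What serverless-http provides can be pictured with a toy adapter: take an API Gateway proxy event, feed it to the web app, and translate the app’s response back into a Lambda result. The event fields below are a simplified subset of the real API Gateway format, and the code is a sketch of the idea, not the library’s implementation.

```typescript
// Illustrative shape of a Lambda adapter for a web app: event in,
// app response out. serverless-http does this for a full Express app;
// here the "app" is just a function, to keep the sketch self-contained.
type ProxyEvent = { httpMethod: string; path: string; body: string | null };
type ProxyResult = { statusCode: number; body: string };
type App = (method: string, path: string, body: string | null) => { status: number; body: string };

function makeLambdaHandler(app: App): (event: ProxyEvent) => ProxyResult {
  return (event) => {
    // Translate the Lambda event into a request the app understands...
    const res = app(event.httpMethod, event.path, event.body);
    // ...and the app's response into the result shape Lambda expects.
    return { statusCode: res.status, body: res.body };
  };
}
```

    This is why the same Express codebase can run locally on a port and in Lambda unchanged: only the thin adapter at the edge differs.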

    Important Points

    • The backend design emphasizes the importance of backend skills and knowledge for software engineers and discourages a sole focus on frontend development, emphasizing backend and DevOps skills as essential.
    • The backend is designed to be scalable and maintainable by separating concerns and using modern software engineering practices.
    • The backend utilizes a serverless, containerized architecture, taking advantage of AWS services to minimize infrastructure concerns and reduce operational overhead.
    • The backend is built to interact with a variety of services, including Clerk, Stripe, AWS Lambda, API Gateway, DynamoDB, and S3.

    In summary, the Node.js and Express backend of the learning management application is a robust and well-structured system that leverages modern software engineering practices and cloud-based services to provide a scalable, secure, and performant API for the frontend. It emphasizes customizability and the separation of backend logic from the frontend.

    Build a Nextjs Learning Management App | AWS, Docker, Lambda, Clerk, DynamoDB, ECR, S3, Shadcn, Node

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Why Kindness Makes People Disrespect You Modern Stoicism: Stoic Secrets: Kindness, Boundaries, and Respect

    Why Kindness Makes People Disrespect You Modern Stoicism: Stoic Secrets: Kindness, Boundaries, and Respect

    The sources examine the potential downsides of unchecked kindness, highlighting how it can lead to disrespect, exploitation, and burnout. They discuss how an excess of kindness without proper boundaries or wisdom can invite others to take advantage and disregard personal priorities. Drawing upon Stoic philosophy, the sources encourage practicing self-respect, setting clear boundaries, and discerning who genuinely appreciates kindness from those who seek to exploit it. They advocate for emotional regulation, purposeful action, and distancing oneself from those who deplete energy. Ultimately, the sources emphasize that true kindness stems from a place of strength and inner balance, benefiting both the giver and receiver.

    The Stoic Path to Being Kind and Respected

    Study Guide Contents:

    I. Quiz: Knowledge Review (Short Answer)

    II. Essay Questions: Critical Thinking & Application

    III. Glossary: Key Terms and Definitions

    I. Quiz: Knowledge Review (Short Answer)

    Answer each question in 2-3 sentences based on the provided source material.

    1. Why does excessive kindness, when given without boundaries, sometimes lead to disrespect according to the source?
    2. How do people generally value things that are difficult to obtain versus those that are easily accessible?
    3. What is the relationship between strength and kindness, according to the source? Are they mutually exclusive?
    4. According to the source, what message are you sending when you consistently allow people to push your limits?
    5. How does unchecked kindness contribute to an imbalance in relationships, according to the source?
    6. What does it mean to “reward appreciation, not entitlement,” and why is this important?
    7. Why might people see someone who is excessively kind as emotionally dependent?
    8. Explain the stoic view on the relationship between kindness and approval, describing what motivates “true kindness”.
    9. Why is protecting your energy considered as important as protecting your time, according to the source?
    10. According to the source, how does walking away from toxic situations command respect?

    Quiz Answer Key:

    1. Excessive kindness, when given without boundaries, can lead to disrespect because people tend to devalue what is easily accessible and unearned. This can transform kindness from a virtue into an expectation, leading others to feel entitled and not appreciate or reciprocate.
    2. People tend to value what they have to earn and respect those whose kindness must be earned. Anything easily obtained is often overlooked, while that which requires effort is cherished.
    3. The source suggests that strength and kindness are not opposites but go hand in hand. True kindness is not about being a doormat, but about balancing giving with self-protection, and understanding that saying no when necessary is a sign of self-worth.
    4. When you consistently allow people to push your limits, you are teaching them how to treat you. If you don’t set limits, you are silently approving of mistreatment and indicating that you don’t value yourself enough to stand firm.
    5. Unchecked kindness creates an imbalance in relationships because one person is always giving while the other is taking, leading to dependency and entitlement rather than mutual respect. This occurs when generosity is excessive or poorly placed.
    6. Rewarding appreciation, not entitlement, means acknowledging and valuing gratitude while refusing to enable demanding or expectant behavior. This is important because it reinforces a mindset of respect and gratitude rather than one of obligation.
    7. People might see someone who is excessively kind as emotionally dependent because their kindness may stem from a fear of rejection or a need for approval, indicating that their self-worth depends on the approval of others.
    8. The stoic view suggests that true kindness comes from strength, not a need for approval. It is motivated by a genuine desire to do good because it aligns with one’s values, rather than a strategic attempt to gain favor or validation.
    9. Protecting your energy is as important as protecting your time because your energy is a limited resource that needs to be carefully managed. Giving it away too freely leaves little left for what truly matters and can lead to burnout and resentment.
    10. Walking away from toxic situations commands respect because it demonstrates an unwavering commitment to self-respect and sets a standard for how you expect to be treated. It shows that you value your well-being and will not tolerate mistreatment.

    II. Essay Questions: Critical Thinking & Application

    Choose one of the questions below and develop an essay response based on the ideas presented in the source material.

    1. Discuss the potential dangers of unlimited self-sacrifice and how modern stoicism offers a balanced approach to generosity and self-preservation. Provide real-world examples to support your arguments.
    2. Analyze the relationship between kindness, boundaries, and respect. How can one practice kindness without becoming a “doormat”? Explore the stoic principles that support this balance.
    3. Examine the concept of “emotional dependence” as it relates to acts of kindness. How can one ensure that their generosity stems from a place of inner strength rather than a need for approval?
    4. Explain the stoic perspective on the role of expectations in relationships. How can letting go of unrealistic expectations lead to more authentic and fulfilling connections?
    5. Discuss how setting and enforcing personal boundaries protects your time and energy and communicates your worth to others.

    III. Glossary: Key Terms and Definitions

    • Stoicism: An ancient Greek philosophy emphasizing virtue, reason, and living in accordance with nature. Focuses on what can be controlled (inner thoughts and actions) vs. what cannot (external events).
    • Modern Stoicism: Contemporary application of stoic principles to everyday challenges, focusing on self-improvement, resilience, and living a meaningful life.
    • Kindness: The quality of being friendly, generous, and considerate; showing concern and compassion for others.
    • Boundaries: Personal limits that define what behaviors a person will accept from others. Essential for maintaining healthy relationships and protecting one’s well-being.
    • Respect: A feeling of deep admiration for someone or something elicited by their abilities, qualities, or achievements.
    • Self-Respect: Having pride and confidence in oneself; valuing one’s own well-being and dignity.
    • Entitlement: The belief that one is inherently deserving of privileges or special treatment.
    • Emotional Dependence: A state of relying excessively on others for emotional support, validation, or a sense of self-worth.
    • Self-Sacrifice: Giving up one’s own needs or interests for the sake of others; can be positive when balanced, but detrimental when excessive.
    • Virtue: Moral excellence; behavior showing high moral standards. In Stoicism, primary virtues include wisdom, justice, courage, and temperance.
    • Discernment: The ability to judge well; having keen insight and good judgment, especially regarding moral or ethical matters.
    • Emotional Detachment: The ability to separate one’s emotions from a situation, allowing for a more objective and rational response.


    Briefing Document: Navigating Kindness with Stoic Wisdom

    Main Theme: The sources explore the nuanced relationship between kindness and respect, arguing that unchecked kindness can lead to disrespect, exploitation, and personal burnout. It advocates for a balanced approach, integrating stoic principles of self-respect, boundary setting, and emotional awareness to ensure kindness remains a virtue rather than a liability.

    Key Ideas and Facts:

    • Kindness Without Boundaries Breeds Disrespect: The fundamental premise is that excessive, freely given kindness can be devalued. People tend to value what they have to earn. “People value what they have to earn. Kindness is often seen as a virtue, yet paradoxically in the modern world it can lead to disrespect when given too freely and without boundaries.” The more available and accommodating you are, the less others may appreciate your efforts.
    • The Paradox of Kindness: Kindness can paradoxically lead to disrespect in the modern world when given too freely and without boundaries. “This is one of the great paradoxes of human nature: people tend to devalue what is easily accessible.” When kindness becomes expected, it’s no longer seen as a gift.
    • Self-Respect is Paramount: The foundation for healthy kindness is self-respect. “Respect starts with self-respect. If you respect yourself enough to set limits, others will follow suit.” Stoicism emphasizes controlling what’s within your power, and that begins with valuing your own time and energy. Epictetus asked, “How long will you wait before you demand the best for yourself?”
    • Saying “No” is Essential: The ability to say “no” is a critical tool for setting boundaries and protecting personal well-being. “Practice saying no. When you are always agreeable, people take you for granted, but when you establish clear limits…your kindness retains its value.” Saying no is not unkind; it’s a sign of self-worth. “A truly kind person is not someone who always says yes, but someone who knows how to balance giving with self-protection…”
    • People Test Limits: Human nature tends to test boundaries. If you consistently allow things to slide, you’re teaching people how to treat you. Epictetus wisely said, “You are not to blame for being uneducated, but you are to blame for refusing to learn,” and one of the most crucial lessons in life is this: people will only respect the boundaries that you enforce.
    • Imbalance in Relationships: Unchecked kindness creates an imbalance where one person carries the burden. “When you are always the one offering help, making sacrifices, or accommodating others, you unconsciously set a precedent…where you are expected to give and others feel entitled to receive.” This can lead to resentment and feeling unappreciated. Seneca reminds us that “he who gives when he is asked has waited too long.”
    • Vulnerability to Manipulation: Excessive kindness can make you a target for manipulators who seek to exploit those who are easily swayed. “You might believe that being kind will earn you respect, but to the wrong people, it signals weakness.” True kindness, when paired with wisdom, is a strength. Marcus Aurelius, one of the greatest Stoic philosophers, wrote, “The best revenge is to be unlike him who performed the injury.”
    • Emotional Dependence: Kindness stemming from emotional dependence (fear of rejection, need for approval) is a silent invitation for disrespect. “People instinctively admire those who are emotionally independent, those who do not seek validation through their acts of kindness but rather offer it from a place of inner abundance.” If your kindness is a bargaining tool, it backfires.
    • Intentional vs. Automatic Kindness: Kindness must be intentional, not automatic or reactive. “Commanding respect while maintaining kindness is a delicate balance, but it is not about people-pleasing or seeking approval; it is about acting with intention.” Stoicism emphasizes deliberate action based on values, not fear.
    • Reward Appreciation, Not Entitlement: It’s crucial to reward those who appreciate your kindness, rather than those who feel entitled to it. “The happiness of your life depends upon the quality of your thoughts,” and this applies to how you allow others to treat you: if you continue to give kindness to those who feel entitled to it, you reinforce the wrong mindset.
    • Respect Over Approval: It’s more important to be respected than liked. Kindness rooted in a need for approval can backfire. “When you live with integrity, when your kindness is rooted in genuine goodwill rather than a desperate need to be liked, people notice. They might not always agree with you, but they will respect you, and more importantly, you will respect yourself.”
    • Protect Your Energy: Energy is a finite resource. “Protect your energy as fiercely as your time.” Don’t allow yourself to be an emotional dumping ground. Associate with those who uplift you. If you invest your energy in things that don’t serve you, you lose a piece of yourself. Marcus Aurelius advised seeking “the tranquility that comes when you stop caring what they say or think or do, only what you do.” When you stop allowing outside distractions to consume your inner peace, you gain power.
    • Know When to Walk Away: Walking away from toxic dynamics demonstrates self-respect. “The more we value things outside our control, the less control we have.” Your time, energy, and peace of mind are among your most valuable assets, and not everyone deserves access to them. Seneca stated, “Associate with people who are likely to improve you.”
    • Don’t Set Yourself on Fire to Warm Others: We must remember not to set ourselves on fire to warm others. This stoic principle speaks to the importance of maintaining boundaries and not sacrificing our own well-being in the name of helping others.
    • Balance Generosity With Self-Care: Stoicism encourages us to live in accordance with reason and virtue, which includes making thoughtful decisions rather than acting impulsively or out of an emotional desire to please others. There is a fine line between offering assistance and overextending ourselves to the point of exhaustion.

    Stoic Solutions:

    • Emotional Detachment: Practicing emotional detachment can help manage reactions to others’ actions. It’s about consciously choosing how to respond, not becoming numb.
    • Forgiveness: Letting go of past hurts is essential for emotional freedom. It doesn’t mean excusing actions, but rather freeing yourself from the emotional weight. “Marcus Aurelius said ‘the best revenge is to be unlike him who performed the injury.’”
    • Reciprocity Has an Expiration Date: The true value of generosity lies not in what we receive but in what we offer to others.
    • Discernment: This includes not only understanding our emotions but also setting boundaries and protecting our energy.

    Overall Message: True kindness isn’t about unlimited self-sacrifice; it’s about acting with virtue, wisdom, and self-respect. By integrating stoic principles, individuals can ensure their kindness is a source of strength, enriching their lives and the lives of others. It’s about living a life rooted in clarity, resilience, and balance.


    FAQ: Kindness, Respect, and Stoicism

    Here are some frequently asked questions that best capture the main themes and ideas from the provided sources.

    1. Why does being too kind sometimes lead to disrespect from others?

    Kindness, when given without boundaries, is often devalued. People tend to value what they have to earn or work for. When kindness becomes a constant, easily accessible presence, it transforms from a virtue into an expectation. Others may feel entitled to your generosity, no longer seeing it as a gift to appreciate or reciprocate. This can lead them to take advantage of your kindness and disregard your needs.

    2. How does unchecked kindness make me appear weak?

    In society, strength is often associated with assertiveness and the ability to set clear boundaries. Unchecked kindness can be mistaken for weakness because you may always say yes, always yield, and never push back. While kindness is not inherently a flaw, it can lead to disrespect when not balanced with self-respect. People may overlook or take advantage of someone who is endlessly accommodating.

    3. Why do people test my limits when I am consistently kind?

    Human nature tends to test limits. If you consistently let things slide and don’t enforce boundaries, people will push to see how far they can go. It’s not always malicious, but a way of understanding what is acceptable. By setting clear expectations, you show that you value yourself and your boundaries, commanding respect.

    4. How does unlimited kindness create an imbalance in relationships?

    When you are always the one offering help, making sacrifices, or accommodating others, you unconsciously set a precedent where you are expected to give and others feel entitled to receive. This creates an imbalance where one person carries the burden of maintaining the relationship, leading to feelings of being drained, used, and unappreciated. It’s crucial to establish fairness and reciprocity in relationships to avoid this imbalance.

    5. How does being kind make me a target for manipulators?

    Manipulators seek out people who are easy to sway. Being kind can signal weakness to them, making you an easy target to exploit. They may see it as an open invitation to push boundaries, take without giving, and bend you to their will. Balancing kindness with wisdom is essential to avoid being taken advantage of.

    6. How can kindness be rooted in emotional dependence, and why does this lead to disrespect?

    When kindness stems from a place of emotional dependence, such as fear of rejection or a need for approval, it becomes a silent invitation for disrespect. People instinctively admire those who are emotionally independent. If your kindness is driven by the need for validation, it ceases to be an act of virtue and instead becomes a bargaining tool, signaling that your worth depends on their approval. People are wired to value what is scarce and to admire what is self-sufficient.

    7. What is the difference between true kindness and people-pleasing?

    True kindness stems from strength, not from a need for approval. People-pleasing is often driven by a desire to be liked, gain validation, or secure affection. It comes across as neediness rather than generosity. When kindness is transactional, it can come across as a form of emotional bribery. True kindness is an act of virtue that comes from inner strength, not the fear of rejection. Being a good person does not mean being a doormat.

    8. How can I maintain kindness while commanding respect, according to stoicism?

    To command respect without losing your kindness, you must practice kindness with wisdom and boundaries. Make your kindness intentional, say no without explaining yourself, reward appreciation, and protect your energy. Stoicism emphasizes finding a balance between generosity and self-preservation. Setting limits, giving intentionally, and ensuring your kindness is valued (not exploited) is very important. Remember respect starts with self-respect.

    The Pitfalls of Excessive Kindness

    When overused, kindness can lead to several negative outcomes, including disrespect from others, the appearance of weakness, and the creation of imbalanced relationships. Here’s a breakdown of how excessive kindness can be detrimental, according to the sources:

    • Disrespect: Kindness, when given too freely, can be devalued. People tend to value what they have to earn, so if kindness is a constant, unearned presence, it becomes an expectation rather than a virtue. This can lead to others feeling entitled to one’s generosity, making them less appreciative and less likely to reciprocate.
    • Appearing weak: Unchecked kindness can be mistaken for weakness because society often associates strength with assertiveness and the ability to set boundaries. People who always say yes and never push back may be overlooked or taken advantage of.
    • Testing limits: Human nature tends to test limits, and if someone is consistently kind without boundaries, others may push to see how far they can go. This isn’t necessarily malicious but rather a way of understanding what is acceptable.
    • Imbalance in relationships: Excessive kindness can create an imbalance where one person is always giving and the other is always receiving. This can lead to the giver feeling drained, used, and unappreciated. People may begin to see the giver’s generosity as an obligation.
    • Target for manipulators: Overly kind people can become targets for manipulators, who seek out those who are easy to sway and take advantage of. To those with bad intentions, kindness can signal weakness and an open invitation to push boundaries.
    • Emotional dependence: Kindness that stems from a place of emotional dependence, such as a fear of rejection or a need for approval, can invite disrespect. People instinctively admire those who are emotionally independent and offer kindness from a place of inner abundance.
    • Sacrificing self-respect: True kindness comes from strength, not a need for approval. When actions are motivated by a desire to be liked or to gain validation, they lose their authenticity, and people sense when kindness is transactional.
    • Ignoring priorities: Overdoing kindness makes you the go-to person for everyone, but you begin to notice that the important things are slipping. True kindness doesn’t require you to abandon your personal goals; it comes from balance, where you have taken care of your own needs first.
    • Attracting opportunists: Although admirable, excessive kindness attracts opportunists who see your generosity as an endless resource to exploit.
    • Habit forming: You create a dangerous imbalance when you overextend kindness, because it leads to stress and triggers harmful coping mechanisms.

    To avoid these pitfalls, the sources suggest practicing kindness with wisdom and setting boundaries. This involves making kindness intentional rather than automatic, saying no when necessary, and ensuring that your generosity is valued and reciprocated. The key is to balance generosity with self-respect, ensuring that your kindness is a conscious choice and not a self-imposed burden.

    The Art of Setting Healthy Boundaries

    Setting boundaries is essential for maintaining healthy relationships and protecting one’s well-being. The sources emphasize that boundaries are not about withholding kindness but about ensuring that kindness is meaningful and does not lead to disrespect, exploitation, or burnout.

    Here’s a breakdown of key aspects related to setting boundaries, based on the sources:

    • Purpose of Boundaries:
    • Protecting Self-Respect: Setting limits indicates self-respect, which encourages others to follow suit.
    • Preserving Value: Establishing boundaries ensures that kindness remains a conscious act of virtue rather than an unconscious obligation.
    • Preventing Exploitation: Boundaries prevent others from taking advantage of one’s generosity.
    • Maintaining Balance: Setting limits ensures a balance between generosity and self-preservation, preventing exhaustion and bitterness.
    • How to Set Boundaries:
    • Saying No: Practice saying no without over-explaining or feeling guilty. A firm, clear “no” is enough.
    • Being Intentional: Make kindness a conscious choice rather than an automatic reaction.
    • Defining Limits: Clearly communicate your limits to others, teaching them how to respect your time and energy.
    • Enforcing Boundaries: Consistently uphold your boundaries and take action when they are crossed.
    • Protecting Energy: Guard your emotional and mental energy by limiting exposure to negativity and setting boundaries with your emotions.
    • Walking Away: Be willing to distance yourself from toxic dynamics or relationships where respect is absent.
    • Benefits of Setting Boundaries:
    • Earning Respect: Setting clear expectations and refusing to be taken advantage of often leads to greater respect from others.
    • Healthier Relationships: Boundaries foster relationships built on mutual respect rather than silent sacrifice.
    • Preventing Burnout: Establishing limits prevents overextension and burnout, ensuring that kindness is sustainable.
    • Promoting Self-Worth: Setting boundaries demonstrates self-worth, which encourages others to value your time and energy.
    • Avoiding Manipulation: Clear boundaries discourage manipulators and those who seek to exploit kindness.
    • Fostering Independence: Boundaries prevent you from over-helping, which allows others to discover their own strength.
    • Qualities of Effective Boundaries:
    • Firmness: Boundaries should be firm and unwavering.
    • Fairness: Boundaries should be fair and not cross over into being cruel.
    • Generosity: Boundaries should leave space for generosity, but not enable the other person to diminish your worth.
    • Mindfulness: Boundaries should be applied mindfully, and not used to punish someone.
    • Challenges and Misconceptions:
    • Fear of Disappointing Others: Overcome the fear of disappointing others or being seen as unkind.
    • Guilt: Recognize that saying no is not selfish but an act of self-respect.
    • Societal Pressure: Resist societal pressure to be endlessly accommodating.
    • Stoic Principles:
    • Self-Control: Exercise self-control and emotional regulation when setting and maintaining boundaries.
    • Wisdom: Use wisdom to discern when to say yes and when to say no.
    • Justice: Act with justice, ensuring fairness both to yourself and to others.
    • Virtue: Align your actions with virtue, making kindness a deliberate choice rather than an obligation.

    In essence, setting boundaries is about creating a framework that allows kindness to thrive without undermining one’s well-being. By setting limits, individuals can ensure that their generosity is valued, reciprocated, and sustainable, leading to healthier and more respectful relationships.

    Modern Stoicism: A Guide to Resilience, Regulation, and Virtue

    Modern Stoicism emphasizes the practical application of ancient Stoic philosophy to contemporary life. It focuses on cultivating inner resilience, emotional regulation, and ethical behavior to navigate the complexities of the modern world. Modern Stoicism adapts the core tenets of Stoicism—virtue, reason, and living in accordance with nature—to address the challenges and opportunities of today’s society.

    Here’s a breakdown of key aspects of Modern Stoicism, according to the sources:

    • Core Principles:
    • Virtue as the Only Good: Modern Stoicism, like its ancient counterpart, emphasizes that virtue (wisdom, justice, courage, and temperance) is the sole good and the foundation for a fulfilling life.
    • Control and Acceptance: A central tenet is differentiating between what one can control (thoughts, actions, and responses) and what one cannot (external events, others’ opinions). Modern Stoicism encourages focusing efforts on what is within one’s power and accepting what is not.
    • Living in Accordance with Nature: This involves understanding the natural order of the world and living in harmony with it, embracing reason and virtue in daily life.
    • Mindfulness: Modern Stoicism emphasizes being present in the moment, rather than dwelling on the past or worrying about the future.
    • Practical Applications:
    • Emotional Regulation: Modern Stoicism provides tools for managing emotions, helping individuals respond to challenges with reason rather than impulse. This involves recognizing emotions, understanding their triggers, and choosing thoughtful responses.
    • Setting Boundaries: Modern Stoicism underscores the importance of setting boundaries to protect one’s well-being and prevent exploitation. This includes learning to say no, defining limits, and enforcing those limits consistently.
    • Goal Setting: Stoicism encourages setting clear goals aligned with one’s values to give life direction and purpose, acting as a compass.
    • Cultivating Self-Awareness: Modern Stoicism emphasizes the importance of self-reflection and self-compassion, building self-worth from within and not relying on external validation.
    • Practicing Empathy and Compassion: While setting boundaries is vital, Modern Stoicism also promotes empathy and compassion, understanding others’ struggles and responding with kindness while maintaining one’s own emotional health.
    • Detachment: A key teaching involves detaching from the need to control external factors, and learning to give without expectation.
    • Recognizing relationships: It is key to discern which relationships are opportunistic and which will help you grow. It’s also important to preserve energy by focusing on relationships that uplift you.
    • Habit forming: Stoicism consistently encourages us to avoid extremes by embracing balance in our actions, which protects us from chaos.
    • Benefits of Modern Stoicism:
    • Increased Resilience: Modern Stoicism equips individuals with the tools to bounce back from setbacks and navigate challenges with greater emotional stability.
    • Improved Relationships: By setting healthy boundaries and practicing empathy, Modern Stoicism promotes more balanced and respectful relationships.
    • Enhanced Self-Worth: Cultivating self-awareness and self-compassion leads to a stronger sense of self-worth, reducing dependence on external validation.
    • Greater Emotional Regulation: Learning to manage emotions and respond with reason promotes inner peace and reduces unnecessary conflict.
    • Purposeful Living: Aligning actions with values and setting clear goals fosters a sense of purpose and fulfillment.
    • Finding Peace: By letting go of past hurts and practicing forgiveness, Modern Stoicism unlocks freedom and creates room for joy.

    In essence, Modern Stoicism is a practical philosophy for living a virtuous and fulfilling life in the modern world, providing tools and techniques for cultivating inner strength, managing emotions, and building meaningful relationships. By focusing on what is within one’s control and acting with reason and virtue, individuals can navigate the complexities of life with greater resilience and peace.

    Modern Stoicism: Emotional Well-being Through Self-Awareness and Regulation

    Drawing upon the sources and our conversation history, emotional well-being involves several interconnected elements that, when cultivated, contribute to a balanced and fulfilling life. Modern Stoicism provides a framework for understanding and enhancing emotional well-being by emphasizing self-awareness, emotional regulation, and ethical behavior.

    Key components of emotional well-being, according to the sources, include:

    • Self-Worth and Self-Love: Cultivating self-worth from within, rather than relying on external validation, is essential for setting boundaries and protecting emotional well-being.
    • Practicing self-compassion and treating oneself with kindness reinforces self-esteem and emotional resilience.
    • Recognizing one’s intrinsic value and worthiness of love and respect is vital for maintaining healthy boundaries and relationships.
    • Emotional Regulation: Managing emotions and responding with reason rather than impulse is a core aspect of Stoicism.
    • Practicing emotional detachment involves understanding emotions without allowing them to dictate behavior, which helps in navigating challenging situations.
    • Developing the ability to pause and reflect before reacting to emotional triggers enables thoughtful responses aligned with one’s values.
    • Setting Boundaries: Establishing clear and healthy boundaries is crucial for protecting emotional energy and preventing exploitation.
    • Setting limits and saying “no” when necessary are acts of self-respect that ensure kindness comes from a place of strength rather than obligation.
    • Clearly communicating boundaries helps others respect one’s time, energy, and values.
    • Practicing Empathy and Compassion: Understanding and sharing the feelings of others allows for thoughtful responses rather than impulsive reactions.
    • Approaching difficult situations with kindness and understanding, while maintaining boundaries, fosters healing and balanced relationships.
    • Recognizing that others’ actions often stem from their own struggles promotes empathy and prevents resentment.
    • Letting Go of Past Hurts: Forgiveness is essential for freeing oneself from emotional burdens and releasing negative emotions.
    • Releasing emotional attachments to past events allows for a focus on personal healing and growth, enabling a more peaceful present.
    • Choosing peace over bitterness and focusing on personal growth helps in moving forward from past wrongs.
    • Living with Intention and Purpose: Setting clear goals aligned with one’s values provides direction and helps focus on what truly matters.
    • Aligning actions with values ensures that time and energy are directed toward pursuits that enrich personal growth and contribute to a sense of fulfillment.
    • Living in accordance with virtue and acting with reason fosters a sense of purpose and balance in life.
    • Managing External Influences: Distancing oneself from energy drainers and negative influences helps safeguard emotional and mental health.
    • Focusing on what is within one’s control and accepting what is not promotes inner peace and reduces unnecessary stress.
    • Surrounding oneself with supportive individuals fosters emotional resilience and personal growth.
    • Mindfulness and Self-Reflection: Being present in the moment, rather than dwelling on the past or worrying about the future, is essential for emotional regulation.
    • Regular self-reflection and self-assessment, including journaling and meditation, promote emotional awareness and help manage emotional overwhelm.

    These elements of emotional well-being are interconnected and mutually reinforcing. By cultivating self-awareness, practicing emotional regulation, setting healthy boundaries, and aligning actions with values, individuals can enhance their emotional resilience, build stronger relationships, and lead more fulfilling lives. Modern Stoicism provides practical tools and techniques for integrating these principles into daily life, enabling individuals to navigate challenges with greater clarity, purpose, and inner peace.

    The Art of Earning Respect: Kindness and Boundaries

    Drawing from the provided source, earned respect is achieved through a combination of kindness, wisdom, self-respect, and the establishment of clear boundaries. It is a reciprocal recognition of worth, not an entitlement or automatic response to generosity.

    Key aspects of earned respect, according to the sources:

    • Balance Between Kindness and Self-Respect:
    • Kindness, when given without boundaries, can lead to disrespect because people tend to devalue what is easily accessible.
    • Respect is commanded by acting in ways that show self-worth, not by simply giving oneself away.
    • Balancing generosity with self-preservation is crucial for earning genuine respect.
    • Setting and Enforcing Boundaries:
    • People respect only the boundaries that are enforced.
    • Setting clear expectations and refusing to be taken advantage of often leads to greater respect.
    • Firmness and compassion are allies in earning respect; kindness should be strong, not weak.
    • Saying no is essential; those who know when to say no command true respect.
    • Intentional Kindness:
    • Kindness must be intentional, not automatic.
    • Acting with intention transforms kindness from appeasement to an expression of values.
    • Respect comes from being authentic, not just agreeable.
    • Kindness should be a conscious choice, not an unconscious habit.
    • Self-Control and Emotional Independence:
    • People instinctively admire those who are emotionally independent and do not seek validation through their acts of kindness.
    • True tranquility comes from mastering desires and detaching self-worth from others’ opinions.
    • A strong person offers kindness freely but does not beg for it in return.
    • Rewarding Appreciation, Not Entitlement:
    • Rewarding appreciation reinforces the right mindset and teaches people that kindness is a gift, not a debt.
    • Withdrawing kindness from those who demand it is necessary; self-respect is non-negotiable.
    • Avoiding Self-Sacrifice:
    • Generosity should not extend to the point of self-sacrifice or exhaustion.
    • True generosity involves offering help in a way that maintains dignity and well-being.
    • Kindness should never mean self-sacrifice at the expense of well-being.
    • Protecting Your Energy:
    • Protecting energy is as crucial as protecting time; respect is about how much of oneself is given, and to whom.
    • Being selective about where to invest energy and setting emotional boundaries are essential.
    • Knowing When to Walk Away:
    • Walking away from situations that undermine dignity demonstrates a commitment to self-respect and earns the respect of others.
    • It’s important to carefully discern where efforts are invested; kindness should not come at the cost of self-worth.

    In essence, earned respect is about creating a balance where kindness is a choice made from a position of strength and self-awareness, not a freely given resource that others can exploit. By setting boundaries, acting with intention, and valuing oneself, it’s possible to foster relationships built on mutual respect and appreciation.

    Why Kindness Makes People Disrespect You | Modern Stoicism

    The Original Text

    Why kindness makes people disrespect you: Modern Stoicism. Have you ever felt like the more kindness you show, the less people respect you? You offer a helping hand, yet they start expecting it. You go out of your way to be considerate, yet you’re overlooked. You try to be a good person, yet somehow you become an easy target, someone people take advantage of. And here’s the real danger: if you don’t recognize what’s happening, you’ll keep wondering why people dismiss your needs, walk over your boundaries, and never truly appreciate you. But don’t worry. By the end of this video, you’ll understand why kindness, when used without wisdom, can lead to disrespect, and how to shift your approach to gain respect without losing your compassion. Because the problem isn’t kindness itself; it’s how and when you apply it.

    Number one: people value what they have to earn. Kindness is often seen as a virtue, yet paradoxically, in the modern world it can lead to disrespect when given too freely and without boundaries. This is one of the great paradoxes of human nature: people tend to devalue what is easily accessible. The moment kindness becomes a constant, unearned presence, it transforms from a virtue into an expectation. When others feel entitled to your generosity, they no longer see it as a gift but as a given, something they need not appreciate or reciprocate. This is why modern Stoicism teaches us the importance of self-respect and measured generosity. Marcus Aurelius once wrote, “Waste no more time arguing about what a good man should be. Be one.” But being a good person does not mean being a doormat. If you are always available, always saying yes, and never establishing limits, people will not admire your kindness; they will assume it is simply who you are, something they can take without consequence. Just as we value what we work hard for, we also respect those whose kindness must be earned. When you are too freely giving, you teach others to expect rather than appreciate. This is a hard truth that many people learn
    too late in life: anything easily obtained is often overlooked, while that which requires effort is cherished. Consider the example of luxury goods. Why do people covet designer brands over cheap alternatives? It is not just about quality; it is about scarcity and effort. People respect what is rare, what is difficult to attain. The same applies to human relationships. If you are endlessly accommodating, always bending over backward for others, they may begin to see you as replaceable. This is why setting boundaries is not about withholding kindness; it is about ensuring that your kindness is meaningful. In Letters from a Stoic, Seneca reminds us, “You act like mortals in all that you fear, and like immortals in all that you desire.” If we desire respect, we must act in ways that command it, not simply give ourselves away expecting it in return. The key is to make your kindness a conscious choice rather than an unconscious habit. When you say yes too often out of fear of disappointing others, you become a tool rather than a person: useful, but not respected. In modern life this lesson is especially relevant, in a world driven by social media validation where people are pressured to be endlessly available. Many have lost the ability to say no. The result? Burnout, resentment, and, ironically, a lack of true respect from those they strive to please. The truth is, when you set boundaries, you teach others that your time and energy are valuable. You show them that your kindness is not free-flowing but intentional. This is a core Stoic principle: control what is within your power and let go of what is not. Respect starts with self-respect. If you respect yourself enough to set limits, others will follow suit. Epictetus taught, “How long will you wait before you demand the best for yourself?” This is a question worth reflecting on. Are you allowing yourself to be drained by the expectations of others, or are you ensuring that your kindness remains a conscious act of virtue rather than an unconscious
    obligation? The practical application of this wisdom is simple yet powerful: practice saying no. When you are always agreeable, people take you for granted. But when you establish clear limits, when you give selectively and intentionally, your kindness retains its value. Instead of always being available, be present on your own terms. Let people understand that your time and generosity are gifts, not rights. This will not make you less kind; it will make your kindness more respected. Modern Stoicism emphasizes the idea that true strength is found in balance: between generosity and self-preservation, between compassion and wisdom. Those who fail to find this balance often end up exhausted, disrespected, and bitter. The world does not reward unlimited self-sacrifice; it rewards those who understand the value of their own worth.

    Number two: unchecked kindness can make you seem weak. Unchecked kindness can often be mistaken for weakness, not because kindness itself is a flaw, but because the world respects those who balance compassion with self-respect. In society, we often see strength associated with assertiveness and the ability to set clear boundaries. Those who can confidently say no when necessary are viewed as people with strong principles, while those who always say yes, always yield, and never push back can easily be overlooked or even taken advantage of. Stoicism teaches us that emotional control is a virtue, but that does not mean we should be passive or allow others to walk all over us. This does not mean being cold or unfeeling, but rather understanding that true kindness cannot come at the cost of your own dignity. Real kindness isn’t just about how much you give; it’s about knowing when to give and when to stand your ground. Think about the story of Daniel, a man known for always helping others. He never said no, never stood up for himself, and always put the needs of others before his own. At first, people admired his generosity. But over time, what happened? His kindness was no longer seen as
    a virtue; it became an expectation. People stopped asking if he had the time or energy to help; they simply assumed he would. And the moment Daniel tried to say no, people were upset. They weren’t grateful for all he had done in the past; instead, they felt entitled to his help. He hadn’t changed, but the way people treated him had, because he never established boundaries in the first place. His kindness was real, but without limits it lost its value. Now ask yourself: how many times have you felt like your kindness was taken for granted? How often have you agreed to something just to avoid disappointing others? Have you ever felt drained because you constantly put others before yourself? This isn’t about becoming selfish or cruel; it’s about realizing that kindness does not mean being a doormat. For kindness to have meaning, it must be given with intention and wisdom. Strength and kindness are not opposites; they go hand in hand. If you don’t respect yourself, don’t expect others to respect you. A truly kind person is not someone who always says yes, but someone who knows how to balance giving with self-protection, someone who understands that saying no when necessary is not unkind; it’s a sign of self-worth. And that is the kind of person who earns true respect. The lesson here is simple: don’t let your kindness become a burden. Being kind does not mean letting others take advantage of you; it means fostering relationships built on mutual respect. If you want your kindness to be valued, start by valuing yourself.

    Number three: people test how far they can push you. Kindness is a virtue, but when it is given without boundaries, it can invite disrespect. Why? Because human nature tends to test limits. If you consistently let things slide, if you allow a friend to always be late without consequence, if you accept extra work from a coworker without pushing back, or if you tolerate a partner neglecting your needs, what message are you really sending? Whether you realize it or not, you’re teaching people how to
    treat you. Epictetus wisely said, “You are not to blame for being uneducated, but you are to blame for refusing to learn.” And one of the most crucial lessons in life is this: people will only respect the boundaries that you enforce. Stoicism teaches us that while kindness is admirable, it must be coupled with self-respect; if not, it becomes a silent signal that you don’t value yourself enough to stand firm. Have you ever noticed how those who set clear expectations, who know their worth and refuse to be taken advantage of, are often the most respected? Marcus Aurelius, one of the greatest Stoic leaders, understood this well. He said, “The soul becomes dyed with the color of its thoughts.” If you constantly think that kindness means being endlessly accommodating, your soul, your character, and your self-worth will reflect that. But true Stoic wisdom tells us that virtue is about balance: be kind, but never at the cost of your dignity. Imagine a river, strong, flowing, and full of life. It nurtures everything around it, but it also carves stone, shapes landscapes, and determines its own path. Kindness should be like that: generous but firm. If people sense that you lack the strength to say no, they will push to see how far they can go. This isn’t because they’re necessarily malicious; it’s simply how people operate. Even children test their parents’ patience to understand what is acceptable, so why would adults be any different? A Stoic doesn’t resent this reality; they accept it and act accordingly. Seneca once said, “He who does not prevent a crime when he can, encourages it.” If you don’t set limits, you are silently approving of mistreatment. This doesn’t mean you should become harsh or unkind. Stoic lessons emphasize that self-control, wisdom, and justice must work together. Be kind, yes, but let that kindness be strong, not weak. A person who truly embodies Stoicism understands that firmness and compassion are not opposites; they are allies. So ask yourself: are you being kind because it aligns with your values, or because
    you fear confrontation? Are you letting others dictate your worth by how much you’re willing to endure? The answer to these questions determines whether your kindness is a strength or a weakness. Kindness should never mean self-sacrifice at the expense of your well-being. The Stoics knew that a life lived with virtue requires wisdom: knowing when to say yes and, more importantly, when to say no. So the next time someone pushes your limits, remember: your response teaches them exactly how far they can go. What lesson are you giving them? If you’ve watched up to this point, you’re already on the path to understanding the hidden dynamics of kindness and respect in today’s world. Comment below with “stoic strength” to affirm your commitment to mastering modern Stoicism. But don’t stop here; there’s still valuable insight ahead that can change the way you navigate respect, boundaries, and personal power. Stay until the end to uncover how true kindness, guided by wisdom, earns genuine respect.

    Number four: it creates an imbalance in relationships. Kindness, when given without boundaries, often creates an imbalance in relationships that many fail to recognize until they feel drained, used, or unappreciated. Modern Stoicism teaches us the importance of equilibrium in human interactions: giving without expectation, but also ensuring we are not taken for granted. Seneca once wrote, “He who gives when he is asked has waited too long.” This reminds us that generosity, when excessive or poorly placed, can foster dependency and entitlement rather than mutual respect. When you are always the one offering help, making sacrifices, or accommodating others, you unconsciously set a precedent, one where you are expected to give and others feel entitled to receive. The more you extend kindness without limits, the less people value it, and soon they no longer see it as generosity but as an obligation you owe them. This leads to an unspoken dynamic where one person carries the burden of maintaining the relationship while the other
    simply takes without feeling the need to reciprocate. In the context of modern life, we often see this imbalance in friendships, workplaces, and even within families. The employee who always says yes to extra work without question soon becomes the one everyone relies on, yet receives the least appreciation. The friend who always listens and gives emotional support, but never shares their own struggles, becomes the emotional crutch for others, yet is left alone in their own moments of need. The partner who continuously compromises to keep the peace eventually realizes that their needs are ignored, because they have never been firm about them. This isn’t to say kindness is a weakness, far from it. The Stoic secrets to maintaining respect lie in practicing kindness with wisdom. Marcus Aurelius reminds us, “Be tolerant with others and strict with yourself.” This means offering kindness, but also setting boundaries that prevent you from being diminished by your own good nature. If you give without discernment, you risk turning your virtue into a vice, where kindness is no longer an act of strength but a self-imposed burden. A well-balanced relationship is built on mutual respect, not silent sacrifice. The greatest respect you can command from others is by demonstrating self-respect first. People unconsciously mirror the way you treat yourself. If you place no value on your time, your energy, and your efforts, neither will they. If you constantly say yes to every demand, people will assume you have nothing better to do, and your kindness will not only be undervalued but eventually ignored. This means having the courage to disappoint others sometimes, to say no when necessary, and to stand firm in your decisions. This doesn’t mean becoming cold or indifferent, but rather understanding that respect is built not on endless giving but on mutual recognition of worth. Look at those who command true respect in life: they are not the ones who say yes to everything, but those who know when to say no. They give where it
    matters, but they also hold their ground when necessary. Modern Stoicism reminds us that virtue is about balance. If you lean too far into self-sacrifice, you lose your own stability; if you lean too far into selfishness, you lose connection with others. The key is to be kind, but not to the extent that it breeds entitlement in others and exhaustion in yourself. True generosity is not about giving endlessly but about giving wisely, only to those who appreciate it, and only when it does not come at the cost of your own dignity. In relationships, fairness must exist. If your kindness is not reciprocated, it is not kindness; it is self-neglect. So how do you correct this imbalance? By first acknowledging your own worth. By recognizing that your time and energy are not infinite resources to be drained by those who only take. By understanding that true kindness does not mean always saying yes, but knowing when to say no. By reminding yourself that being a good person does not mean being a doormat, and part of being a good person is ensuring that your kindness is respected, not exploited. In a world that often mistakes kindness for weakness, be both firm and fair, generous yet discerning. Kindness should never be a burden but a gift, one that, when given wisely, fosters respect rather than diminishing it.

    Number five: it makes you a target for manipulators. Kindness is a virtue, but in a world where not everyone acts with good intentions, it can also make you a target for manipulators. Those who seek control, whether it’s a toxic boss, a selfish friend, or a manipulative partner, are always on the lookout for people who are easy to sway. And who better to exploit than someone who always says yes, always puts others first, and never questions when their generosity is being taken advantage of? You might believe that being kind will earn you respect, but to the wrong people it signals weakness. They see it as an open invitation to push boundaries, to take without giving, and to bend you to their will. But here’s the
truth: kindness, when paired with wisdom, is not weakness; it's strength. The Stoics understood this well. Marcus Aurelius, one of the greatest Stoic philosophers, wrote: "The best revenge is to be unlike him who performed the injury." This means you don't have to become cold or cruel in response to manipulation, but you do have to be discerning. Being kind does not mean being naive, and it certainly does not mean allowing others to take advantage of you. Consider the story of Jake, a talented designer who always went out of his way to help his co-workers. He would cover for them when they missed deadlines, fix their mistakes, and even stay late to ensure projects were completed on time. At first he thought his kindness was appreciated, until he realized his workload was twice that of anyone else's, and his so-called friends were dumping their responsibilities on him while taking credit for his efforts. One day his boss asked him to stay late yet again to finish someone else's work, without so much as a thank you. That was when it hit him: his kindness wasn't being respected, it was being exploited. Jake decided to set boundaries. He stopped saying yes to every request, prioritized his own work, and made sure his contributions were recognized. Some people resented this change, but the ones who truly valued him adjusted. He didn't stop being kind, but he stopped being an easy target. So ask yourself: are you being kind, or are you being taken advantage of? Do the people in your life appreciate your kindness, or do they simply expect it? The Stoics teach us to be mindful of who we allow into our inner circle, and to recognize when kindness is being mistaken for weakness. Epictetus reminds us: "The key is to keep company only with people who uplift you, whose presence calls forth your best." True kindness isn't about pleasing everyone; it's about acting with integrity, wisdom, and self-respect. When you learn to balance kindness with strength, you command respect instead of inviting manipulation. The lesson here is
clear: give kindness freely, but not blindly, because the moment you allow yourself to be used, your kindness is no longer kindness; it's self-sacrifice at your own expense. Number six: people see you as emotionally dependent. Kindness, when rooted in strength, is a powerful virtue, but when it stems from a place of emotional dependence, it can become a silent invitation for disrespect. People instinctively admire those who are emotionally independent, those who do not seek validation through their acts of kindness but rather offer it from a place of inner abundance. If your kindness is driven by the fear of rejection or the need for approval, it ceases to be an act of virtue and instead becomes a bargaining tool, one that often backfires. Stoicism teaches us that true tranquility comes from within, from mastering our desires and detaching our self-worth from the fleeting opinions of others. As Marcus Aurelius said: "You have power over your mind, not outside events. Realize this, and you will find strength." A person who is kind because they choose to be, because it aligns with their values, not because they crave appreciation, is naturally respected. But when kindness is merely a mask for insecurity, people sense it. They may not always articulate it, but they will feel it, and in time their respect for you will diminish. Consider a man who tolerates blatant disrespect from a woman just to keep her in his life. To an outsider it may seem like patience or devotion, but in reality it signals weakness. A person who does not set boundaries, who allows mistreatment out of fear of loss, is not truly kind; they are emotionally dependent, and dependence, especially in relationships, is rarely admired. A strong person offers kindness freely but does not beg for it in return. They do not tolerate abuse under the illusion of loyalty. Epictetus reminds us: "If you want to improve, be content to be thought foolish and stupid." In other words, prioritizing virtue over popularity requires the courage to be misunderstood, to stand
firm in your principles even when others do not immediately see their value. Why is it that people respect those who are willing to walk away, but take advantage of those who cling too tightly? The answer lies in human nature: we are wired to value what is scarce, to admire what is self-sufficient. When you are excessively kind in hopes of being liked, you unwittingly communicate that your worth depends on the approval of others, and the moment people sense that you need them more than they need you, the power dynamic shifts. You become easy to take for granted. This is why Stoic lessons emphasize self-sufficiency, the ability to be content with oneself regardless of external circumstances. If your kindness is genuine, it will not waver in the face of indifference; if it is strategic, it will eventually betray you. Think about it: when was the last time you truly respected someone who lacked self-respect? When you meet someone who stands firm in their values, who does not compromise themselves for the sake of acceptance, do you not instinctively admire them? Contrast that with someone who constantly seeks to please, who bends over backward to accommodate everyone, even at the cost of their dignity. Over time their efforts become predictable, their presence easy to overlook. This is not because kindness itself is weakness, but because misplaced kindness, kindness rooted in fear rather than principle, is. As Seneca wisely observed: "He who is brave is free." The courage to assert yourself, to establish boundaries, to remain kind yet not subservient: that is true freedom. So ask yourself: is your kindness a choice, or is it a strategy? Are you kind because it aligns with your character, or because you hope to be liked in return? If your answer leans toward the latter, you must reassess your approach. Kindness should be an extension of strength, not a symptom of emotional dependence. True Stoicism teaches us to act in accordance with virtue, to do what is right without being attached to how others perceive us. If
people respect you for your kindness, let it be because they recognize it as a reflection of your inner stability, not because they see an opportunity to exploit it. In the end, the only approval that truly matters is the one you give yourself. Do you think kindness without self-respect leads to being taken for granted? Many mistake people-pleasing for genuine kindness, but true virtue comes from inner strength, not the fear of rejection. Share your thoughts below: kindness without self-respect invites disrespect. Number seven: true kindness comes from strength, not approval. True kindness stems from strength, not from a need for approval. When your actions are motivated by a desire to be liked, to gain validation, or to secure affection, they lose their authenticity. People can sense when kindness is transactional, when it's a silent plea for acceptance rather than a genuine expression of goodwill. The Stoics teach us that our worth is not dictated by how others perceive us, but by the virtues we embody. When you give excessively, gifts, time, attention, without it being reciprocated, and especially if your intent is to win favor, it can come across as neediness rather than generosity. And neediness repels. Imagine a man who constantly showers a woman with compliments, expensive gifts, and undivided attention, not because he genuinely wants to give, but because he hopes she will like him more. He believes that by overwhelming her with kindness she will feel compelled to reciprocate, but instead of admiration she may feel uncomfortable, even pressured, because the generosity is laced with expectation. It's not truly about her; it's about his need for validation. This dynamic plays out in friendships, workplaces, and even within families. When someone senses that your kindness is a form of emotional bribery, respect is lost. Think about it: who do you admire more, the person who gives freely because it is simply in their nature, or the one who gives with the silent hope of something in return? People are drawn
to those who are self-sufficient, who give without attachment, who are content with or without external validation. The moment you make your self-worth dependent on how others receive your kindness, you become vulnerable to manipulation and disappointment. Instead, embrace the Stoic principle of acting according to virtue, not reaction. If you are kind, be kind because it aligns with your values, not because you need something in return. The strongest relationships, romantic or otherwise, are built on mutual respect, not desperation. Consider the story of Daniel, a man who always put others first, not because he was selfless, but because he feared rejection. He would go out of his way to please, to avoid conflict, to be liked by everyone. Yet despite his constant efforts, people took him for granted. They knew he wouldn't say no, that he was always available, always seeking their approval. Over time he grew resentful, feeling used and unappreciated. But the truth was, he had set the terms of those relationships by teaching others that his kindness came with an unspoken contract: if I do this for you, will you like me? It wasn't until he learned to give without attachment, to be kind without expectation, that he found real peace. Some relationships faded, but the ones that remained were genuine. So ask yourself: is your kindness an extension of your character, or is it a strategy? Are you giving from a place of abundance, or are you hoping to receive something in return? True strength is found in self-sufficiency, in knowing that your worth is not measured by how others respond to you. When you stop seeking validation, you become the kind of person who naturally commands respect, and respect, unlike approval, is never begged for; it is earned. Kindness is a virtue, but when applied without wisdom it can become a liability. The harsh truth is that people often take for granted what is freely given, and unchecked generosity can lead to an imbalance in relationships, exploitation, and ultimately a loss of respect.
This is not because kindness itself is flawed, but because human nature tends to test limits. The Stoics understood that true kindness must be paired with self-respect. Marcus Aurelius, Seneca, and Epictetus all taught that a life of virtue requires balance: between generosity and self-preservation, between compassion and firmness. To be truly kind, you must also be discerning. You must recognize when your kindness is being valued and when it is being exploited, and above all, you must never let kindness come at the cost of your own dignity. So where do we go from here? How do we ensure that our kindness is respected rather than mistaken for weakness? This is where modern Stoicism provides us with a practical path forward. In the next section we'll explore how to command respect without losing your kindness, because being kind does not mean being passive. It does not mean saying yes to everything, and it certainly does not mean allowing others to take advantage of you. How to command respect without losing your kindness. Many believe that being kind means always saying yes, avoiding conflict, and putting others first. But as we've seen, unchecked kindness can lead to disrespect and burnout. So does this mean you should stop being kind? Not at all. Instead, you must practice kindness with wisdom and boundaries. Stoicism teaches that true kindness isn't about pleasing everyone; it's about acting with purpose. Marcus Aurelius led with virtue, but never at the expense of self-respect. To command respect, you must set limits, give intentionally, and ensure your kindness is valued, not exploited. In this section we'll explore how to balance kindness with strength, ensuring that your generosity earns respect rather than invites entitlement. Let's begin. Number one: kindness must be intentional, not automatic. Commanding respect while maintaining kindness is a delicate balance, but it is not about people-pleasing or seeking approval; it is about acting with intention. Marcus Aurelius, one of history's greatest
Stoic philosophers, was known for his generosity and fairness, but he never allowed himself to be controlled by the expectations of others. His kindness was a choice, not an obligation. This is a crucial distinction in modern Stoicism: to be truly kind, one must be deliberate rather than reactive. Too often people mistake kindness for weakness, thinking that saying yes to every request earns them admiration. In reality, respect is built on boundaries, not blind compliance. Before extending your help or agreeing to something, pause and ask yourself: am I doing this because I genuinely want to, or am I acting out of fear of disapproval? Seneca once said: "If you wish to be loved, love." But this love must be given freely, not extracted through guilt or pressure. When you make kindness intentional, it transforms from an act of appeasement into an expression of your values. This is a Stoic secret: true kindness does not seek validation; it stems from inner strength. In today's world, where social obligations, workplace expectations, and personal relationships often blur the lines between generosity and self-sacrifice, it is vital to recognize that saying no does not make you unkind; it makes you discerning. Consider the difference between a leader who helps because they fear conflict versus one who helps because they see value in doing so. The latter commands respect because their kindness is grounded in principle, not insecurity. People sense when kindness is genuine and when it is laced with silent resentment. If you are constantly overextending yourself to avoid disappointing others, you are not being kind; you are being controlled. The modern Stoic understands that respect is earned by standing firm in their choices, not by bending to every demand. This does not mean turning cold or indifferent. The key is to be as generous with your kindness as you are with your discipline. Epictetus reminds us: "No man is free who is not master of himself." If you allow external pressure to dictate your generosity, you
are no longer in command of your own will. Instead, practice mindful kindness: give when it aligns with your principles, not when it is expected of you. In the workplace, for instance, a boss who is always accommodating out of fear of being disliked will soon be taken for granted. However, a leader who helps when it makes sense, while setting firm expectations, earns both respect and appreciation. In personal relationships the same rule applies. Consider a friend who always says yes to favors, even at their own expense. Over time their kindness loses its value because it is given without discernment. But a friend who helps thoughtfully, who knows when to give and when to say no, is respected because their kindness holds weight. This is why intentional kindness is so powerful: it is rare, it is valuable, and it is given with meaning. As the Stoics teach, respect comes not from being agreeable but from being authentic. The modern world often pressures us to be endlessly accommodating, mistaking self-sacrifice for virtue, but self-sacrifice without purpose leads to resentment, not respect. True kindness, as understood in modern Stoicism, is neither weak nor passive; it is strong, deliberate, and aligned with your values. To command respect without losing your kindness, start by making each act of generosity a conscious decision rather than an automatic reaction. Train yourself to pause before saying yes, ensuring that your kindness is an expression of your strength, not a response to fear. As you practice this, you will notice something remarkable: people will respect you more, not less. They will see that your kindness is not a tool for approval but a reflection of your inner power. This is the secret of those who live by Stoic wisdom: they do not seek to please, yet they are deeply respected. They do not chase validation, yet they are valued. And they do not give out of fear, but out of choice. Number two: say no without explaining yourself. Respect and kindness are not opposites; in fact, the most respected people often
possess both in perfect balance. But one of the quickest ways to lose respect is to stretch yourself too thin, to always be available, always saying yes, until your time, energy, and even self-worth become diluted. Seneca said: "He who is everywhere is nowhere." If you try to please everyone, you'll end up pleasing no one, not even yourself. The ability to say no without justifying, without over-explaining, without feeling guilty, is one of the greatest strengths you can develop. It's a quiet assertion of self-respect, and the world responds to it in kind. Think about a time when someone asked you for a favor you didn't really want to do. Maybe it was staying late at work when you had already sacrificed enough, or a friend expecting you to drop everything to help when you were struggling with your own responsibilities. You wanted to say no, but you hesitated. Maybe you offered an excuse or softened your refusal with too much explanation. But why? Why do we feel the need to justify protecting our own time and energy? Often it's because we fear disappointing others or being seen as unkind. But here's the truth: when you respect your own limits, others do too. If you constantly say yes to everything, people will assume your time is free, your boundaries are flexible, and your needs come second. That's not kindness; that's self-neglect. Consider the story of James, a hardworking designer who always said yes. His colleagues knew they could count on him to pick up extra work, his friends knew he'd always be there, and his family knew he'd never say no, even if it meant sacrificing sleep and personal time. At first he felt good about being the dependable one, but over time resentment built up. He felt exhausted, used, and strangely invisible. The respect he thought he was earning by being agreeable wasn't real; it was conditional, based on his willingness to be endlessly available. One day, when his boss asked him to take on yet another last-minute project, James did something different. He simply said: "I can't do that."
No excuse, no elaborate reason, just a firm, clear statement. The room was silent for a moment, then his boss nodded and moved on. James realized in that moment that he had been giving away his power all along; the fear of saying no had been far worse than the reality of it. When you start saying no with confidence, you may notice a shift in how people treat you. Some will push back, especially if they've benefited from your constant compliance, but others will respect you more, recognizing that you are someone who values yourself. And the most surprising thing? The world doesn't end when you say no. Your true friends, the people who genuinely respect you, won't leave because you set a boundary. They'll stay, and they'll probably admire you even more for it. Ask yourself this: what would change in your life if you stopped over-explaining your refusals? How much energy would you reclaim if you reserved your time for what truly matters? Learning to say no isn't about being harsh; it's about being clear. A simple "I can't" or "that doesn't work for me" is enough. You don't owe anyone an elaborate justification for prioritizing your well-being. True kindness isn't about sacrificing yourself; it's about offering your best self to the world, and you can only do that when you protect your energy. So the next time you feel pressured to explain your no, pause. Let it stand on its own. Respect yourself first, and others will follow. Number three: reward appreciation, not entitlement. Respect is not about being feared or blindly obeyed; it's about being valued. And one of the strongest ways to ensure you are respected without losing your kindness is by rewarding appreciation, not entitlement. Imagine this: you offer someone your time, your help, your patience, and they genuinely appreciate it. They recognize your effort, they thank you, and they show gratitude in return. You feel motivated to continue giving, to continue being there, because you know your kindness is respected. But now imagine another scenario: someone doesn't
acknowledge your efforts; instead, they expect them. They assume you will always be there, always saying yes, always offering your kindness without question. The moment someone demands your kindness rather than appreciates it, they reveal their entitlement, and this is where you must draw the line. As Marcus Aurelius, one of the greatest Stoic philosophers, said: "The happiness of your life depends upon the quality of your thoughts." And this applies to how you allow others to treat you. If you continue to give kindness to those who feel entitled to it, you reinforce the wrong mindset, not just in them but in yourself. You teach them that you are always available, no matter how they treat you. But is that truly an act of kindness, or is it self-neglect? In Stoicism, self-respect is non-negotiable. The Stoics believed in virtue, justice, and wisdom, and part of that wisdom is knowing when to give and when to step back. Seneca once wrote: "He who is not a good servant will not be a good master." This means that if you don't command respect through your actions, if you let others walk over your kindness, you lose control, not just over them but over yourself. The person who respects themselves knows that kindness should never be given out of obligation; it is a gift, not a debt. When people see that you reward appreciation and not entitlement, they begin to respect your boundaries. They understand that your kindness is not a weakness but a choice, and this is where the power lies. Too often we fear that if we stop giving, if we withdraw our kindness from those who take it for granted, we will be seen as rude, unkind, or selfish. But ask yourself: why is it selfish to protect your energy? Why is it rude to expect basic respect in return? The truth is, it is not. There is a crucial difference between kindness and people-pleasing. Kindness is intentional, strong, and wise. People-pleasing, on the other hand, is rooted in fear: the fear of rejection, the fear of conflict, the fear of being disliked. In other words, if setting boundaries means
some people see you as unkind, let them. Your duty is not to meet the expectations of those who feel entitled to you; it is to live virtuously, with self-respect and wisdom. A truly kind person does not just give endlessly; they give wisely. They recognize that kindness without boundaries turns into self-sacrifice, and self-sacrifice without purpose leads to resentment. Have you ever found yourself exhausted, drained, or frustrated because you kept giving to someone who never appreciated it? That frustration is a signal. It is your mind telling you that something is off balance, and balance is essential. Just as the Stoics believed in controlling what is within our power, you must take control of your kindness. Ask yourself: who truly values what I give? Who sees my kindness as a gift rather than an expectation? Who, if I stopped giving, would still respect me? These are the people worthy of your time, your effort, your kindness. So what is the takeaway? Do not be afraid to withdraw your kindness from those who demand it. Do not let fear dictate how you set your boundaries. Instead, practice wise generosity: give where appreciation exists, and where it does not, let go without guilt. You are not unkind for choosing who gets access to your energy. You are simply living by the Stoic lessons that have guided the greatest thinkers of history: self-respect, wisdom, and the courage to stand firm in your values. Because in the end, respect is not commanded through endless giving; it is earned through the way you value yourself. And when you respect yourself, others have no choice but to do the same. If you've watched up to this point, you're already on the path to understanding the hidden dynamics of kindness and respect in today's world. Comment below with "stoic strength" to affirm your commitment to mastering modern Stoicism. But don't stop here; there's still valuable insight ahead that can change the way you navigate respect, boundaries, and personal power. Stay until the end to uncover how true kindness,
guided by wisdom, earns genuine respect. Number four: be generous, but not at your own expense. Being generous is a noble quality, but without boundaries it can lead to being undervalued or even taken for granted. Modern Stoicism teaches us that true kindness is not about self-sacrifice to the point of exhaustion, but about giving wisely. To command respect without losing your kindness, practice generosity from a place of strength, not depletion. If you constantly give without regard for your own well-being, you risk becoming a resource rather than a person in the eyes of others. The key to balanced generosity lies in discernment: helping others while ensuring that your kindness is not exploited. A common mistake people make is believing that being available and accommodating at all times earns them respect. In reality, the opposite often happens: when you give without boundaries, some will come to expect it, and others will see it as a weakness to exploit. This is why setting limits is not an act of selfishness; it is an act of self-respect. This means holding yourself to high standards of kindness while also expecting fair treatment from others. If a friend only reaches out when they need something but disappears when you need support, it's not wrong to step back. People will respect you more when they realize that your generosity comes with principles. A helpful rule in navigating generosity is to give from abundance, not depletion. If giving drains you, whether it's time, energy, or resources, you need to reassess your approach. Imagine someone who always says yes to extra work, hoping for recognition, only to be overlooked for promotions while those who set boundaries are respected. This happens because people respect what is valued and take for granted what is always available. A wise leader knows that saying no at times allows them to say yes with greater impact when it truly matters. It is the same in personal relationships: if you are always available to solve problems for others while
neglecting your own, you teach them that your time is worth less than theirs. True generosity is not about sacrificing yourself, but about offering help in a way that maintains your dignity and well-being. In modern life, where people are often overwhelmed by demands from work, family, and social obligations, understanding the Stoic secrets of generosity is crucial. The world is full of people who will take what you give without thinking twice, but it is your responsibility to define your limits. A simple test: if your generosity leaves you feeling drained, unappreciated, or resentful, it's time to adjust. Those who truly value you will respect the boundaries you set, while those who only seek to benefit from you may fade away, and that is a good thing. This is not about withholding kindness, but about ensuring that it is given to those who deserve it. As Epictetus wisely noted: "Attach yourself to what is spiritually superior, regardless of what other people think or do. Hold to your true aspirations no matter what is going on around you." Your kindness is a gift, but only when given with wisdom does it truly command respect. Number five: be kind, but don't seek approval. If you want to command respect without losing your kindness, one of the most powerful Stoic lessons to embrace is this: be kind, but don't seek approval. Too often people confuse kindness with people-pleasing, believing that in order to be liked they must always agree, always comply, always put others before themselves, even at their own expense. But in reality, seeking approval is a weakness, not a virtue. When you make other people's opinions the measure of your self-worth, you give away your power. Marcus Aurelius wisely said: "It never ceases to amaze me: we all love ourselves more than other people, but care more about their opinion than our own." Why do we exhaust ourselves trying to be liked, trying to fit into a mold, trying to meet expectations that were never ours to begin with? When your kindness comes from a place of insecurity,
when you say yes just to avoid conflict, when you go along with something just to keep the peace, people will sense it. And here's the hard truth: they will respect you less, not more. True kindness is an act of strength, not submission. A truly kind person does not need validation to feel whole. They do good not because they want something in return, but because it aligns with their principles. Epictetus taught: "If you wish to be good, first believe that you are bad." That is to say, recognize the ways in which you compromise yourself, notice where your need for approval is dictating your actions, and then correct it. Kindness, when done right, is not about making others comfortable at your own expense; it's about embodying your values regardless of how others respond. Imagine a scenario where someone presents an idea that you don't agree with. A people-pleaser might nod along, forcing a smile, afraid to challenge the moment. But a strong, kind person will hold their ground while remaining respectful: "I see where you're coming from, but I have a different perspective." A simple sentence, but one that shows you have your own mind. You don't need to agree to be agreeable. You don't need to please to be respected. Ask yourself: how often do you say yes when you really mean no? How many times have you swallowed your truth just to avoid disappointing someone else? If you want to command respect, start by respecting yourself. The Stoics believed that virtue (courage, wisdom, justice, and temperance) should be the guiding force of your life, not the shifting opinions of others. When you live with integrity, when your kindness is rooted in genuine goodwill rather than a desperate need to be liked, people notice. They might not always agree with you, but they will respect you, and more importantly, you will respect yourself. So the next time you find yourself hesitating, afraid to express your thoughts or enforce your boundaries, remember this: you were not put on this Earth to be agreeable. You were put here to be
strong, wise, and virtuous. Choose kindness, yes, but choose it on your terms, not as a currency for approval. Now I want to hear from you: what's more important to you, being liked or being respected? Comment "respect over approval always" if you agree that self-worth comes before approval, or comment "being liked matters just as much" if you think being liked matters just as much. Let's see where you stand. Number six: protect your energy as fiercely as your time. If you want to command respect without losing your kindness, one of the most critical Stoic lessons to embrace is this: protect your energy as fiercely as your time. Too often people focus on guarding their schedules, setting boundaries around their availability, and ensuring their time is not wasted. But what about their emotional and mental energy? Respect is not just about how much time you give; it's about how much of yourself you give, and to whom. Marcus Aurelius wrote in Meditations: "You have power over your mind, not outside events. Realize this, and you will find strength." This wisdom applies directly to how you manage your energy. You cannot control how others act, but you can control how much of yourself you allow them to drain. Your energy is a limited resource. If you give it away too freely, to the wrong people, to pointless conflicts, to those who do not value it, you will have little left for what truly matters. Consider the difference between two types of people. One person allows everyone to vent their frustrations, solve their problems, and demand their attention without limits; by the end of the day they feel drained, frustrated, and unseen. The second person, however, is kind but selective. They support others when they can, but they do not absorb unnecessary negativity. They listen, but they do not take on burdens that are not theirs. Which person do you think commands more respect? Kindness does not mean allowing yourself to be an emotional dumping ground. Many people mistake being a good person for being endlessly available, but true
generosity, true kindness, comes from a place of strength, not exhaustion. If you constantly deplete yourself for others without replenishing your own energy, you will become resentful, not respected. This is why Stoicism teaches us to be intentional about where we place our focus. Seneca once said: "Associate with those who will make a better man of you." In other words, surround yourself with people who uplift you, not those who take without giving. There is a difference between helping someone who genuinely appreciates your kindness and allowing someone to drain your energy because they see you as an easy source of support. The modern world is full of distractions: endless notifications, unnecessary drama, and people who thrive on conflict. Every time you engage in something meaningless, you lose a piece of your energy. Ask yourself: how often do I give my mental and emotional energy to things that do not serve me? How many times have I left a conversation feeling worse than when I entered it? To command respect, you must first respect yourself enough to protect your energy. This does not mean cutting people off or becoming indifferent; it means recognizing when to engage and when to step back. It means setting boundaries not just with your time, but with your emotions. For example, if a friend only reaches out when they need something, never offering support in return, it is not unkind to limit how much energy you invest in that relationship. If a colleague constantly brings negativity into conversations, it is not rude to excuse yourself. If social media drains you rather than inspires you, it is wise to reduce your time spent on it. The strongest people are not those who give endlessly; they are those who know when to say, "I have given enough." The most respected leaders, thinkers, and mentors do not allow their energy to be dictated by external forces; they decide where to invest their focus. This is why Marcus Aurelius advised seeking the tranquility that comes when you stop caring what they say or think or
do, only what you do. When you stop allowing outside distractions to consume your inner peace, you gain power: power over yourself, power over your emotions, and ultimately power over how others treat you. So what does this mean in practice? It means setting mental boundaries as firmly as you set time boundaries. It means choosing your battles wisely, deciding which conversations deserve your energy, and knowing when to walk away from situations that add no value to your life. If you want to command respect while maintaining your kindness, remember this: energy, like time, is finite. Give it wisely, protect it fiercely, and spend it only on what truly matters. The world respects those who know their worth, and nothing signals self-respect more than guarding your energy from those who do not deserve it. Number seven: know when to walk away. Knowing when to walk away is one of the most understated yet powerful aspects of commanding respect without compromising your kindness. In a world where many equate kindness with weakness, the ability to step back from toxic dynamics sends a message far louder than words. People will often test boundaries, consciously or unconsciously, to gauge how much you are willing to tolerate. If you allow disrespect to persist, you inadvertently signal that such treatment is acceptable. However, when you decisively walk away from situations that undermine your dignity, you demonstrate an unwavering commitment to self-respect, something that naturally earns the respect of others. As the stoic philosopher Epictetus wisely said, "The more we value things outside our control, the less control we have." Your time, energy, and peace of mind are among your most valuable assets, and not everyone deserves access to them. Modern stoicism teaches that we must carefully discern where we invest our efforts. There is a fine line between being patient and being a pushover. You may think that by enduring mistreatment you are displaying resilience, but in reality you may be enabling bad behavior,
whether it's a friendship that drains your energy, a workplace that consistently undervalues your contributions, or a relationship that thrives on imbalance. Staying in such situations does not make you noble; it makes you complicit in your own suffering. True kindness is not about allowing others to walk all over you; it is about maintaining generosity while ensuring that your own worth is never diminished in the process. Walking away does not always mean burning bridges or severing ties in anger. It means making a conscious decision to remove yourself from situations where respect is absent. Sometimes distancing yourself is the only way to make people realize your value; many only understand what they had once it is gone. The moment you show that you are willing to leave when necessary, people begin to treat your presence with the respect it deserves. This is not about manipulation; it's about setting a standard for how you expect to be treated. Kindness should never come at the cost of self-worth. As Seneca stated, "Associate with people who are likely to improve you." Surrounding yourself with those who respect and uplift you is not selfish; it is essential for personal growth and mental well-being. In today's fast-paced world, where relationships and professional environments can often become transactional, it is easy to fall into the trap of overgiving. The secret of modern stoicism lies in striking the perfect balance: being kind yet firm, generous yet discerning, compassionate yet self-respecting. The ability to walk away when necessary does not make you unkind; it makes you wise. Those who truly value you will respect your boundaries, and those who do not were never worthy of your kindness in the first place. Life is too short to spend it proving your worth to those who refuse to see it. The moment you internalize this truth, you not only command respect effortlessly but also cultivate inner peace, the ultimate stoic secret to a fulfilling life. Kindness is a virtue, but without wisdom it
can lead to disrespect and exhaustion. The key is balance: being generous yet discerning, compassionate yet firm. Set boundaries, protect your energy, and give where your kindness is valued. True respect starts with self-respect. If you found this video helpful, like, share, and subscribe to the channel, and turn on notifications so you don't miss our next video on stoic wisdom for a stronger, wiser life. See you next time. Are you being too kind? Seven lessons on how to deal with those who hurt you, from modern stoicism. "Don't set yourself on fire to keep others warm." This powerful saying captures a key lesson we often overlook in our quest to be kind and generous. While kindness is a virtue that strengthens relationships and builds character, there are moments when being too kind can come at a cost: our own well-being. In today's video, we'll dive into how modern stoicism offers invaluable wisdom on balancing generosity with self-care. We'll explore seven powerful lessons on how to navigate relationships, set healthy boundaries, and stop sacrificing our mental, emotional, and physical health for the sake of others. Are you someone who tends to put others first, even when it harms you? Let's talk about how you can use stoic principles to protect your peace while still being the compassionate person you are. If you've ever struggled with setting limits in your relationships, leave a comment below and share your experience, and don't forget to subscribe to Stoic Secrets for more insights on how stoicism can help you live a life of balance, resilience, and personal growth. Number one: don't set yourself on fire to warm others. In modern life, we often find ourselves caught in the cycle of giving, whether it's helping a colleague with a project, supporting a friend through a tough time, or stepping in to fix someone else's problem. While kindness and generosity are noble virtues, there's a crucial lesson from stoicism that we must remember: don't set yourself on fire to warm others. This stoic principle speaks to the
importance of maintaining boundaries and not sacrificing your own well-being in the name of helping others. Stoicism encourages us to live in accordance with reason and virtue, which includes making thoughtful decisions rather than acting impulsively or out of an emotional desire to please others. It teaches that we must first tend to ourselves if we are to be of any true help to others. There is a fine line between offering assistance and overextending ourselves to the point of exhaustion. When we constantly give without checking in on our own needs, we risk burning out physically, emotionally, and mentally. The act of self-sacrifice, though often celebrated in modern culture, can be counterproductive if it leads to our own suffering. In today's fast-paced world, saying yes is often seen as a sign of commitment, goodwill, and even self-worth. However, this desire to be helpful or liked can make us blind to the toll it takes on our own lives. We can easily become the person who is always ready to lend a hand but never takes time for their own needs. As the stoic philosopher Epictetus wisely stated, "When you are about to start some task, stand for a moment and reflect on the nature of the task you are about to perform." This simple but profound advice encourages us to pause before jumping into another commitment. It's important to ask ourselves: will helping this person take away from my ability to care for myself? If the answer is yes, it may be time to practice the stoic virtue of self-discipline and set a boundary. This act of reflection doesn't mean we lack compassion; it simply means we recognize that true generosity comes from a place of balance, not from self-destruction. In our relationships, especially with loved ones, there's an underlying temptation to give so much of ourselves that we lose sight of our own needs. We may find ourselves taking on too much, thinking we can handle it all, but just as a candle cannot burn at both ends indefinitely, we too cannot sustain endless
self-sacrifice without burning out. Stoicism teaches us that our actions should be governed by reason, not by guilt or obligation. We need to assess whether the task at hand aligns with our values and whether it is a reasonable request. Helping others without harming ourselves requires wisdom and discernment; in modern stoicism, this means taking a step back to ensure we are not giving at the expense of our mental and physical health. Moreover, stoicism reminds us that we cannot control how others respond to our boundaries. In fact, we may face resistance or even criticism when we choose to say no, but this too is part of the stoic practice of accepting what is beyond our control. The most important thing is that our actions align with our own well-being and integrity. Marcus Aurelius, the Roman emperor and stoic philosopher, taught, "Waste no more time arguing about what a good man should be. Be one." This wisdom encourages us to act in accordance with our values without feeling the need to justify our choices to others. Saying no when needed is not a failure of kindness; it is a conscious decision to preserve our own peace and resources so we can continue to offer help when it truly serves both others and ourselves. In modern life, where the pressure to constantly give and be available can be overwhelming, practicing the art of balance is crucial. Remember that true generosity doesn't mean sacrificing your happiness or health; it means offering what you can in a sustainable and mindful way. By learning to set boundaries and make thoughtful decisions, we can live according to the wisdom of stoicism and cultivate a life that honors both our ability to help others and our need for self-care. Number two: reciprocity has an expiration date. In a world where we often seek validation, stoicism offers us an alternative: giving freely without the expectation of anything in return. This ancient philosophy teaches that the true value of generosity lies not in what we receive but in what we offer
to others. When we extend kindness, support, or love without any anticipation of reciprocation, we create a source of inner peace and fulfillment. However, as human beings, we are naturally inclined to hope for some form of acknowledgement or return, whether it's a favor, gratitude, or simply a gesture of kindness. This natural desire to receive something in return can lead to disappointment, frustration, and even bitterness when our expectations are not met. The emotional toll of expecting reciprocity can be profound, as we might start mentally tallying up what others owe us, whether it's a favor or a thank you. When these debts go unpaid, we can feel hurt or betrayed, and that emotional burden can chip away at our sense of well-being. Modern stoicism, however, teaches us to break free from this cycle of expectation. Epictetus, one of the great stoic philosophers, famously stated, "There are two rules to keep ready: that there is nothing good or bad outside my own choice, and that we should not try to lead events but to follow them." This powerful teaching reminds us that while we cannot control how others respond to our generosity, we can control how we choose to act and react. By relinquishing our expectations of reciprocation, we free ourselves from the emotional roller coaster that often accompanies unfulfilled desires. The more we give without expecting a return, the more we cultivate a sense of emotional freedom. In this way, we are no longer dependent on others to meet our emotional needs or validate our worth. Think about the peace that comes from giving for the sheer joy of it, without attaching any strings. This sense of detachment from expectations is not only liberating but essential for our mental well-being; it allows us to preserve our peace of mind even in the face of indifference or ingratitude. In the modern world, we are constantly bombarded with messages that tell us to expect more or demand better, but stoicism teaches us that true wealth doesn't come from material possessions or
reciprocal acts; it comes from the ability to give without wanting anything in return. When we practice this, we enrich our lives in ways that are far deeper than any external rewards could provide. By embracing this mindset, we maintain a sense of equanimity and inner tranquility regardless of how others respond to our kindness. As you navigate life's interactions, remember that giving
without expectation is not a sign of weakness or naivety; it is a powerful form of emotional resilience. In fact, it strengthens your inner resolve and enables you to weather the ups and downs of relationships without being tossed around by every slight or unfulfilled promise. The stoic philosopher Seneca echoed this sentiment when he said, "It is not the man who has little, but he who desires more, that is poor." By focusing on the act of giving rather than on what we might receive, we redefine our sense of wealth and fulfillment. In the end, the key to true generosity is not what we get from others but the peace we cultivate within ourselves as a result of giving freely and without expectation. In the fast-paced and often transactional world we live in today, adopting the stoic practice of giving without the need for reciprocity is not only a way to preserve your peace of mind, but it is also a profound act of self-care. It allows you to move through life with grace, undisturbed by the fluctuations of others' behavior. So the next time you offer something to someone, whether it's a helping hand, a kind word, or an act of love, remember that your true reward is not in what you receive in return but in the calm and fulfillment that come from giving freely, without the burden of expectation. This is the essence of modern stoicism: the freedom that comes when we stop seeking approval and start living according to our own principles of kindness and generosity. Number three: received requests have no limits. One of the core principles of stoicism that many of us tend to overlook in our busy, fast-paced lives is the importance of setting limits, especially when it comes to helping others. In a world that constantly demands our attention, it can feel like we're always on call, ready to assist, give advice, or offer emotional support to those who reach out. And as human beings, it's natural to want to help; we feel good when we are generous, when we show kindness, and when we make others feel
supported. But here's the catch: without clear boundaries, our willingness to help can quickly spiral into frustration, resentment, and burnout. Have you ever said yes to someone even when you felt like saying no, simply because you didn't want to disappoint them or felt guilty for not being able to help? It's easy to slip into this pattern when we lack the courage to set limits. However, this unchecked eagerness to help others can leave us emotionally drained, physically exhausted, and mentally overwhelmed. And worse, it can prevent the very people we're trying to help from developing the strength and independence they need to navigate their own lives. Take the story of a mother who spent her entire life caring for her adult daughter, who struggled with illness. The mother's love and support were constant, always available and always filled with care. But in her efforts to protect and care for her daughter, the mother unintentionally stunted her daughter's growth. She did everything for her: handled the chores, managed the finances, and even made decisions that the daughter should have been making herself. The mother's unrelenting desire to help created a pattern of dependency that kept the daughter from learning how to manage on her own. When the mother passed away, the daughter was suddenly forced to stand on her own. To everyone's surprise, she adjusted remarkably well; she stepped up, took responsibility, and began thriving without her mother's constant help. The tragedy here wasn't the loss of the mother, but that her constant giving prevented her daughter from learning how to take charge of her own life. The lesson here is simple yet profound: when we over-help, we risk preventing others from discovering their own strength. From a stoic perspective, this is a powerful illustration of why setting boundaries is not just a tool for protecting our own well-being but a crucial part of fostering independence in others. Stoicism teaches us that we must learn to distinguish between times when we can
truly offer help and times when our assistance may actually be more harmful than beneficial. As Marcus Aurelius, one of the greatest stoic philosophers, famously said, "A man's job is to stand upright, not to be kept upright by others." This quote is a reminder that while helping others is a noble and compassionate act, there's a limit to how much we should intervene in the lives of others. By constantly offering assistance without limits, we may inadvertently disempower others from developing the skills they need to face their own challenges. Think about it: how many times have you stepped in to solve someone else's problem, only to realize later that your help didn't actually solve anything, or worse, that it only delayed their growth? In those moments, it's important to ask yourself: is this a situation where my help is necessary, or is it one where this person needs to learn and grow on their own? Setting clear and healthy boundaries doesn't mean you don't care or that you're unwilling to help; it simply means that you recognize when your help will be empowering and when it might inadvertently prevent someone from standing on their own. By setting limits, you not only protect your own energy but also help the people you care about to build their own resilience. Stoic secrets like this remind us that generosity isn't just about giving without limits; it's about knowing when and how to give in a way that fosters long-term growth for both the giver and the receiver. We need to balance our kindness with wisdom, and that starts with asking: is my help really helping here, or am I just making it easier for someone to avoid their own responsibility? The next time someone asks for your assistance, take a moment to reflect. Ask yourself whether this is an opportunity to guide them toward independence, or whether you're simply doing what they could and should be doing for themselves. By setting healthy boundaries, you're ensuring that your generosity doesn't come at the cost of your well-being and
that it empowers others to manage their own lives. Boundaries are not just a way to protect your time and energy; they are a way to teach others how to take charge of their own growth. So let your kindness be a gift that supports independence rather than creating dependency. Remember, true help isn't about doing things for others but about giving them the tools and space to do things for themselves. If something from today's video resonated with you, share your thoughts in the comments below. Whether you're new here or have been with us for a while, I want to hear from you. If you're just joining us, comment "I'm new here," or if you're a seasoned member of our community, drop "I'm a seasoned member" in the comments to let us know how you've been applying these stoic principles in your life. Your engagement means so much and is a constant source of inspiration for us to keep creating meaningful content. Now let's continue our journey of stoic wisdom together. Number four: being seen and treated as fragile. In today's fast-paced world, where the pressure to be kind, helpful, and accommodating is ever present, we often overlook a critical aspect of personal well-being: the importance of setting boundaries. We may feel compelled to give freely, help whenever we can, and always say yes to the demands of others. However, if we give too much without recognizing our own limits, we risk not only burning ourselves out but also being perceived as fragile or incapable of asserting our needs. This perception can undermine our authority, erode respect, and in the long run damage our sense of self-worth. This is a fundamental lesson rooted in stoic philosophy, which emphasizes inner strength, self-control, and the importance of respecting oneself. When we fail to set clear boundaries in our relationships, we inadvertently open ourselves up to exploitation. It is easy to fall into the trap of trying to please others, driven by a desire to be liked or to feel needed. We want to be seen as generous, understanding, and
compassionate, but there is a fine line between being helpful and overextending ourselves. If we are always available, always ready to lend a hand, and never set a firm no, we send a message to others that we lack the strength to protect our time, energy, and emotional well-being. Over time, this continuous availability can lead to exhaustion and frustration, as others may take advantage of our kindness, expecting more from us than is reasonable. The issue, however, is not our desire to help; it's that we haven't properly safeguarded our own well-being by setting boundaries. Stoicism offers a powerful remedy for this situation. At its core, stoic philosophy teaches us to respect ourselves and our time. By asserting our boundaries, we communicate to others that we value our energy and resources and that we are not endlessly available for the taking. Saying no is not a sign of selfishness but an important exercise in self-respect. When we set clear limits, we redefine how others perceive us: not as a person to be exploited, but as someone who values their own time and well-being. As the Roman philosopher Cicero reminds us, "What you think of yourself is much more important than what others think of you." This simple but profound statement reflects the stoic belief that our sense of self-worth should not be defined by external approval or the opinions of others, but by our own principles and the respect we show ourselves. While saying no might feel uncomfortable, especially in a world that often equates kindness with accommodating others, it is essential for maintaining our own mental and emotional health. In modern life, we are often made to feel guilty if we don't help others or if we refuse requests that drain us. We may worry about exclusion, criticism, or being seen as unkind. These feelings are natural, but from a stoic perspective they are opportunities for growth. The discomfort we feel in asserting our boundaries reveals our attachment to the approval of others and challenges us to examine
our priorities. Stoicism teaches us that such challenges are not obstacles but tests of our inner strength and wisdom. By facing these tests, we gain valuable insights into who truly respects our boundaries and who is simply taking advantage of our generosity. Over time, we become more skilled at discerning who deserves our time and energy and who simply seeks to exploit our kindness. Setting boundaries is not about shutting ourselves off from others; it's about creating space for the things that truly matter. It's about making sure we can give to others in a sustainable way, without depleting ourselves. Healthy boundaries allow us to engage with the world from a place of strength, not fragility. They help us protect our well-being while still fostering meaningful relationships with those who respect us and reciprocate our efforts. When we say no, we are not rejecting others; we are protecting ourselves, ensuring that we can continue to contribute positively and maintain a healthy balance in our lives. Modern stoicism teaches us that by navigating the challenges of setting boundaries, we cultivate resilience and self-awareness. Each time we practice the art of saying no, we become better at balancing our generosity with self-respect, ultimately leading to deeper, more fulfilling relationships. This practice strengthens us and those around us, enriching our lives and helping us live with greater purpose and clarity. In a world that often demands more than we can give, stoicism offers a framework for reclaiming our strength and ensuring that our kindness is sustainable. By setting boundaries with respect and clarity, we can navigate our relationships with wisdom, avoid burnout, and build a life where both our own needs and the needs of others are honored. Through modern stoicism, we learn that true strength comes not from constant giving but from knowing when to say no and preserving our energy for what truly matters. Number five: we will see who our true friends are. In today's world, where
superficial and transactional relationships often dominate, stoicism encourages us to approach our interactions with discernment and wisdom. At the core of stoic philosophy is the belief that actions speak louder than words. True friendship, according to stoicism, is defined by consistent, thoughtful actions rather than grand promises or declarations. Not all relationships are built on this foundation. Often we encounter people who value us not for who we are but for what we can provide. These individuals may seek our company when we are generous with our time, energy, or resources, only to distance themselves once we stop overextending ourselves. While this can be painful, stoicism helps us view such experiences not as betrayals but as opportunities to understand the true nature of these relationships. As Marcus Aurelius wisely said, "When you are offended at any man's fault, turn to yourself and reflect in what way you are a culprit." By embracing this self-reflection, we can move past resentment and accept that others' behavior often reflects their needs and limitations rather than our worth. Stoicism also emphasizes the practice of discernment, which allows us to differentiate between genuine relationships and those that are opportunistic. It teaches us to observe not only what people say but how they act, especially in times of need. This discerning perspective is invaluable in navigating both personal and professional relationships. By focusing on those who truly appreciate us for who we are, we can protect our emotional well-being and invest our energy in relationships that are mutually beneficial. Stoicism does not discourage generosity or kindness, but it advocates for directing these qualities toward people who will value them. When we stop overextending ourselves, we create space for more authentic connections: relationships that are based on respect, reciprocity, and shared growth. By doing so, we preserve our energy and flourish in environments where our presence is respected, not
exploited. The reality is that relationships may not always stay balanced. People we thought would be there for us may turn away when the dynamic of give and take shifts. However, stoicism helps us deal with these disappointments with grace. It teaches us that we cannot control others' actions, but we can control how we respond. We are not responsible for others' choices, but we are responsible for how we navigate these situations. The stoic approach encourages us to let go of resentment and focus on cultivating relationships that support our growth and well-being. True friends are not just there in times of convenience but are those who respect our boundaries, offer support in struggles, and encourage our development. These are the relationships that bring true value to our lives. As we practice discernment, we create space for meaningful, lasting connections that enhance our lives in profound ways. These relationships, grounded in mutual respect and understanding, encourage us to reflect on the quality of the connections we maintain. Stoicism teaches that true friendship is about understanding and being understood. As Seneca said, "One of the most beautiful qualities of true friendship is to understand and to be understood." In modern life, where we are often distracted and pulled in many directions, this stoic perspective on friendship provides both clarity and a sense of peace. It reminds us that the quality of our relationships, not their quantity, is what truly matters. Modern stoicism teaches us that the true measure of friendship lies not in what others can offer us but in how they value us as individuals. By practicing discernment and reflecting on the quality of our relationships, we can identify those who genuinely support us and invest our time and energy in those connections. We are not obligated to maintain relationships that drain us or leave us feeling unappreciated. Instead, we can focus on cultivating authentic, meaningful relationships that contribute to our well-being and
fulfillment. Embracing this stoic approach frees us from the disappointment of shallow, one-sided friendships and opens the door to deeper, more rewarding connections that sustain us over time. Number six: the power of emotional detachment. One of the most commonly misunderstood concepts in stoicism is emotional detachment. Many believe it means becoming cold, indifferent, or even heartless. In reality, emotional detachment is about learning to manage our emotions so that they do not control our actions or reactions. In a world where we are constantly faced with emotional triggers, whether it's a harsh comment from a coworker, a misunderstanding with a friend, or everyday stress, stoicism offers a valuable tool for navigating this emotional turbulence. It teaches us to respond with reason, not impulse. The goal isn't to suppress or ignore our feelings but to understand them and choose how we respond to them. By doing this, we can avoid reacting in ways that do not align with our values or best interests. When we practice emotional detachment, we are not denying our feelings; we are simply preventing them from dictating our behavior. For example, imagine you're in a meeting and a colleague sharply criticizes your idea. Your first instinct might be to feel anger or frustration, and perhaps even to respond defensively. But stoic emotional detachment encourages you to pause and reflect before reacting. In that moment, you can take a deep breath, acknowledge your feelings, and choose a response that is thoughtful, measured, and aligned with your values. This pause between stimulus and response is key in stoic philosophy. It allows us to see emotions as signals, not commands. Rather than being swept away by emotional impulses, we can choose the best course of action, preserving our dignity and peace of mind. This practice of emotional detachment becomes especially important when others attempt to provoke us or manipulate our emotions. For example, if a friend or family member says something hurtful,
emotional detachment helps prevent an impulsive reaction. It doesn't mean you stop caring about others or their feelings; rather, it means you don't let their behavior disturb your inner peace. By managing our emotions, we can stay grounded and calm in situations that might otherwise lead to unnecessary conflict. This approach isn't about avoiding difficult conversations or conflict, but about responding to life's challenges from a place of clarity and reason. Take Sarah, for example. She often found herself in conflict with her friends; every time someone made a critical or hurtful comment, she immediately felt wounded, which led to arguments and strained relationships. One day, Sarah decided to practice emotional detachment. The next time a comment upset her, instead of reacting immediately with anger or hurt, she paused and took a moment to reflect. Over time, Sarah found that she wasn't as affected by the words of others. She still cared about her friends, but emotional detachment helped her respond calmly and thoughtfully, ultimately bringing her more peace. As the stoic philosopher Epictetus wisely said, "Wealth consists not in having great possessions, but in having few wants." This concept of detachment is key: when we detach from the need to control everything or everyone, we open up space for freedom. Emotional detachment allows us to preserve our peace and respond to life's challenges in a measured way, protecting our emotional well-being and avoiding unnecessary conflict. It also helps us deal with toxic individuals who might try to drain our energy or bring negativity into our lives. By practicing detachment, we can protect ourselves from their harmful behaviors and remain focused on what truly matters. It's important to note that emotional detachment is not about becoming emotionally numb or disengaged; rather, it's about consciously choosing how we respond to the world around us. When we practice detachment, we gain the ability to respond with logic and clarity instead of emotional impulsivity. This practice
helps us build healthier, more balanced relationships, because we are no longer at the mercy of emotional highs and lows. We can still care deeply about others, but we no longer let their actions determine our emotional state. Stoic secrets like this teach us that by letting go of the need to control everything, we gain control over our own happiness and inner peace. The next time you find yourself in an emotional situation, ask yourself: am I reacting out of impulse, or am I responding with calm and clarity? By practicing emotional detachment, you can maintain control over your emotions, protect your inner peace, and navigate even the most challenging situations with grace. Emotional detachment is not about being cold or detached from others; it's about being wise enough to recognize your emotions and choose the best response, no matter what life throws your way. This practice empowers you to live more peacefully, thoughtfully, and authentically.

Number seven: letting go of past hurts. Letting go of past hurts is one of the most liberating practices we can embrace in our lives. Holding on to grudges, anger, or resentment only serves to poison our own minds and spirits, leaving us trapped in negative emotions that prevent us from fully experiencing the present. In fact, clinging to these feelings doesn't harm the person who wronged us; it harms us. Modern Stoicism teaches us that forgiveness is not just a moral or ethical choice; it is a powerful means of freeing ourselves from emotional burdens that weigh us down. The pain we hold from the past often tethers us to harmful emotions, keeping us stuck in a cycle of frustration and sorrow. By choosing to release these emotional weights, we open ourselves to a life of peace, tranquility, and emotional freedom. It's crucial to understand that letting go doesn't mean forgetting or excusing the actions of others. It means making the conscious decision to sever the emotional attachment to those past events that caused us pain, choosing instead to focus on our own
healing and growth. Stoicism encourages us to focus on what is within our control, our thoughts, feelings, and responses, while accepting that we cannot change the past. In doing so, we gain the power to move forward instead of being defined by the wrongs done to us. Marcus Aurelius, one of the most revered Stoic philosophers, offers powerful guidance on this subject when he says, "The best revenge is to be unlike him who performed the injury." This wisdom teaches us that the most effective way to respond to harm is not through retaliation or bitterness, but by rising above it, maintaining our integrity, and using the experience as a stepping stone for personal growth. Choosing to rise above harm allows us to preserve our peace of mind and keep our emotional equilibrium intact. Rather than being shackled by resentment, we can reclaim our inner peace and emotional strength.

The act of letting go begins with acknowledging the pain and reflecting on its source. It is only through understanding the root causes of our emotional hurt that we can begin the process of releasing it. Mindfulness and self-reflection are invaluable tools in this journey of forgiveness; they allow us to step back and look at our emotions objectively, helping us separate the person who hurt us from the emotional baggage we carry. In the process of forgiving, we don't condone the behavior that caused us harm; we simply choose to no longer allow that behavior to have a hold on our present state of being. In this way, forgiveness becomes not a gift we give to others, but a gift we give ourselves, allowing us to break free from the chains of anger and resentment. By letting go of past hurts, we release ourselves from the cycle of pain and open up space for healing, emotional balance, and stronger, more authentic relationships. This is a practice that directly aligns with the Stoic goal of cultivating emotional resilience, allowing us to live more freely and fully in the present moment. Forgiveness is not about excusing harmful
behavior or forgetting the wrongs that were done to us; it's about choosing peace over bitterness. It's about acknowledging the hurt, learning from it, and then choosing to release the hold it has over us. It's a process of reclaiming our emotional freedom and taking back control of our lives. In doing so, we make space for a future that is not burdened by the weight of past grievances. By choosing to forgive, we become better versions of ourselves: more compassionate, more resilient, and more focused on creating a life rooted in peace rather than past pain. Modern Stoicism reminds us that we are the masters of our responses, and by letting go of past hurts we reclaim our power and create room for joy, growth, and emotional balance. When we practice forgiveness, we are not only improving our emotional health; we are also strengthening our relationships and cultivating a future that is open to possibility rather than weighed down by the shadows of the past. Letting go of past hurts is essential for emotional well-being, and it is one of the most freeing steps we can take in our personal development. By embracing the Stoic principle of forgiveness, we clear the path for emotional balance, healing, and deeper connections with others. As we let go of resentment and bitterness, we unlock the freedom to move forward with a lighter heart and a clearer mind. In this way, forgiveness becomes the key that unlocks the door to a brighter, more peaceful future, one that is no longer defined by past pain but by the strength, resilience, and wisdom we gain from overcoming it.

What do you think about setting boundaries to protect your peace? It can be tough, but it's so necessary for our emotional health. Here are three simple responses you can share in the comments: "boundaries are essential for peace," or "it's hard but necessary," or "setting limits saves energy." Be sure to watch until the end of the video, especially the section on insights on healing and setting boundaries; you'll find some deep, thought-provoking tips
that could change how you approach relationships.

Insights on healing and setting boundaries. Having explored seven crucial lessons on how to deal with those who hurt you, it's time to delve deeper into the next phase of healing and personal growth. Once we understand these lessons, the next step is to gain further insights into how to heal, set healthy boundaries, and cultivate emotional resilience. Let's now explore how you can continue your journey toward emotional well-being and self-empowerment.

Number one: developing self-worth and self-love. Self-worth is the internal compass that shapes how we perceive ourselves, and it is essential for setting boundaries that protect our emotional well-being. It is not determined by others' opinions or actions, but by recognizing our own intrinsic value. Modern Stoicism teaches us to cultivate self-worth from within rather than relying on external validation. When we learn to see ourselves as valuable and deserving of respect, we naturally create boundaries that preserve our peace of mind. This self-awareness serves as a shield, preventing others from taking advantage of us or diminishing our sense of worth. The cornerstone of this process is self-love, a practice that nurtures our emotional health and strengthens our ability to stand firm in our decisions. Self-love is not about selfishness or narcissism; it is about cultivating a balanced sense of self-respect and treating ourselves with the same kindness and compassion we would offer to a dear friend. By embracing self-love, we set an example for how we wish to be treated, and we can enforce the boundaries that safeguard our emotional well-being. Without self-love, asserting our needs or saying no can become difficult, often leaving us conflicted or guilty. Building self-worth involves understanding that our value does not depend on external approval. An essential part of this process is practicing self-compassion: when we make mistakes or face setbacks, instead of being harsh on ourselves, we learn
to treat ourselves with the same understanding we would extend to others. As the Stoic philosopher Epictetus said, "Wealth consists not in having great possessions, but in having few wants." Similarly, our true wealth lies in our ability to recognize and affirm our own worth rather than depending on others' opinions. Practicing self-compassion helps to strengthen our emotional resilience, and positive affirmations can reinforce our self-esteem. Each small victory, no matter how seemingly insignificant, should be celebrated, as it builds confidence and belief in ourselves. By acknowledging our progress, we reinforce our worthiness of love, respect, and care. Another key aspect of self-worth, as taught by Stoicism, is focusing on what we can control. In modern life, we cannot control the actions of others or external circumstances, but we can control our reactions. By cultivating self-love, we free ourselves from the need for external validation, as we no longer depend on others to feel secure in our worth. This emotional independence is crucial for developing the resilience needed to set and maintain healthy boundaries. As Marcus Aurelius wisely said, "The happiness of your life depends upon the quality of your thoughts." If we reinforce our self-worth and treat ourselves with respect, we create a solid foundation for emotional well-being. This inner strength allows us to maintain boundaries without guilt or second-guessing our decisions. In a world where external pressures and societal expectations often fuel self-doubt, developing a strong sense of self-worth has never been more important. It empowers us to prioritize our needs and establish relationships that are mutually respectful and supportive. By setting boundaries rooted in self-love, we approach others from a place of emotional strength, ensuring that our relationships enhance our lives rather than depleting us. Moreover, developing self-worth and self-love is an ongoing journey, not a one-time effort. Each day presents an opportunity to reaffirm our
value, practice self-compassion, and protect our emotional well-being. With the wisdom of modern Stoicism, we are reminded that by focusing on what we can control, our thoughts, actions, and responses, we can navigate life's challenges with resilience and peace. Through self-love, we build a deep inner strength that supports us in all areas of life, enabling us to grow, heal, and thrive. Cultivating self-worth and self-love is essential for living a fulfilling and peaceful life. By recognizing our inherent value, we create space for healthy relationships and meaningful connections. Modern Stoicism teaches us that we are the architects of our own happiness, and by embracing our worth we free ourselves from the need for external validation. This emotional independence allows us to protect our well-being while fostering relationships that are rooted in mutual respect. As we continue to nurture self-love, we equip ourselves with the emotional resilience needed to face life's challenges with confidence, creating a life that aligns with our true values and is authentic to our inner selves.

Number two: the importance of patience and understanding. One of the most powerful Stoic secrets is the virtue of patience and understanding, particularly when we face the pain caused by others. In a world where we're often encouraged to react quickly and defend ourselves in the face of hurt, Stoicism offers a different approach: creating space between our emotions and our actions. When we're hurt, our immediate instinct might be to lash out or defend ourselves, but Stoicism teaches us to pause instead. This pause allows us to reflect, process our emotions, and choose a thoughtful response rather than reacting impulsively. Practicing patience helps us build emotional resilience, ensuring that we're not controlled by our immediate reactions. It doesn't mean suppressing feelings, but rather understanding and managing them to make better decisions in difficult situations. A key aspect of patience is understanding the
behavior of others. It's easy to take offense when someone says or does something hurtful, but often their actions come from their own struggles. When we see their behavior through the lens of compassion instead of anger, we realize their actions might be more about them than about us. People often lash out because they're dealing with their own pain or unresolved issues; understanding this helps us respond with empathy, not resentment. This shift in perspective doesn't excuse harmful behavior, but it allows us to protect our peace and avoid letting their actions disrupt our emotional state. With patience, we create space for both ourselves and others to heal, enabling us to respond with more clarity and calm.

Emotional healing, too, requires patience. When we're hurt, the natural urge is to move past the pain quickly. However, emotional wounds don't heal overnight. If we rush through the process, we might only cover the wound temporarily without truly addressing the underlying issue. Stoicism teaches that emotional healing is a journey, much like physical healing. Instead of suppressing or rushing our feelings, we should give ourselves time to process and reflect on them. Patience allows us to heal more fully, gaining clarity and resilience. This process isn't always easy, but through patience we grow stronger from our experiences and emerge with a healthier mindset.

Consider a modern example: Sarah, a young woman who often found herself in conflict with her friends. Each time someone made a hurtful comment, Sarah's first reaction was anger, leading to arguments and strained relationships. One day, she decided to apply the Stoic principle of patience. The next time she was hurt, she paused and reflected. She realized her friend was struggling with personal issues and the comment wasn't a personal attack on her. This shift in perspective allowed her to respond with understanding instead of defensiveness. Over time, her practice of patience not only helped her heal emotionally but also strengthened
her relationships and brought her a deeper sense of peace. As the Stoic philosopher Epictetus said, "Wealth consists not in having great possessions, but in having few wants." In the same way, emotional wealth isn't about avoiding pain or controlling every situation, but about cultivating patience and understanding in the face of adversity. Detaching from the need for immediate resolution allows us to approach challenges with wisdom and grace. Practicing patience helps us respond thoughtfully, preventing impulsive actions we might regret. To cultivate patience in daily life, try mindfulness practices like deep breathing, meditation, or simply taking a moment before reacting to emotional triggers. These techniques help us slow down, center ourselves, and respond more clearly. The next time you're hurt or facing a challenge, ask yourself: am I reacting out of impulse, or am I responding with patience and understanding? This question can help you apply the Stoic secret of patience, enabling you to navigate life with greater clarity, emotional resilience, and peace. Patience allows us to protect our emotional well-being and respond with empathy, both for others and for ourselves. By practicing patience, we can heal, grow, and ultimately find peace in the face of adversity.

Number three: the power of perspective. One of the most powerful tools in managing hurt and adversity is perspective. Modern Stoicism teaches us that pain and suffering are inevitable, but they don't have to define us. While we can't always control what happens, we can control how we respond. Pain, rather than being our enemy, can be a catalyst for growth, resilience, and self-discovery. By adjusting our perspective, we can transform difficult situations into opportunities for personal development. Instead of letting negative emotions consume us, we can shift our view, seeing pain as a lesson rather than a burden. In this way, we lighten the emotional load and turn adversity into a stepping stone for growth. Reframing negative events is a crucial
skill for maintaining emotional balance. Instead of seeing hurtful situations as personal attacks, we can choose to view them as valuable lessons. For instance, a difficult conversation with a friend might reveal where our communication needs improvement, or a challenging situation at work may highlight areas where we need to assert ourselves more confidently. This shift in perspective doesn't deny the hurt but reframes it, allowing us to focus on what can be learned from the experience. By changing the narrative, we gain control over our emotional response, which is key to navigating life's difficulties with resilience. Resilience, the ability to bounce back from setbacks, thrives on this mindset shift. It's not about avoiding pain but learning to navigate it without losing emotional stability. Resilient individuals focus on what's within their control, their thoughts, feelings, and actions, and remain anchored in what truly matters: their integrity and growth. Instead of being paralyzed by setbacks, we use them as fuel for progress. This perspective allows us to stay grounded and move forward with determination, even in the face of adversity.

A practical tool for shifting perspective is the practice of gratitude. In a world that often highlights the negative, gratitude helps us focus on the positives, even in the toughest of times. There's always something to be grateful for: a supportive friend, a moment of peace, or simply the chance to learn from a difficult experience. Making gratitude a habit trains our minds to look for the good in every situation, helping us maintain a positive outlook even in challenging times. The Stoic philosopher Seneca wisely said, "We suffer more often in imagination than in reality." This reminds us that much of our pain comes not from external circumstances but from our own negative interpretations. Gratitude and mindfulness help us stay grounded in the present, preventing us from spiraling into despair. Another way to shift perspective is by challenging negative thoughts as they
arise. In moments of difficulty, it's easy to fall into self-pity or blame. The Stoics understood that our thoughts shape our emotional experience. If we can recognize and challenge negative thoughts, we regain control over how we respond. Acknowledging painful emotions without letting them control us allows us to reframe the situation and move forward with clarity and strength. The power of perspective isn't about denying pain or pretending challenges don't exist; it's about choosing how to respond to adversity. Stoicism teaches that we are not at the mercy of external events; we hold the key to our emotional freedom through our thoughts and attitudes. By reframing negative experiences and maintaining a resilient outlook, we reduce the emotional turbulence that life brings. As Marcus Aurelius wisely said, "The happiness of your life depends upon the quality of your thoughts." In today's world, where challenges are constant, the ability to shift perspective is more important than ever. By practicing gratitude, mindfulness, and reframing, we can navigate life's difficulties with emotional balance and purpose. The power of perspective is essential for managing hurt and adversity. By adjusting our mindset, we not only release the emotional weight of pain but also create space for growth, resilience, and emotional strength. Through the lens of modern Stoicism, we can transform hardships into opportunities for self-improvement and learning. We are not defined by the pain we experience, but by how we rise above it. By reframing challenges and focusing on what is within our control, we empower ourselves to live with greater clarity, peace, and emotional balance.

Based on the Stoic principles you've been learning, you're building a strong inner resilience to manage your emotions and create a more peaceful life. Share with us in the comments "I value myself" or "I see challenges as opportunities for growth" to let us know how you're applying these principles in your life. And don't forget to stay
tuned: there's only one lesson left, and you'll regret missing it.

Number four: practicing empathy and compassion. Empathy is the ability to understand and share the feelings of another, and it plays a vital role in managing challenging relationships and emotional pain. Even when we are hurt, empathy allows us to pause and consider the other person's motivations, struggles, and challenges. Rather than reacting impulsively with anger or resentment, empathy provides the emotional space needed to respond thoughtfully. This concept is deeply rooted in modern Stoicism, which teaches us that while we cannot control the actions of others, we can control how we react. As Epictetus wisely said, "It's not what happens to you, but how you react to it that matters." By stepping into the other person's shoes, we break free from the cycle of emotional retaliation, fostering our own healing and building healthier, more balanced relationships. This approach helps us make conscious choices that align with our values, allowing us to move forward with clarity and resilience.

Compassion, an extension of empathy, acts as an antidote to resentment. Holding on to anger or bitterness only empowers others to control our emotions, trapping us in the very pain we seek to escape. Compassion, on the other hand, releases us from this grip. It doesn't mean tolerating mistreatment, but rather approaching difficult situations with kindness and understanding, without compromising our boundaries. As Marcus Aurelius said, "The best revenge is to be unlike him who performed the injury." Compassion allows us to respond with dignity, healing from past wounds while still protecting our peace of mind. It enables us to let go of negative emotions, freeing us to move forward without becoming bitter or emotionally drained. Empathy and healthy boundaries are not mutually exclusive; they can coexist. Understanding the struggles behind another person's harmful behavior allows us to set clear and compassionate boundaries without escalating conflict. We can
acknowledge the other person's pain while asserting our own needs and protecting our emotional well-being. Modern Stoicism teaches that we have the power to control how we respond to others. By practicing empathy, we can protect our emotional health without compromising our values or becoming overwhelmed. Setting healthy boundaries ensures our peace while still respecting the humanity of others, fostering a balanced emotional environment. Cultivating empathy requires active listening and a conscious effort to understand others' perspectives. This involves more than hearing words; it means recognizing the emotions and struggles beneath the surface. Reflecting on our own experiences of pain can deepen our empathy, reminding us that everyone faces challenges, even if they are not visible. When we recall moments when we were hurt or misunderstood, we develop a greater sense of compassion for others. Empathy, in this way, becomes both a tool for personal growth and a bridge to stronger, more resilient relationships. By practicing empathy regularly, we navigate difficult relationships with more grace, setting boundaries that protect us while fostering meaningful connections.

Modern Stoicism provides a powerful framework for practicing empathy and compassion. It teaches us that we cannot control others' actions, but we can control our responses. By adopting this Stoic mindset, we learn to understand those who may hurt us, protecting ourselves in ways that foster personal growth instead of conflict. Stoic philosophy reminds us that true peace comes not from external circumstances but from maintaining inner calm and composure. When we approach hurt and betrayal with empathy and compassion, we strengthen our emotional resilience and create space to set healthy boundaries that preserve our well-being. These practices lead to a more balanced life, free from anger and resentment, enabling us to thrive emotionally and mentally. Practicing empathy and compassion doesn't mean being passive or tolerating
mistreatment. It means responding to hurt with understanding while still protecting our emotional health. Modern Stoicism teaches us that while we cannot control what others do, we can control how we react. By cultivating empathy, we approach difficult relationships with compassion, turning potential conflicts into opportunities for growth. This balanced approach not only fosters emotional healing but also strengthens relationships, empowering us to move forward with greater peace and clarity in our lives.

As we wrap up today's video, I want to remind you of the key takeaways. Setting boundaries is not just important; it's essential for maintaining your mental and emotional well-being. By establishing clear boundaries, you protect yourself from burnout and preserve your energy for the things that truly matter. Remember, practicing Stoicism in daily life through self-discipline, emotional awareness, and discernment will help you build stronger, healthier relationships and navigate challenges with resilience and peace of mind. Now take a moment to reflect on your current relationships. Are there areas where you've been overextending yourself? Where can you set healthier boundaries to prioritize your own needs? If you found today's lesson helpful, please like this video, share it with someone who could benefit, and subscribe to Stoic Secrets for more content on Stoicism and personal growth. Don't forget to turn on notifications so you never miss out on our upcoming videos. Thank you for being here today. Remember, it's okay to say no when you need to, and true generosity always comes from a place of balance and self-respect. I wish you strength and peace as you continue to apply Stoic principles in your life. See you in the next video.

Nine ways kindness will ruin your life: modern Stoicism. While kindness is often celebrated as one of life's greatest virtues, what happens when that kindness begins to hurt more than it helps? Society constantly pushes us to put others first, encouraging
selflessness as the ultimate goal. But in the modern world, excessive kindness can have unintended consequences: it can leave you drained, exploited, and even stripped of your own sense of self-worth. Here at Stoic Secrets, we uncover the truths behind modern Stoicism and how its ancient wisdom can help us navigate these challenges. In this video, we'll explore nine ways kindness will ruin your life and reveal how the principles of Stoicism can empower you to set boundaries, protect your well-being, and transform your relationships for the better. Stay with us as we uncover how to make kindness a strength, not a sacrifice.

Number one: people will take advantage. Kindness is often celebrated as a noble and admirable virtue, a quality that strengthens relationships and fosters goodwill. However, when offered without boundaries, it can become a double-edged sword, cutting into your well-being and opening the door for others to take advantage of your generosity. Excessive kindness, given freely and without limits, sends the message that your resources, whether time, energy, or effort, are infinite and always available. This creates fertile ground for exploitation, where people begin to rely on you not because they value your help, but because it has become convenient for them. Over time, this imbalance subtly erodes mutual respect and leaves you feeling unappreciated, even resentful. Picture a coworker who constantly leans on you to complete their tasks but never offers to assist you in return. Such a scenario illustrates how unchecked kindness fosters dependence, undervaluing your contributions while ignoring your boundaries. This cycle of overextending yourself is not just emotionally draining but also counterproductive. When you consistently give without considering your own needs, you inadvertently teach others that your kindness is not a gift but an obligation. They may come to expect your help as a given rather than appreciating it as an intentional act of goodwill. Marcus Aurelius, one of the great
Stoic philosophers, reminds us: "You have power over your mind, not outside events. Realize this, and you will find strength." His words underscore the importance of self-awareness and mindful action, qualities that modern Stoicism emphasizes as essential for navigating the complexities of contemporary life. By understanding your own limits and acting with intention, you can ensure that your kindness remains meaningful and does not become a source of personal depletion. In today's fast-paced world, where demands on our time and energy are constant, the principle of setting boundaries is more important than ever. Boundaries are not about denying kindness but about protecting its integrity and ensuring that it is given from a place of genuine care rather than obligation. When you communicate your limits confidently and assertively, you teach others to respect your time and effort. This clarity fosters a dynamic where kindness is valued and mutual respect is preserved. Setting these boundaries may feel uncomfortable at first, but it ultimately empowers you to maintain balance and prioritize your well-being. Kindness, when practiced with wisdom and moderation, becomes a source of strength rather than a vulnerability. Modern Stoicism offers valuable guidance here, teaching us to approach kindness not as a limitless resource but as a deliberate choice that aligns with our principles. By embracing the Stoic ideal of temperance, you transform your kindness into a practice that uplifts both yourself and others, creating meaningful connections that are rooted in mutual respect. As you navigate life's complexities, remember that kindness should not come at the expense of your own inner harmony. By balancing compassion with self-awareness, you create a life where your generosity remains a powerful force for good, both for others and for yourself.

Number two: you may be seen as weak. Kindness, when given freely and without boundaries, can sometimes bring about unintended and even painful consequences. What starts as
a genuine desire to help can be misunderstood as a lack of resolve or strength, encouraging others to take advantage of your accommodating nature. Have you ever had a friend who repeatedly borrows money, never repaying it, not because they can't but because they see your generosity as weakness? Or perhaps a colleague who constantly relies on you to clean up their messes, assuming you'll always step in? Over time, these situations can leave you feeling drained, unappreciated, and even disrespected. Your energy is consumed, your confidence eroded, and you may begin to wonder why your kindness doesn't lead to the connection and appreciation you had hoped for. The respect you once commanded starts to diminish as others assume you'll always comply, no matter how inconvenient or costly it is to you. This isn't just a blow to your emotional well-being; it's a quiet, insidious erosion of your dignity. Stoicism, however, offers a perspective that can help you navigate this delicate balance. The philosophy teaches that kindness, while a noble virtue, is most powerful when it is deliberate and measured; kindness without boundaries loses its essence and often its value. Picture a workplace scenario: a colleague consistently dumps last-minute tasks on you, knowing you'll never say no. At first, you take on the extra work out of a desire to be helpful, or a fear of being seen as uncooperative, but the weight of their responsibilities starts to overwhelm you. Finally, you draw a line, calmly explaining that while you're happy to support when needed, you can't manage additional tasks at the expense of your own priorities. To your surprise, they don't react negatively. Instead, they acknowledge your honesty and make an effort to respect your time. By asserting yourself, you not only protect your energy but also shift the dynamic, earning respect and fostering a healthier interaction. The Stoic secrets of inner strength and self-discipline remind us that saying no is not an act of cruelty but a declaration of self-respect.
When you practice assertiveness, you send a clear message: your kindness is a conscious choice, not a limitless resource to be taken for granted. This approach allows you to build relationships grounded in mutual respect, where kindness is a shared exchange rather than a one-sided expectation. True kindness uplifts both the giver and the receiver, creating connections rooted in understanding and balance. Take a moment to reflect: how often have you felt depleted because someone mistook your kindness for weakness? Could things have been different if you had established firmer boundaries? Stoicism doesn't ask you to close your heart or become indifferent. Instead, it calls you to protect your energy and ensure that your actions stem from strength and clarity. When you give freely but thoughtfully, your kindness becomes more impactful and sustainable. The next time you're tempted to stretch yourself too thin, pause and ask: is this act of kindness coming from a place of genuine willingness, or is it depleting me? Are you helping others while neglecting your own well-being? Remember, the courage to say no when necessary is not selfish; it's an act of self-preservation that ensures you can continue to give authentically. Even the smallest no can be one of the kindest gifts you offer to yourself and to those around you. By embracing this balance, you'll find that your kindness becomes a source of strength, enriching your relationships and your life in ways that are both profound and enduring.

Number three: your priorities will be ignored. Let's take a deeper dive into kindness and how, when it's not balanced, it can become a silent thief of both your priorities and your personal growth. Picture this: you've become the go-to person for everyone around you. A coworker needs help finishing a project, a friend needs advice late into the night, your neighbor needs a hand with their latest home improvement. Naturally, you step up. Helping others feels rewarding at first; it's uplifting to know you're making someone's
life easier bringing a smile to their face or being the person they can count on but over time you begin to notice something the things that are most important to you are slipping further and further down your priority list that weekend project you wanted to finish still untouched the quiet evening of rest you promised yourself forgotten you're running on empty frustrated and wondering where all your energy went have you ever asked yourself at what point does helping others turn into sacrificing myself this is where modern stoicism provides much-needed clarity and direction if you're constantly pouring yourself into others' needs while neglecting your own you're not engaging in kindness you're engaging in self-neglect and here's the hard truth when you allow your well-being to erode the kindness you offer becomes less effective and authentic how can you truly support others if your own foundation is crumbling true kindness doesn't require you to exhaust yourself or abandon your personal goals it comes from a place of balance where you can give generously because you've taken care of your own needs first think of it like this you can't pour from an empty cup when you fail to set boundaries or say no when necessary you risk burnout resentment and a gradual loss of self-respect Epictetus wisely observed no man is free who is not master of himself so ask yourself are you in control of your time or are you letting the demands of others dictate your life the key to preserving both your kindness and your sense of self lies in learning to set firm compassionate boundaries saying no isn't a rejection of others it's an affirmation of your priorities and self-respect take a moment to reflect on what truly matters to you when someone asks for your help pause and ask yourself will this align with the life I want to build communicating your needs openly and honestly with yourself and with others is one of the most empowering acts of self-care you can practice it's not about being 
selfish it’s about ensuring that your kindness is sustainable and that it enhances your life as much as it supports others remember your time and energy are finite resources and how you spend them shapes the person you become balance is everything when you protect your priorities you’re not just benefiting yourself you’re also ensuring that the help and support you offer come from a place of genuine strength and abundance modern stoicism teaches us to live intentionally to focus on what we can control and to build lives filled with purpose and fulfillment so let me leave you with this question how can you give your best to the world if you’re not being true to yourself number four you will attract opportunists excessive kindness though admirable in intent can sometimes have unintended consequences attracting opportunists who see your generosity not as a meaningful Exchange but as an endless resource to exploit imagine a colleague who consistently asks for favors borrows your time or leans on your support yet never reciprocates when you are in need these are not mere instances of imbalanced kindness they are warning signs of relationships that take far more than they give over time such Dynamics do more than exhaust your physical energy they deplete your emotional Reserves leaving you feeling unvalued and drained the wisdom of the ancient stoic philosopher epicus offers Insight here it is impossible for a man to learn what he thinks he already knows this serves as a call to approach relationships with Clarity and self-awareness recognizing the critical difference between genuine connections and exploitative ones the core of stoic philosophy lies in discernment the ability to evaluate situations and relationships with wisdom and precision this is especially vital in today’s fastpaced and interconnected world where opportunities for connection abound but so too do the risks of engaging with individuals who lack mutual respect or Genuine appreciation relationships that 
thrive are built on shared effort mutual care and a sense of equality while those with opportunists often become imbalanced leaving one party to carry the weight of the connection perhaps you've encountered people who consistently demand your attention time or resources but never offer anything meaningful in return in such situations the stoic secret to peace lies in establishing and maintaining boundaries a practice that isn't selfish but essential for preserving your self-worth and well-being setting clear expectations in relationships is a profound act of self-respect by observing how others respond to your boundaries you can discern who truly values your kindness and who merely seeks to benefit at your expense those who genuinely care will respect your limits while opportunists will often become frustrated or withdraw when they realize they cannot take advantage of you Seneca's timeless advice associate with those who will make a better man of you serves as a reminder to carefully choose your companions and focus on fostering relationships that contribute to your growth and happiness by prioritizing connections with individuals who uplift and support you you align yourself with stoic principles of balance and virtue ensuring your kindness is met with equal appreciation and reciprocity kindness should never come at the cost of your inner peace or your emotional stability it is a powerful and transformative force but it must be guided by wisdom and self-awareness to wield kindness effectively you must learn to balance generosity with discernment understanding that not every relationship is worth your time and energy by practicing self-reflection and remaining vigilant in your interactions you protect yourself from the emotional toll of one-sided connections instead you create space for meaningful enriching relationships that inspire and fulfill you the stoic secrets of discernment and self-awareness provide timeless guidance for navigating these challenges 
allowing your kindness to shine as a light that warms others while preserving your own flame in doing so you live in harmony with stoic ideals embodying a life of wisdom virtue and resilience let me ask you this are you ready to reclaim your time protect your energy and align your relationships with your values if so take the first step now like this post and share your thoughts below kindness with wisdom is power remember true strength lies not in giving endlessly but in discerning where your kindness will truly thrive number five you will be doubted kindness though often celebrated as one of humanity's greatest virtues can sometimes bring about unexpected challenges for those who practice it consistently and wholeheartedly while acts of generosity are generally appreciated when your kindness becomes frequent or seemingly excessive it may invite unwarranted skepticism people might question your motives suspecting that your actions are more strategic than sincere as if hidden agendas were driving your goodwill for instance if you consistently assist a coworker with their tasks the quiet hum of office gossip might suggest you are seeking to curry favor with your boss rather than simply extending a helping hand this type of suspicion though often baseless has a way of straining relationships and unsettling your confidence over time the weight of being misunderstood or undervalued might even lead you to question the worth of your kindness causing you to suppress your natural inclination to do good modern stoicism however offers an empowering perspective to navigate these moments of doubt it encourages us to ground our actions in our principles and focus on the purity of our intent rather than the shifting perceptions of others Marcus Aurelius wisely noted waste no more time arguing about what a good man should be be one this timeless reminder serves as a beacon guiding us to act according to our values regardless of how others interpret our actions if your kindness 
stems from a genuine place of virtue and integrity no amount of external doubt should deter you yes misunderstandings and skepticism are inevitable but your purpose is not to control others' opinions it is to remain steadfast in living as the person you strive to be the key to overcoming this challenge lies in unwavering consistency when your actions consistently reflect your values they create a pattern of sincerity that becomes impossible to ignore even those who doubt your motives initially may come to recognize the authenticity behind your deeds as time unfolds while you cannot expect to change every skeptical mind those who truly matter will eventually understand your intentions in the meantime it is essential to remember that others' judgments are beyond your control as Epictetus wisely observed it is not what happens to you but how you react to it that matters holding on to this perspective allows you to maintain your integrity and peace of mind in the face of external doubts in today's world where cynicism often overshadows goodwill staying committed to kindness requires resilience and self-awareness your acts of generosity are not transactional they are a reflection of your character and values if others doubt your motives resist the temptation to retreat into bitterness or defensiveness instead treat these situations as opportunities to practice patience and self-discipline knowing that your worth is not tied to others' perceptions modern stoicism reminds us that a life lived in harmony with our principles is its own reward by embracing this truth you free yourself from the weight of external validation and discover that the value of kindness when rooted in virtue far surpasses any shadow of doubt let this conviction anchor you for kindness is never wasted when it arises from a place of integrity and authenticity number six you may become dependent on others have you ever noticed how excessive kindness despite its good intentions can sometimes backfire leaving you feeling vulnerable or even disempowered one of the subtle 
dangers of being overly kind is falling into a cycle of dependency not just for practical support but for emotional validation when you consistently put others' needs ahead of your own it can teach you often without realizing it to lean on their approval to feel worthy or valued imagine this have you ever delayed an important decision waiting for a friend or loved one to weigh in just so you could feel reassured while it might seem harmless at first over time this pattern can quietly erode your confidence making you doubt your own instincts and judgment it's almost like handing over the control of your happiness to someone else only to feel adrift and unsure when that support disappears let me share a story that illustrates this Clara a woman admired for her boundless kindness was always there for her friends she was the one everyone could count on offering a listening ear solving problems and sacrificing her own needs to lift others up but as time went on Clara realized she had unknowingly become reliant on the feedback of her loved ones for every major decision in her life when her closest friend moved abroad Clara suddenly felt lost uncertain and unable to trust her own choices it was a harsh wake-up call she had spent so much time prioritizing others and seeking their input that she had forgotten how to stand on her own two feet Clara's journey back to self-reliance wasn't easy it required uncomfortable periods of solitude self-reflection and rebuilding her trust in herself but through this process she uncovered a strength she hadn't realized she possessed and she grew more resilient and self-assured this is where stoicism comes in stoicism a philosophy rooted in self-mastery reminds us that our true worth and happiness come from within not from the ever-changing opinions or actions of others when we become overly reliant on external validation we give others the power to dictate our inner peace this leaves us vulnerable not just to disappointment but also to 
manipulation or emotional instability when that validation is no longer available one of the most valuable stoic secrets is learning to cultivate your inner strength and embrace solitude as a way to grow your resilience this doesn't mean you shut others out or stop seeking connection instead it's about building a strong foundation of self-trust so that the support of others is a bonus not a necessity think about your own life are there times when you feel stuck unable to move forward without someone else's input how often do you find yourself doubting your own decisions waiting for validation before taking a step these might be signs that it's time to look inward and build your independence start small maybe today you make a decision any decision without consulting anyone else notice how it feels even if it's uncomfortable at first over time as you practice trusting yourself you'll find that your confidence grows and you'll rely less on external validation to feel grounded here's the lesson kindness is a beautiful and necessary part of life but it must be balanced with self-reliance when you cultivate the ability to trust your own judgment and embrace the stillness of solitude you strengthen not only your independence but also your relationships instead of giving out of a place of neediness you give from a position of balance and inner strength this in turn allows you to live a life of greater fulfillment and peace remember the key to a fulfilled life isn't found in how much you give to others it's found in how much you nurture your own resilience and inner strength so ask yourself are you ready to embrace this balance and reclaim control over your happiness number seven you will have unrealistic expectations one of the pitfalls of being overly kind is that it can lead to unrealistic expectations and this is a lesson many of us learn the hard way when you give generously of yourself your time your energy your support it's natural to hope even subconsciously that 
others will do the same in return after all doesn't the world feel like it should operate on the principle of fairness but let me ask you how often does reality actually meet those hopes perhaps you've gone out of your way to help a friend during a difficult time only to find that when your own life took a turn for the worse they weren't there to offer the same level of support that feeling of disappointment can sting and when it lingers it can even grow into resentment carrying that kind of emotional weight is exhausting and it's here that modern stoicism offers us a lifeline helping us realign our focus with what we can truly control our actions and reactions Marcus Aurelius one of the most revered stoic philosophers reminds us in his meditations you have power over your mind not outside events realize this and you will find strength this simple yet profound truth has the power to shift how we approach kindness by expecting others to reciprocate our generosity in the exact way we envision we hand over control of our emotional well-being to forces we cannot govern think about that for a moment do you really want your happiness your peace of mind to depend on whether others meet your expectations it's a precarious position to be in and the burden can be heavy are you prepared to carry that weight or is it time to reexamine how and why you give in the first place the problem lies not in kindness itself but in the invisible strings we sometimes attach to it when we tie our happiness to how others respond to our generosity we create a recipe for frustration it's like expecting a garden to bloom exactly the way you imagined forgetting that the soil the weather and the seeds all have their own will but what if you let go of that expectation entirely what if instead of giving with a silent hope for a return you gave simply because it aligns with your values Seneca another brilliant stoic thinker wisely observed he who is brave is free and isn't it an act of 
courage to give without expecting anything back when you free yourself from the need for others to meet your standards you also free your emotions you can begin to accept people as they are in all their flaws and complexities rather than feeling disappointed when they don't behave as you'd hoped this shift doesn't just protect your peace it enhances your relationships making them more authentic and less transactional letting go of expectations is not an overnight process it's a practice a mindset that requires patience and introspection it means confronting your own habits of mind and asking tough questions why do I give what am I hoping for in return am I aligning my actions with my values or am I seeking external validation every step you take toward answering these questions helps you cultivate a lighter freer self modern stoicism teaches us that kindness should never be a transaction it should be an expression of who you are at your core a reflection of your character and your commitment to making the world a little brighter think of kindness as planting a seed you don't plant it because you're certain of a harvest or because you demand fruit from it you plant it because it's an act of faith a way to contribute to the beauty and vitality of the world isn't that reason enough when you embrace this perspective you'll find that the rewards of kindness come not from what others give you in return but from the quiet satisfaction of knowing you've acted in alignment with your values and here's the beauty of it this approach not only nurtures your own peace of mind but it also creates healthier more meaningful relationships ask yourself are you ready to embrace this form of kindness can you see how it might transform not only your relationships but your own sense of inner peace modern stoicism reminds us that kindness is one of the purest forms of strength it's not about what you gain it's about who you become through the act of giving by shifting your focus from 
expectation to intention you can reclaim your emotional freedom and walk through life with a lighter heart and a stronger sense of purpose so the next time you give do it not because you're waiting for something in return but because it's a reflection of the person you choose to be can you imagine how much lighter your life could feel if you let go of those unrealistic expectations it's worth considering isn't it number eight you might develop harmful habits overextending yourself for others often feels noble but it can quickly become a double-edged sword leading to stress and triggering harmful coping mechanisms when you constantly sacrifice your time energy and emotional reserves without prioritizing your well-being you create a dangerous imbalance in such states of exhaustion it's easy to turn to temporary comforts like overeating binge-watching TV excessive alcohol consumption or other self-destructive habits as a way to manage feelings of frustration and burnout these fleeting escapes may provide relief in the moment but they only mask the deeper issues never addressing their root causes instead they compound your challenges adding physical strain and emotional guilt to an already overwhelming situation stoicism with its core focus on moderation self-discipline and inner balance warns us against succumbing to such patterns of excess Seneca wisely noted it is not the man who has too little but the man who craves more that is poor chasing external distractions or temporary relief does not resolve the struggle within it amplifies it the key to breaking free from this cycle lies in taking a proactive approach rooted in stoic principles to confront stressors directly begin by examining the sources of your stress is it a packed schedule that leaves no room for rest unrealistic expectations from yourself or others or perhaps the difficulty of setting boundaries which often comes with the fear of disappointing people by identifying these triggers you can approach 
them with clarity and reason taking intentional steps to address the actual problem rather than merely numbing its symptoms this deliberate confrontation requires courage and reflection but it is far more effective than avoidance stoicism also teaches us that self-care is not indulgence it is wisdom as Epictetus remarked no man is free who is not master of himself this mastery begins with cultivating habits that restore your energy and promote mental clarity mindfulness is a powerful tool in this process by practicing mindfulness whether through meditation breathing exercises or simply taking time to notice the present moment you can anchor yourself in the here and now this helps you manage emotional overwhelm and maintain perspective preventing your emotions from hijacking your reason additionally creating space for restorative practices such as regular exercise journaling or spending time in nature allows you to recharge in a sustainable way building resilience against stress and reducing reliance on unhealthy behaviors a balanced life is one of harmony and stoicism consistently encourages us to avoid extremes when you embrace balance in your actions you protect yourself from the chaos of overextension and the destructive tendencies it fosters for example learning to say no to excessive demands is not selfish it is an act of self-respect and self-preservation by doing so you conserve the energy needed to focus on what truly matters and ensure that the kindness you offer others stems from a place of abundance not depletion developing healthier habits aligned with reason and virtue not only safeguards your well-being but also strengthens your ability to navigate challenges effectively and with dignity these stoic secrets remind us that life's demands will always be present but how we respond determines our peace of mind balance is the essence of living meaningfully it shields us from the chaos of harmful habits while empowering us to act with purpose 
ensuring that our kindness enriches both others and ourselves by prioritizing moderation and self-discipline we create a life that is steady and fulfilling a life where kindness becomes a strength rather than a burden if you've made it this far it's clear that you value thoughtful reflection and the pursuit of balance in life let's continue the conversation share your thoughts in the comments by simply saying balance begins with boundaries together we'll keep learning and growing through these powerful stoic principles stay tuned there's more eye-opening content coming your way soon number nine your boundaries will be violated failing to establish and enforce boundaries in your acts of kindness often leads to others overstepping and disregarding your limits leaving you feeling overburdened and unappreciated when your time energy or values are ignored repeatedly you may find yourself in a cycle where your goodwill is taken for granted consider a coworker who continually adds tasks to your workload knowing you'll always say yes while this might seem harmless initially over time it fosters a dynamic where your contributions are undervalued and your kindness is treated as an expectation rather than a choice these violations of boundaries not only deplete your emotional reserves but also lead to frustration and resentment as your efforts go unnoticed and unreciprocated Seneca the stoic philosopher aptly reminds us no person hands out their money to passersby but to how many do we hand out our lives we are tight-fisted with property and money yet think too little of wasting time the one thing about which we should all be the toughest misers his words underscore the importance of valuing our own resources time and energy enough to safeguard them with boundaries in the philosophy of modern stoicism boundaries are seen as essential tools for living with intention and balance they are not walls to isolate yourself but frameworks that allow your kindness to thrive 
without undermining your well-being setting clear limits ensures that your acts of generosity come from a place of genuine care rather than obligation creating healthier and more respectful relationships when you communicate your boundaries assertively whether by declining extra work or addressing a pattern of overreach in a relationship you teach others to respect you while preserving your emotional and physical health this act of self-respect also aligns with the stoic principle of living in accordance with nature as maintaining balance in our interactions is critical to a harmonious life in today's world where demands on our time and energy are seemingly endless boundary setting becomes even more vital without boundaries you risk burnout and discontent as others may unknowingly exploit your generosity to prevent this practice the art of saying no when needed and follow through with consistent action when your limits are crossed for instance if a colleague continues to assign you tasks without your consent it is perfectly reasonable to redirect the work back to them or involve a supervisor to establish clarity by doing so you protect your time and energy ensuring your kindness is appreciated rather than exploited through these measures you uphold the stoic ideal of moderation transforming your kindness from a source of stress into a meaningful expression of goodwill kindness without boundaries is unsustainable both for you and those you aim to help by defining and enforcing limits you preserve the value of your generosity ensuring it uplifts rather than depletes you modern stoicism teaches us that living with purpose and mindfulness is key to maintaining harmony in our lives protecting your boundaries allows you to act with intention offering kindness where it matters most and fostering relationships built on mutual respect in doing so you embody the wisdom of the stoics cultivating a life that balances compassion with self-respect a balance that not only 
benefits you but also enriches the lives of those around you the ways kindness can ruin your life reveal an important truth even positive traits can become burdensome if not carefully managed and applied correctly kindness itself isn't wrong but when it's abused it can lead to losing your sense of self setting unrealistic expectations and breaking personal boundaries to avoid falling into this negative cycle understanding the risks and knowing how to manage them is key five stoic strategies to stop being taken advantage of while kindness can have unintended consequences this doesn't mean you need to abandon it instead you can apply stoic strategies to maintain kindness in a balanced and empowering way helping you protect yourself from being taken advantage of while staying true to your values keep watching to discover how stoicism can help you stay kind without being exploited and transform your kindness into an unshakable source of power number one understand your emotions understanding your emotions is the cornerstone of stoic philosophy and a vital tool for preventing the emotional burnout that often stems from unchecked kindness when you give endlessly without considering your emotional capacity feelings of frustration exhaustion or even resentment can creep in and quietly take hold these emotions may seem insignificant at first but they tend to accumulate emerging later as irritability stress or a sense of being unappreciated imagine helping a friend over and over again offering rides lending a hand with projects or being their emotional support system only to feel overlooked when they don't acknowledge your efforts that unspoken disappointment can morph into resentment souring not only your relationship with them but also your own emotional well-being stoicism with its timeless wisdom encourages us to observe our emotions not as enemies to be suppressed but as signals guiding us toward balance the stoic secrets teach us to approach our feelings with 
curiosity identifying their roots and understanding their triggers for instance when you notice yourself feeling drained after yet another act of kindness pause and ask what's behind this feeling am I giving too much too often without ensuring my own needs are met this self-awareness allows you to recognize when your boundaries are being stretched too thin giving you the power to recalibrate your actions before they lead to burnout it's not about withdrawing your kindness it's about offering it in ways that are genuine sustainable and aligned with your own emotional health one powerful strategy for cultivating this balance is regular self-reflection carve out time each day even just a few minutes to ask yourself questions like how do I feel after helping others or am I sacrificing my well-being for the sake of being seen as kind consider a modern example a co-worker who frequently asks you to cover their shifts or pick up their slack you comply fearing conflict or wanting to maintain a helpful image but over time you start to dread their requests and feel resentment building if you reflect on your emotions and acknowledge this pattern you can prepare yourself to set boundaries perhaps by saying I'd love to help but I can't this time this small act of assertion not only protects your energy but also reshapes the dynamic into one of mutual respect here's an important truth to internalize understanding your emotions and setting limits isn't selfish it's essential if you're running on empty how can you continue to give authentically the stoic approach to kindness isn't about closing off your heart it's about ensuring that your kindness flows from a place of strength and intention when you acknowledge and respect your emotions you create a healthier foundation for your relationships and preserve your capacity to give in meaningful ways let me ask you this how often do you find yourself feeling stretched too thin yet reluctant to say anything for fear of 
being seen as unkind could tuning into your emotions and adjusting your actions make a difference remember stoicism reminds us that a life lived with self-awareness is a life lived with purpose by embracing this principle you can transform your kindness into something that uplifts both you and those around you the next time you feel your generosity tipping into exhaustion take a step back and reflect is this kindness depleting you or is it rooted in genuine care when you find that balance you'll discover that your acts of kindness become not only more impactful but also deeply fulfilling for both you and the people in your life number two learn to say no let's talk about one of the hardest but most liberating words you can ever learn to say no it seems simple doesn't it but for so many of us saying no feels like a betrayal of kindness we worry about letting people down being seen as selfish or even facing rejection so we keep saying yes yes to the extra project at work even when your plate is already full yes to helping a friend move even when your own weekend is packed with plans at first it feels good to help like you're being dependable and kind but then the weight of overcommitment sets in you feel overwhelmed stressed maybe even resentful and here's the kicker the more you say yes when you really mean no the more you teach others to take your time and energy for granted have you ever stopped to ask yourself how much of my life am I giving away and at what cost to myself this is where modern stoicism offers a lifeline the stoics masters of wisdom and discipline understood the value of setting boundaries they knew that saying no is not an act of cruelty it's an act of self-respect Marcus Aurelius wrote it is not death that a man should fear but he should fear never beginning to live by overcommitting and prioritizing everyone else's needs above your own you risk losing the time and energy needed to live a life aligned with your values modern stoicism reminds 
us that we can only be truly kind when our actions come from a place of strength not obligation learning to say no is one of the most powerful ways to protect your well-being when you decline a request that doesn’t align with your priorities you’re not just setting a boundary you’re reclaiming control over your life think of it this way every time you say yes to something you’re also saying no to something else often something that matters more to you how often are you sacrificing your own goals peace of mind or even your health just to avoid the discomfort of a no Epictetus reminds us he who is not master of himself is a slave are you a master of your time or are you letting fear of disapproval dictate your choices start small practice saying no to minor requests that don’t serve your priorities use polite but firm language like I’d love to help but I’m fully committed right now notice how empowering it feels to draw a line and stand by it over time this builds confidence and reinforces your boundaries it’s not about being selfish it’s about ensuring your kindness remains genuine and balanced when you protect your own time and energy the help you offer others comes from a place of abundance not exhaustion remember people who truly value you will respect your boundaries and those who don’t they likely never valued you only what you could do for them saying no isn’t rejection it’s an affirmation of your priorities and your Worth Modern stoicism teaches us that discipline and self-respect are key to living a fulfilling life so ask yourself what kind of Life am I building if I never say no by learning to decline what doesn’t serve you you open the door to a life that truly does after all how can you live authentically if you’re always living for others number three dedicate time to yourself dedicating time to yourself is not an act of selfishness it is a fundamental pillar of emotional and mental well-being in a society that often confuses worth with 
constant availability self-care can easily feel like an Indulgence instead of a necessity however the truth is that endlessly prioritizing others whether by helping friends with their projects or giving up weekends for everyone else’s needs inevitably leads to neglecting your own this imbalance may not be noticeable at first but over time it breeds burnout dissatisfaction and even resentment Seneca’s Timeless wisdom reminds us it is not that we have a short time to live but that we waste a lot of it his words are a call to action urging us to use our time deliberately and to recognize that self-care is not a luxury but an essential practice for leading a balanced and meaningful life stoicism teaches that true strength and productivity arise from self-mastery and this Mastery is only possible when we make time for reflection renewal and personal growth taking time for yourself might involve physical activities like exercising mental enrichment through reading or learning or simply finding moments of peace to let your mind rest these practices are not escapes from your responsibilities they are the very Foundation that equips you to face life’s demands with Clarity and vigor the stoic secret to thriving lies in treating self-care as a non-negotiable commitment without it you risk depleting the energy you need to effectively support others and to pursue your own aspirations to make self-care a sustainable habit Begin by intentionally scheduling regular time for yourself and treating it as sacred whether it’s an hour in the morning to meditate an evening to journal or a weekend to immerse yourself in a hobby this time should be viewed as non-negotiable just as you would not cancel a crucial meeting or family obligation respect these commitments to yourself set clear boundaries with those around you and don’t hesitate to say no when interruptions arise dedicating time to yourself is not about withdrawing from others but rather about fortifying your ability to connect 
with and care for them more effectively when you nurture your own well-being you create a foundation of strength and Clarity that benefits not only you but also those who rely on you this is an act of profound self-respect and a demonstration of the stoic principle of balance by prioritizing self-care you ensure that your kindness and efforts for others stem from a place of abundance rather than exhaustion taking care of yourself is the Cornerstone of a life lived with Clarity purpose and resilience by carving out time for your well-being you align with the stoic ideal of purposeful living ensuring that your energy is used wisely and your actions reflect your values this practice not only empowers you to thrive but also enriches your ability to contribute meaningfully to the lives of others your well-being is not just a gift to yourself it is the foundation from which all other aspects of your life can flourish if you’ve ever struggled with the guilt of taking time for yourself or felt drained from constantly prioritizing others share your resolve by commenting I honor my well-being below your commitment to self-care can Inspire others to recognize the importance of nurturing their own strength and living a life of balance and purpose let’s support each other in building lives rooted in clarity resilience and the stoic principle of balance number four set clear goals setting clear goals is essential for creating Direction in life giving you a sense of purpose and helping you focus on what truly matters without defined objectives it’s easy to be pulled in countless directions by the demands and expectations of others leaving you feeling aimless and unfulfilled imagine committing to a professional growth plan such as completing a certification course only to find your time consumed by favors and distractions that have no connection to your aspirations this scenario underscores the importance of clarity in your goals as they act like a compass guiding your actions 
toward meaningful progress Seneca a stoic philosopher wisely stated if one does not know to which Port one is sailing no wind is favorable this Timeless wisdom highlights how essential it is to know your destination both in life and in your daily actions to avoid being Swept Away by distractions modern stoicism teaches us the value of purposeful living emphasizing the importance of aligning your actions with your values and long-term aspirations setting goals is not just about productivity it is about ensuring that your time and energy are directed toward Pursuits that enrich your personal growth and contribute to your sense of fulfillment by having a Clear Vision of what you want to achieve you Empower yourself to make intentional decisions that serve your best interests rather than succumbing to external pressures or fleeting demands this Focus not only strengthens your resolve but also preserves your mental and emotional well-being as you no longer feel burdened by obligations that pull you away from your true priorities to make this principle actionable start by identifying your priorities and breaking them into smaller manageable steps for example if your goal is to enhance your career outline specific tasks such as researching programs enrolling in courses and dedicating time each week to study regularly revisit your goals to measure your progress and make adjustments as needed ensuring they remain relevant and achievable additionally clearly communicate your objectives to those around you helping them understand why your time and energy are dedicated to specific Pursuits this transparency not only reinforces your commitment to your goals but also teaches others to respect your boundaries and the importance of your aspirations in Modern Life where distractions are constant and time feels increasingly scarce having clear goals is more critical than ever without them it becomes easy to say yes to every request leaving little room for personal growth or 
meaningful achievement by consciously prioritizing your objectives and asserting your boundaries you not only Safeguard your progress but also Inspire others to do the same modern stoicism encourages this disciplined approach showing us that living with purpose and Clarity leads to a more harmonious and fulfilling life when you align your actions with your goals you cultivate a mindset of intentionality turning every decision into a step toward a life of meaning and balance number five distance yourself from energy drainers distancing yourself from energy drainers those who constantly take from you without giving anything in return is one of the most important steps you can take to safeguard your emotional and mental health we’ve all encountered these people maybe it’s the coworker who always has a crisis but never listens when you share your struggles or a friend who endlessly leans on you for support yet is never available when you need a shoulder these individuals don’t intentionally set out to harm you but their relentless negativity and one-sided needs leave you feeling exhausted undervalued and emotionally drained over time this imbalance can chip away at your well-being making you question why you’re always left Running on Empty while they seem to take and take without giving back stoicism offers Timeless wisdom for navigating these relationships one of its stoic Secrets is the principle of discernment carefully choosing who you invest your time and energy in the stoics teach that our peace of mind is precious and should be fiercely protected to do this you must first recognize the signs of an energy drainer do you feel consistently depleted after interacting with someone do their demands feel excessive or their negativity overwhelming if the answer is yes it might be time to re-evaluate that relationship take a modern example a colleague who always vents about their workload and demands your help with their tasks but never reciprocates when you’re 
under pressure by continuing to indulge them you not only reinforce the behavior but also neglect your own priorities and emotional balance creating distance doesn’t mean shutting people out cruelly it means setting healthy boundaries start by limiting how often and How Deeply you engage with energy drainers if they constantly ask for favors politely decline when their requests feel excessive redirect the conversation or gently remind them that you have your own responsibilities to focus on at the same time make a conscious effort to surround yourself with positive supportive individuals those who Inspire and encourage you a friend who listens as much as they talk or a colleague who collaborates and shares the workload can have an incredible impact on your sense of balance and fulfillment these are the relationships that build emotional resilience and promote personal growth let me ask you how often do you find yourself saying yes to someone even when it feels like too much how does that leave you feeling afterward recognizing and addressing these Dynamics can transform your relationships and protect your mental health remember stoicism isn’t about avoiding all challenges or cutting off everyone who frustrates you it’s about focusing your energy on what truly matters and what aligns with your values when you let go of negativity you create space for relationships that are uplifting and enriching where kindness and support flow both ways so the next time you find yourself faced with an energy drainer pause and reflect is this connection nurturing you or draining you is it time to step back and reclaim Your Peace by putting the stoic principle of discernment into practice you’ll discover that your kindness and energy are most valuable when shared with those who truly appreciate and respect them protect your peace of mind and you’ll cultivate a life filled with balance growth and meaningful connections as we wrap up let’s reflect on the key takeaway kindness is a 
powerful and admirable trait but without boundaries it can lead to burnout frustration and a loss of self-respect by applying the Timeless strategies of modern stoicism like self-awareness clear goal setting and nurturing healthy relationships you can Safeguard your priorities while still fostering meaningful connections the wisdom of the stoics teaches us that true kindness begins with respect for yourself and when you live in alignment with your values you can give to others in a way that’s both genuine and sustainable now it’s time to take action how will you apply these stoic secrets to your own life are there areas where you need to set better boundaries or say no to protect your well-being I’d love to hear your thoughts share your experiences in the comments below and if you found value in today’s video don’t forget to like subscribe to stoic secrets and hit the Bell icon for more insights into stoicism and personal growth together let’s continue this journey toward a balanced and fulfilling life Guided by the wisdom of the stoics

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Financial Accounting Fundamentals

    Financial Accounting Fundamentals

    This collection of text comprises a series of video lectures on financial accounting. The lectures explain fundamental accounting concepts, including the preparation of financial statements (income statement, statement of changes in equity, balance sheet), and the calculation and interpretation of financial ratios. The lectures also cover adjusting journal entries, bad debt expense, bank reconciliations, and depreciation methods. Specific accounting methods like FIFO, LIFO, and weighted average are demonstrated, and the importance of internal controls is emphasized. Finally, the lectures discuss the statement of cash flows and its preparation using both direct and indirect methods.

    Financial Accounting Study Guide

    Quiz

    Instructions: Answer each question in 2-3 sentences.

    1. What is the key difference between an asset and a liability?
    2. Explain the concept of “accounts payable” in the context of a company’s liabilities.
    3. How do revenues differ from expenses for a business?
    4. Define the term “dividends” and explain their relationship to a company’s profits.
    5. What distinguishes a current asset from a long-term asset?
    6. What does it mean for a company to “debit” an account?
    7. What is the significance of the accounting equation (A = L + SE)?
    8. What is the purpose of a journal entry in accounting?
    9. Explain the concept of “accumulated depreciation” and its function.
    10. Briefly describe the difference between the FIFO, LIFO, and weighted-average methods of inventory valuation.

    Quiz Answer Key

    1. An asset is something of value that a company owns or controls, whereas a liability is an obligation a company owes to someone else, requiring repayment in the future. Assets are what the company possesses, and liabilities are what the company owes.
    2. Accounts payable represents a company’s short-term debts, usually due within 30 days, often arising from unpaid bills like phone bills or supplier invoices. It is a common liability found on a balance sheet.
    3. Revenues are the money a company earns from its core business activities, such as sales or service fees, and expenses are the costs incurred in running the business, like salaries or utilities. Revenues are inflows, and expenses are outflows.
    4. Dividends are a portion of a company’s profits that shareholders receive, representing a distribution of earnings. They are a payout to owners of the company if they choose to take money out of the business.
    5. A current asset is expected to be used or converted into cash within one year, such as cash or inventory, while a long-term asset is intended for use over multiple years, such as land or equipment. The one-year mark is the distinguishing line.
    6. A debit is an accounting term that increases asset, expense, or dividend accounts while decreasing liability, shareholders’ equity, or revenue accounts. The usage of debits and credits is core to the accounting system.
    7. The accounting equation, A = L + SE, represents that a company’s total assets are equal to the sum of its liabilities and shareholders’ equity. It’s a foundational concept ensuring the balance of a company’s financial position.
    8. A journal entry is the first step in the accounting cycle and records business transactions by detailing debits and credits for at least two accounts. They create a trackable record for every transaction.
    9. Accumulated depreciation represents the total amount of an asset’s cost that has been expensed as depreciation over its life to date. It is a contra-asset account that reduces the book value of the related asset.
    10. FIFO (first-in, first-out) assumes that the oldest inventory is sold first. LIFO (last-in, first-out) assumes that the newest inventory is sold first. Weighted average uses the average cost of all inventory to determine the cost of goods sold.
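
    The three inventory methods in answer 10 can be illustrated numerically. This is a minimal sketch with hypothetical purchase data (the figures are not from the lectures):

```python
# Hypothetical inventory layers: (units, unit_cost), oldest purchase first.
purchases = [(10, 5.00), (10, 6.00), (10, 7.00)]
units_sold = 15

def cogs_fifo(layers, sold):
    # FIFO: the oldest layers are consumed first.
    cost, remaining = 0.0, sold
    for units, unit_cost in layers:
        take = min(units, remaining)
        cost += take * unit_cost
        remaining -= take
    return cost

def cogs_lifo(layers, sold):
    # LIFO: the newest layers are consumed first.
    return cogs_fifo(list(reversed(layers)), sold)

def cogs_weighted_average(layers, sold):
    # Weighted average: one blended cost per unit across all layers.
    total_units = sum(units for units, _ in layers)
    total_cost = sum(units * cost for units, cost in layers)
    return sold * total_cost / total_units
```

    With these numbers, selling 15 units gives a cost of goods sold of 80 under FIFO (10 × 5 + 5 × 6), 100 under LIFO (10 × 7 + 5 × 6), and 90 under weighted average (15 × 6), which is exactly the effect on net income that essay question 5 asks about.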

    Essay Questions

    Instructions: Write an essay that thoroughly explores each of the following prompts, drawing on your understanding of the course material.

    1. Discuss the importance of understanding the differences between assets, liabilities, and shareholders’ equity for making sound business decisions. Consider how these elements interact and contribute to a company’s overall financial health.
    2. Explain the different types of journal entries covered in the source material and how the concept of debits and credits is essential for accurately recording financial transactions. Why is it so important that a journal entry balances?
    3. Compare and contrast the straight-line, units of production, and double-declining balance methods of depreciation. Under what circumstances might a business choose one method over another, and why?
    4. Describe the components of a cash flow statement and their importance to understanding a company’s overall financial performance. Discuss how the operating, investing, and financing sections are used to evaluate a company’s financial decisions.
    5. Explain the different inventory valuation methods (FIFO, LIFO, Weighted Average) and how they can affect a company’s cost of goods sold and net income. What are the implications of using one method over another?
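
    For essay question 3, the first-year expense under each depreciation method can be sketched with hypothetical asset figures (a $50,000 asset, $5,000 residual value, 5-year life; none of these numbers come from the lectures):

```python
cost, residual, life_years = 50_000.0, 5_000.0, 5

def straight_line(cost, residual, life):
    # Equal expense each year over the useful life.
    return (cost - residual) / life

def units_of_production(cost, residual, total_units, units_used):
    # Expense follows actual usage rather than time.
    return (cost - residual) / total_units * units_used

def double_declining(cost, life, year):
    # Twice the straight-line rate applied to the declining book value;
    # residual value is ignored in the rate but acts as a floor in practice.
    rate = 2 / life
    book = cost
    for _ in range(year - 1):
        book -= book * rate
    return book * rate
```

    Year one yields 9,000 straight-line, 20,000 double-declining, and (assuming a 100,000-unit life with 12,000 units used) 5,400 units-of-production, showing why a business might pick the accelerated method for tax timing or the usage method for machinery.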

    Glossary of Key Terms

    Accounts Payable: A short-term liability representing money owed to suppliers for goods or services purchased on credit.

    Accounts Receivable: A current asset representing money owed to a company by its customers for goods or services sold on credit.

    Accrued Expense: An expense that has been incurred but not yet paid in cash.

    Accrued Revenue: Revenue that has been earned but for which payment has not yet been received.

    Accumulated Depreciation: The total depreciation expense recorded for an asset to date; a contra-asset account that reduces the book value of an asset.

    Asset: Something of value that a company owns or controls, expected to provide future economic benefit.

    Balance Sheet: A financial statement that presents a company’s assets, liabilities, and equity at a specific point in time.

    Bond: A long-term debt instrument where a company borrows money from investors and promises to pay it back with interest over a specified period.

    Cash Flow Statement: A financial statement that summarizes the movement of cash into and out of a company over a specific period.

    Common Shares: A type of equity ownership in a company, giving shareholders voting rights and a claim on the company’s residual value.

    Contra-Asset Account: An account that reduces the value of a related asset (e.g., accumulated depreciation).

    Cost of Goods Sold (COGS): The direct costs of producing goods that a company sells.

    Credit: An accounting term that decreases asset, expense, or dividend accounts, while increasing liability, shareholders’ equity, or revenue accounts.

    Current Asset: An asset expected to be converted into cash or used within one year.

    Current Liability: A liability due within one year.

    Debit: An accounting term that increases asset, expense, or dividend accounts, while decreasing liability, shareholders’ equity, or revenue accounts.

    Depreciation: The allocation of the cost of a tangible asset over its useful life.

    Depreciable Cost: The cost of an asset minus its residual value, which is the amount to be depreciated over the asset’s useful life.

    Discount (on a bond): Occurs when a bond is sold for less than its face value. This happens when the market interest rate exceeds the bond’s stated interest rate.

    Dividend: A distribution of a company’s profits to its shareholders.

    Double-Declining Balance Depreciation: An accelerated depreciation method that applies a multiple of the straight-line rate to an asset’s declining book value.

    Equity (Shareholders’ Equity): The owners’ stake in the assets of a company after deducting liabilities.

    Expense: A cost incurred in the normal course of business to generate revenue.

    FIFO (First-In, First-Out): An inventory valuation method that assumes the first units purchased are the first units sold.

    Financial Statements: Reports that summarize a company’s financial performance and position, such as the income statement, balance sheet, and cash flow statement.

    General Ledger: A book or electronic file that contains all of the company’s accounts.

    Gross Profit (Gross Margin): Revenue minus the cost of goods sold.

    Income Statement: A financial statement that reports a company’s revenues, expenses, and profits or losses over a specific period.

    Inventory: Goods held by a company for the purpose of resale.

    Journal Entry: The recording of business transactions showing the debits and credits to accounts.

    Liability: A company’s obligation to transfer assets or provide services to others in the future.

    LIFO (Last-In, First-Out): An inventory valuation method that assumes the last units purchased are the first units sold.

    Long-Term Asset: An asset that a company expects to use for more than one year.

    Long-Term Liability: A liability due in more than one year.

    Net Income: Revenue minus expenses; the “bottom line” of the income statement.

    Premium (on a bond): Occurs when a bond is sold for more than its face value. This happens when the market interest rate is less than the bond’s stated interest rate.

    Preferred Shares: A type of equity ownership in a company, where shareholders have a preference over common shareholders in dividends and liquidation.

    Retained Earnings: The cumulative profits of a company that have been retained and not paid out as dividends.

    Revenue: Money a company earns from its core business activities.

    Residual Value (Salvage Value): The estimated value of an asset at the end of its useful life.

    Straight-Line Depreciation: A depreciation method that allocates an equal amount of an asset’s cost to depreciation expense each year of its useful life.

    T-Account: A visual representation of an account with a debit side on the left and a credit side on the right.

    Units of Production Depreciation: A depreciation method that allocates an asset’s cost based on its actual usage rather than time.

    Vertical Analysis: A type of financial statement analysis in which each item in a financial statement is expressed as a percentage of a base amount. On an income statement, it is usually expressed as a percentage of sales. On a balance sheet, it’s usually expressed as a percentage of total assets.

    Weighted-Average Method: An inventory valuation method that uses the weighted-average cost of all inventory to determine the cost of goods sold.

    Financial Accounting Concepts and Analysis


    Briefing Document: Financial Accounting Concepts and Analysis

    I. Introduction

    This document provides a review of core financial accounting concepts, focusing on assets, liabilities, equity, revenues, expenses, dividends, journal entries, and financial statement analysis. The source material consists of transcribed video lectures from an accounting course, delivered by a professor (likely “Tony”) with a conversational and relatable style.

    II. Core Accounting Terms and Concepts

    A. Assets:
    • Defined as “something of value that a company can own or control.”
    • Value must be “reasonably reliably measured.”
    • Examples:
      ◦ Accounts Receivable: “our customer hasn’t paid the bill right we did some work for the customer they haven’t paid us yet we would expect to collect in less than a year”
      ◦ Inventory: “Walmart expects to sell through any piece of inventory in less than a year”
      ◦ Long-term investments, land, buildings, and equipment are also assets.
    • Distinction between current and long-term assets:
      ◦ Current assets are expected to be liquidated or used up within one year.
      ◦ Long-term assets are those expected to be used beyond one year.

    B. Liabilities:
    • Defined as “anything that has to be repaid in the future.”
    • Technical definition: “any future economic obligation.”
    • Examples:
      ◦ Accounts Payable: “within typically within 30 days you’ve got to pay it back”
      ◦ Notes Payable: “bank loans, student loans, mortgages,” all categorized under “note payable,” which is a contract promising repayment.
    • Distinction between current and long-term liabilities:
      ◦ Current liabilities are obligations to be repaid within one year.
      ◦ Long-term liabilities are obligations to be repaid over a period longer than a year, such as a mortgage.

    C. Shareholders’ Equity:
    • Represents the owners’ stake in the company.
    • “If I were to sell them off pay off all my debts what goes into my pocket that is my equity in the company”
    • Includes common shares and retained earnings.

    D. Revenues:
    • Defined as what a company “earns” when it “does what it does to earn money.”
    • Examples: sales revenue, tuition revenue, rent revenue.
    • “How is the money coming in? It’s the revenue-generating part of the business.”

    E. Expenses:
    • Defined as “costs” associated with running a business.
    • Examples: salary expense, utilities expense, maintenance expense.

    F. Dividends:
    • Represent “shareholders pulling profits from the company,” essentially taking cash out of the company’s retained earnings.
    • Payable when “revenues exceed the expenses,” i.e., when the company is profitable.
    • Shareholders “can keep the money keep those profits in the company or the shareholders can say I’d like some of that money.”

    III. Journal Entries

    A. The Concept:
    • Based on Newton’s third law of motion: “for every action there is an equal and opposite reaction.”
    • “There’s not just one thing happening there’s always kind of equal and opposite forces acting in a journal entry.”
    • Every transaction has at least one debit and at least one credit, and the value of the debits must equal the value of the credits.

    B. Debits and Credits:
    • Debits (Dr) and credits (Cr) are not related to credit cards or bank accounts; they are used to increase or decrease different types of accounts.
    • The basic accounting equation: Assets = Liabilities + Shareholders’ Equity (A = L + SE).
    • Accounts on the left side (assets) go up with a debit and down with a credit. Accounts on the right side (liabilities and equity) go up with a credit and down with a debit.

    C. Journal Entry Table:
    • The presenter suggests writing the mnemonic “A = L + SE,” drawing an up arrow and a down arrow beneath A, and a down arrow and an up arrow beneath both L and SE, then writing “Dr Cr” beneath each pair of arrows — so the arrow above “Dr” shows which direction a debit moves that account type (up for assets, down for liabilities and equity).
    • This is used as a visual aid to determine the correct debits and credits for a transaction.
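
    The same debit/credit directions can be captured as a small lookup table. This is an illustrative sketch, not code from the course; the expense, dividend, and revenue rows follow the debit definition given earlier in the study guide:

```python
# Which side increases each account type under A = L + SE.
# Expenses and dividends behave like assets (debit to increase);
# revenue behaves like equity (credit to increase).
INCREASE_SIDE = {
    "asset": "debit",
    "expense": "debit",
    "dividend": "debit",
    "liability": "credit",
    "equity": "credit",
    "revenue": "credit",
}

def effect(account_type, entry_side):
    """Return +1 if the entry increases the account, -1 if it decreases it."""
    return 1 if INCREASE_SIDE[account_type] == entry_side else -1
```

    For example, a debit to an asset account increases it (+1), while a debit to a liability account decreases it (-1), matching the arrows in the mnemonic table.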

    D. Journal Entry Elements:
    Each journal entry must include:
    • A date.
    • A debit account.
    • A credit account.
    • The value of the debit and credit, which must be equal.
    • A description of the transaction, avoiding the use of dollar signs.

    E. Examples:
    • Purchase a car for cash:
      ◦ Debit: Car (asset)
      ◦ Credit: Cash (asset)
    • Purchase a car with a car loan:
      ◦ Debit: Car (asset)
      ◦ Credit: Car loan payable (liability)
    • Purchase a car with part cash and a car loan:
      ◦ Debit: Car (asset)
      ◦ Credit: Cash (asset)
      ◦ Credit: Car loan payable (liability)
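
    The third car example can be sketched as a journal entry with the debits-equal-credits rule enforced. The amounts are hypothetical, chosen only to illustrate the balance check:

```python
def is_balanced(entry):
    # A journal entry balances when total debits equal total credits.
    debits = sum(amount for side, _, amount in entry if side == "dr")
    credits = sum(amount for side, _, amount in entry if side == "cr")
    return debits == credits

# Hypothetical: purchase a $30,000 car with $10,000 cash and a $20,000 loan.
car_purchase = [
    ("dr", "Car (asset)", 30_000),
    ("cr", "Cash (asset)", 10_000),
    ("cr", "Car loan payable (liability)", 20_000),
]
```

    Dropping any one line from `car_purchase` would make `is_balanced` return False, which is exactly the error the equal-and-opposite rule is designed to catch.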

    IV. Adjusting Journal Entries

    A. Types of Adjustments:
    • Prepaids: expenses paid in advance, like insurance. The prepaid asset is reduced as the expense is recognized (e.g., prepaid insurance becomes insurance expense over time).
    • Depreciation: a long-term asset’s value is reduced over time (e.g., vehicles, equipment).
    • Accrued expenses: expenses that build up but are not yet paid, creating a liability (e.g., accrued interest on a loan).
    • Accrued revenues: revenues that are earned but not yet received, creating a receivable (e.g., service revenue earned on account).

    B. The Purpose:
    • To ensure financial statements accurately reflect the company’s financial position at the end of the period.
    • Adjustments are necessary because “the lender isn’t calling me saying hey it’s December 31st where’s my money no they know they’re not getting paid till July so the accountant just has to know oh I’ve got a liability here that’s building up.”
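
    The prepaid case can be sketched numerically. This assumes a hypothetical $2,400 policy paid on January 1 covering 12 months; the figures are illustrative, not from the lectures:

```python
# Hypothetical: $2,400 paid in advance for 12 months of insurance coverage.
premium_paid = 2_400.0
months_covered = 12
monthly_expense = premium_paid / months_covered  # 200 per month

# After 3 monthly adjusting entries, 3 months of cost have moved
# out of the prepaid asset and into insurance expense.
months_elapsed = 3
insurance_expense = monthly_expense * months_elapsed
prepaid_insurance = premium_paid - insurance_expense
```

    Each month's adjusting entry would debit insurance expense and credit prepaid insurance for 200, so the asset shrinks exactly as the expense is recognized.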

    V. Financial Statement Analysis

    A. Trial Balance
    • An unadjusted trial balance is a list of all accounts and their balances before any adjusting entries are made.
    • An adjusted trial balance is created after all adjusting entries are made.

    B. Income Statement
    • Shows the company’s revenues and expenses for a period.
    • Calculates net income: Revenues – Expenses.

    C. Balance Sheet
    • Shows the company’s assets, liabilities, and equity at a specific point in time.
    • The basic accounting equation (Assets = Liabilities + Equity) must always balance.

    D. Statement of Cash Flows
    • Categorizes cash flows into operating, investing, and financing activities.
    • Provides a summary of how cash changed during a given period.
    • Uses both changes in balance sheet accounts and information from the income statement to create the full picture.

    E. Ratio Analysis:
    • Liquidity ratios assess a company’s ability to meet its short-term obligations; includes the current ratio.
    • Profitability ratios assess a company’s ability to generate profit; includes gross profit margin and net profit margin.
    • Solvency ratios assess a company’s ability to meet its long-term obligations, such as the debt-to-equity ratio.
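
    The three ratio families can be sketched with hypothetical balances (figures invented for illustration):

```python
# Hypothetical figures for illustration only.
current_assets, current_liabilities = 120_000, 80_000
sales, cost_of_goods_sold, net_income = 500_000, 300_000, 50_000
total_liabilities, total_equity = 200_000, 250_000

current_ratio = current_assets / current_liabilities              # liquidity
gross_profit_margin = (sales - cost_of_goods_sold) / sales        # profitability
net_profit_margin = net_income / sales                            # profitability
debt_to_equity = total_liabilities / total_equity                 # solvency
```

    Here the current ratio of 1.5 means there are $1.50 of current assets for every $1 of current liabilities, the margins express profit per dollar of sales, and a debt-to-equity of 0.8 means the company carries 80 cents of debt for each dollar of owners' equity.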

    F. Vertical Analysis (Common-Size Statements):
    • Expresses each item on a financial statement as a percentage of a base figure.
    • On the income statement, each item is expressed as a percentage of total sales.
    • On the balance sheet, each item is expressed as a percentage of total assets.
    • Allows comparison of companies of different sizes, or of trends across years.
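
    Vertical analysis on an income statement can be sketched in a few lines. The line items and amounts below are hypothetical:

```python
# Hypothetical income statement line items.
income_statement = {
    "sales": 200_000,
    "cost of goods sold": 120_000,
    "operating expenses": 50_000,
    "net income": 30_000,
}

def common_size(statement, base="sales"):
    # Express every line item as a percentage of the base amount.
    base_amount = statement[base]
    return {item: round(amount / base_amount * 100, 1)
            for item, amount in statement.items()}
```

    With these numbers, cost of goods sold is 60% of sales and net income is 15%, percentages that can be compared directly against a much larger or smaller company's statement.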

    VI. Other Concepts Covered

    • Inventory Costing Methods: FIFO (First-In, First-Out), LIFO (Last-In, First-Out), and Weighted Average methods.
    • Bank Reconciliation: Adjusting bank statements and company records to reconcile the different balances in order to identify errors and discrepancies.
    • Allowance for Bad Debts: A contra-asset account used to estimate uncollectible receivables.
    • Bonds: Accounting for bonds issued at a premium or discount, and the amortization of those premiums or discounts over the life of the bond.
    • Shareholders’ Equity: Different types of shares, such as common shares and preferred shares.
    • Closing Entries: Resetting revenue, expense, and dividend accounts to zero at the end of an accounting period.
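
    The bank-reconciliation idea from the list above can be sketched numerically: adjust the bank balance for items the bank hasn't yet seen, adjust the book balance for items the company hasn't yet recorded, and the two adjusted balances should agree. All figures below are hypothetical:

```python
# Hypothetical ending balances and reconciling items.
bank_balance = 5_000.0
book_balance = 4_620.0

deposits_in_transit = 800.0    # recorded by the company, not yet by the bank
outstanding_checks = 1_200.0   # written by the company, not yet cleared
bank_service_fee = 20.0        # charged by the bank, not yet on the books

adjusted_bank = bank_balance + deposits_in_transit - outstanding_checks
adjusted_book = book_balance - bank_service_fee
```

    Both sides reconcile to the same adjusted cash balance; any remaining difference would signal an error or discrepancy to investigate.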

    VII. Key Themes

    • The Importance of Understanding Journal Entries: “you really need to understand them… if you haven’t understood it well it’s just going to haunt you for the rest of class”
    • Financial Accounting is About Tracking Financial Events: “accounting is all about tracking Financial events.”
    • Accounting is Logical and Systematic: The goal is to keep track of transactions “in a logical way that’s not going to drive you crazy.”
    • Practical Application: Emphasis is placed on real-world examples and applications.
    • Mistakes are Opportunities to Learn: “it’s not even the end of the world if you fail a course but really it’s not the end of the world if you fail a test you can put it together you can put yourself together and you can improve.”

    VIII. Conclusion This source material provides a detailed explanation of accounting and financial analysis concepts. The speaker employs practical examples and a relatable, conversational teaching style that aims to both inform and engage students, encouraging deep understanding and retention of these core principles.

    Financial Accounting Fundamentals

    Financial Accounting FAQ

    • What is the difference between an asset and a liability in accounting? An asset is something of value that a company owns or controls. This could include tangible items like inventory, buildings, and equipment, or intangible items like patents and trademarks. The key thing to remember is that an asset has economic benefit to the company, and the value can be reasonably and reliably measured. A liability, on the other hand, is an obligation of the company, something it owes to others, that has to be repaid in the future. Examples include bank loans, mortgages, accounts payable (money owed to suppliers), and even unpaid phone bills. Essentially, assets are what a company has and liabilities are what it owes.
    • What is the difference between “current” and “long-term” when classifying assets and liabilities? The distinction between “current” and “long-term” depends on the timeframe over which the asset will be converted to cash or the liability will be paid off. A current asset is expected to be liquidated (turned into cash) or used up within one year or less. Examples of current assets include cash, inventory (for companies that expect to sell it quickly), and accounts receivable (money due from customers for short-term credit). A long-term asset, in contrast, is not expected to be liquidated within a year; it includes things like land, buildings, and machinery that are intended for long-term use by the company. A current liability is an obligation that’s expected to be paid within a year, such as short-term debt, accounts payable, or wages payable. A long-term liability is an obligation that’s not due within a year; it includes things like long-term bank loans or mortgages. The one-year cutoff is a key dividing line in financial accounting.
    • What are the key components of shareholders’ equity, and how do they relate to the balance sheet? Shareholders’ equity represents the owners’ stake in a company. It is composed primarily of two components. Common shares reflect the original investments made by shareholders in exchange for ownership in the company. Retained earnings represent the accumulated profits that a company has not distributed as dividends to its shareholders but has kept to reinvest in the business. These amounts are listed on the balance sheet under the heading “Shareholders’ Equity” and represent the residual value of the company after all its debts are paid. The basic accounting equation that connects all of these is Assets = Liabilities + Shareholders’ Equity.
    • How do revenues, expenses, and dividends affect a company’s profitability? Revenues are the income a company earns from its normal business operations, such as sales, service fees, or rent. They are the “earn” component of the income statement. Expenses are the costs a company incurs to generate revenue. This could include salaries, utilities, rent, cost of goods sold, and so on. If revenues exceed expenses, the company is profitable; if expenses exceed revenues, the company is operating at a loss. Dividends are payments of a portion of a company’s profits that are made to the shareholders (owners) of the business. They are not an expense but are instead a distribution of profits, so while they don’t affect net income, they do affect how much profit the company can keep for reinvestment.
    • What are journal entries, and why are they so important in financial accounting? Journal entries are the initial step in recording business transactions. Every journal entry will have at least one debit and at least one credit that balance with each other. They serve to record the financial effects of business transactions (like buying a car, getting a loan, selling services etc) in a formal and organized manner. They adhere to the fundamental accounting equation and follow a consistent debit/credit format so that the effects of each financial transaction are accurately tracked. They create an audit trail and prevent mistakes. Journal entries are very important because, without them, it would be difficult to track where a company’s resources are, what the company owes, and how successful the company is in generating profits. Without a solid understanding of journal entries, it is very difficult to learn more advanced topics in accounting.
    • What is a “T-account” and how is it used in accounting? A T-account is a simple visual representation of a general ledger account. It’s literally shaped like the letter T with the account name (e.g. Cash, Accounts Payable) above the T. The left side of the T is the “debit” side, while the right side is the “credit” side. After a transaction has been recorded in a journal entry, the details are transferred to the appropriate T-accounts, a process called “posting.” This helps to track the increases and decreases in every financial account of the company. T accounts are the basis for preparing financial statements and allow accountants to determine the ending balance of every account.
    • What are adjusting journal entries and what types are common? Adjusting journal entries are made at the end of an accounting period to correct errors, recognize transactions that have occurred over time but not yet been recognized, or to update the financial records. Common adjusting journal entries include: prepaid expenses, where a company pays for something in advance and uses it up over time, like insurance or rent; depreciation, which records the wearing out of long-term assets such as equipment or buildings over time; accrued expenses, which are costs that have built up over time but have not yet been paid (think of interest owed or salaries earned by employees); and finally, accrued revenues, which are revenues earned that have not yet been paid by customers. The core concept is that some transactions don’t happen at a single moment in time; they occur over a period, and it is important to reflect this in a company’s financial statements.
    • What are closing entries and why are they important in the accounting cycle? Closing entries are made at the end of an accounting period to transfer the balances of temporary accounts (like revenues, expenses, and dividends) into a permanent account, which is normally the retained earnings account. Temporary accounts are used only to track an individual year’s performance. Once closed, they start fresh at zero for the next accounting period. The closing process ensures that revenue and expense information is summarized for each period, that they don’t carry forward from year to year, and that the profit generated by a company (net income) flows into retained earnings. Closing entries are a key part of closing one fiscal year and beginning another.
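The debit-equals-credit rule behind journal entries, described above, can be checked mechanically. A minimal sketch, using a hypothetical entry (a $30,000 car bought with $10,000 cash and a $20,000 loan):

```python
# A journal entry must have total debits equal to total credits.
# Hypothetical entry: buy a $30,000 car with $10,000 cash and a $20,000 loan.
entry = [
    ("Vehicles",         "debit",  30_000),
    ("Cash",             "credit", 10_000),
    ("Car loan payable", "credit", 20_000),
]

def is_balanced(entry):
    """Return True when total debits equal total credits."""
    debits = sum(amt for _, side, amt in entry if side == "debit")
    credits = sum(amt for _, side, amt in entry if side == "credit")
    return debits == credits

print(is_balanced(entry))  # True
```

An unbalanced entry (say, a $100 debit against a $90 credit) would fail this check, which is exactly the property that makes journal entries an audit trail.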

    Financial Accounting Fundamentals


    Timeline of Events (as presented in the text):

    • General Accounting Concepts Introduced: Discussion of Assets (things of value), Liabilities (obligations to repay), and Equity (what’s left after liabilities are paid from assets).
    • Explanation of Current vs. Long-term Assets and Liabilities (one year is the cutoff).
    • Explanation of Revenues (earned income), Expenses (costs incurred), and Dividends (shareholder profits taken from retained earnings).
    • Example of Account Classification: Categorization of various accounts as Assets, Liabilities, Equity, Revenue, or Expense (e.g., Long-term Investments, Accounts Receivable, Accounts Payable, Common Shares, etc.).
    • Classification of assets and liabilities as current or long-term.
    • Personal Accounting Mistake and Encouragement: The speaker shares a story about getting a very low mark (28%) on their first accounting exam and the subsequent struggle, but ultimate success in the class and eventual career.
    • The speaker encourages viewers to keep going and improve if they struggle.
    • Introduction to Journal Entries: Explanation of the concept of debits and credits in journal entries, relating them to Newton’s third law (“for every action, there is an equal and opposite reaction”).
    • Example of a purchase (car for cash) to demonstrate journal entries (debit cars, credit cash).
    • Example of a purchase of a car using a loan (debit cars, credit car loan payable).
    • Example of buying a car with both cash and a loan (debit car, credit cash and credit car loan payable).
    • Practice with Journal Entries: Recording of several business transactions using journal entries, including:
    • Share Issuance.
    • Payment of Rent.
    • Borrowing Money.
    • Equipment Purchase (part cash, part payable)
    • Purchase of Supplies on Account.
    • Completion of a Taxidermy Job on Account.
    • Dividend Payment.
    • Payment of Utilities Bill.
    • Payment for a past Equipment Purchase.
    • Receipt of Telephone Bill.
    • Collection of Receivable.
    • Payment for Supplies (Cash).
    • Sale of Taxidermy Services.
    • Rent Revenue.
    • Payment of Salaries.
    • Posting Journal Entries to T-Accounts: Introduction of T-accounts as a way of organizing journal entries into separate accounts (assets, liabilities, equity, revenue, expense).
    • Example of transferring debits and credits to T-accounts.
    • Adjusting Entries: Introduction to the concept of adjusting journal entries, which are not typically triggered by external transactions.
    • Examples of adjusting entries:
    • Prepaid Expenses: The example used was insurance, how to use up that asset over the life of the insurance.
    • Depreciation: Recording the reduction in value of an asset over time.
    • Accrued Expenses: Interest on a loan that is building up (but not yet paid).
    • Accrued Revenue: Revenue earned, but cash not received.
    • Discussion of how these adjusting entries are necessary for properly representing a company’s financial position.
    • Comprehensive Problem 1: A large multi-step problem that combined several concepts:
    • Making adjusting journal entries (for supplies, prepaid insurance, unearned revenue, depreciation etc.)
    • Preparing an Adjusted Trial Balance.
    • Preparing a full set of Financial Statements (Income Statement, Statement of Changes in Equity, Balance Sheet).
    • Closing Entries: Explanation of the purpose of closing entries (to reset temporary accounts).
    • Demonstration of closing entries with a focus on the income statement accounts.
    • Preparation of a Post-Closing Trial Balance.
    • Bank Reconciliations: Explanation of the purpose of a bank reconciliation.
    • Walk-through of bank reconciliation example.
    • Accounts Receivable and Bad Debts: Discussion of accounts receivable and the need for an allowance for uncollectible accounts.
    • Calculation and journal entry for bad debts expense and allowance for doubtful accounts.
    • Explanation of how a “write off” works to remove a bad debt.
    • Inventory and Cost of Goods Sold: Example of a simple inventory purchase and sale with the related journal entries.
    • Example of inventory purchases at multiple prices, and their impact on COGS.
    • Introduction of different inventory costing methods (FIFO, LIFO, Weighted Average).
    • Discussion of the Specific Identification method.
    • Inventory Methods (FIFO, LIFO, Weighted Average): Walk-through of an inventory record example using FIFO (first in, first out).
    • Walk-through of inventory record example using LIFO (last in, first out).
    • Walk-through of inventory record example using weighted average method.
    • Depreciable Assets and Depreciation Methods: Discussion of depreciation for assets with an estimated residual value.
    • Example and calculation of depreciation using the straight-line method, including partial-year depreciation.
    • Example and calculation of depreciation using the units-of-production method.
    • Example and calculation of depreciation using the double-declining-balance method.
    • Sale of Assets: Example of selling a depreciated asset, with calculation of gains and losses on the sale and the related journal entries.
    • Bonds Payable: Discussion of bonds payable issued at both a premium and a discount, and the need to amortize those premiums and discounts.
    • Examples of bond issue, interest payment and discount amortization.
    • Shareholders’ Equity: Discussion of preferred shares and their relative advantages over common shares.
    • Statement of Cash Flows: Explanation of the purpose of the Statement of Cash Flows and its three categories: Operating, Investing, and Financing.
    • Example of the reconciliation of retained earnings to arrive at dividends for the cash flow statement.
    • Preparation of a simple statement of cash flows from a balance sheet and income statement.
    • Financial Statement Analysis (Vertical Analysis): Introduction to vertical analysis and how it is useful for comparing companies of unlike sizes.
    • Examples of preparing a common-sized income statement and a common-sized balance sheet.
    • Financial Ratio Analysis: Introduction to the importance and use of financial ratios for analysis.
    • Calculation and discussion of several financial ratios (current ratio, acid-test ratio, debt-to-equity ratio, return on equity, gross profit margin, return on assets).
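The inventory costing methods listed in the timeline differ only in which unit costs are assigned to cost of goods sold. A minimal sketch in Python, with hypothetical purchase layers, shows how the same sale produces three different COGS figures:

```python
# Inventory costing: FIFO, LIFO, and weighted average over the same purchases.
# Hypothetical data: 10 units at $5.00, then 10 units at $6.00; 12 units sold.
purchases = [(10, 5.00), (10, 6.00)]  # (units, unit cost), in purchase order
units_sold = 12

def fifo_cogs(layers, qty):
    """Consume the oldest cost layers first."""
    cogs = 0.0
    for units, cost in layers:
        take = min(units, qty)
        cogs += take * cost
        qty -= take
        if qty == 0:
            break
    return cogs

def lifo_cogs(layers, qty):
    """Consume the newest cost layers first (FIFO over reversed layers)."""
    return fifo_cogs(list(reversed(layers)), qty)

def weighted_average_cogs(layers, qty):
    """Apply the average cost of all units to the quantity sold."""
    total_units = sum(u for u, _ in layers)
    total_cost = sum(u * c for u, c in layers)
    return qty * total_cost / total_units

print(fifo_cogs(purchases, units_sold))              # 62.0  (10×$5 + 2×$6)
print(lifo_cogs(purchases, units_sold))              # 70.0  (10×$6 + 2×$5)
print(weighted_average_cogs(purchases, units_sold))  # 66.0  (12 × $5.50)
```

When purchase prices are rising, FIFO yields the lowest COGS (and highest reported profit) and LIFO the highest COGS, with weighted average in between.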
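The depreciation methods mentioned above can likewise be sketched numerically. The figures are hypothetical (a $50,000 asset, $5,000 residual value, five-year life); the double-declining-balance loop caps each year's expense so book value never falls below the residual:

```python
# Straight-line vs. double-declining-balance depreciation (hypothetical asset).
cost, residual, life_years = 50_000, 5_000, 5

# Straight-line: the same expense every year.
straight_line = (cost - residual) / life_years  # 9,000 per year

# Double-declining balance: twice the straight-line rate applied to the
# remaining book value, never depreciating below the residual value.
rate = 2 / life_years
book_value, ddb = cost, []
for _ in range(life_years):
    expense = min(book_value * rate, book_value - residual)
    ddb.append(expense)
    book_value -= expense

print(straight_line)         # 9000.0
print(ddb[0], ddb[1])        # 20000.0 12000.0
print(sum(ddb), book_value)  # 45000.0 5000.0
```

Both methods expense the same $45,000 in total; DDB simply front-loads it into the early years.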

    Cast of Characters (Principal People Mentioned):

    • The Instructor (Tony Bell): An accounting professor, presumably the narrator of the videos. He shares personal anecdotes about his own struggles with accounting, provides clear explanations of concepts, and guides viewers through the practice problems. He encourages viewer engagement with likes and subscribes.
    • Isaac Newton: A famous physicist whose third law is used as an analogy to explain the debit and credit relationship in journal entries.
    • Maria: The owner/shareholder of a company, implied in the journal entry example where they take a dividend.
    • W. White: The customer that wrote the bad NSF check in the bank reconciliation example.
    • The Car Dealer: The entity that sells the car to the instructor in the journal entry example.
    • MIT (Massachusetts Institute of Technology): The entity that issues bonds in an illustrative example.
    • Harvard University: The entity used as a competitive example in the bond discussion.
    • Kemp Company: Hypothetical company used in the depreciation examples.
    • Bill’s Towing: The hypothetical company used in the asset sale example.
    • Tinger Inc.: The hypothetical company used in the bond issuance examples.
    • Abdan Automart: The hypothetical company used in the inventory method examples.
    • Romney Inc.: Hypothetical company used in the combined purchase and sale inventory example.
    • Harre Gil & Hussein Inc.: The hypothetical entities compared using Vertical Analysis.


    Understanding the Income Statement

    An income statement, also called the statement of operations or profit and loss (P&L) statement, summarizes a company’s revenues and expenses to determine its profitability [1, 2].

    Key aspects of the income statement, according to the sources:

    • Purpose: To show whether a company was profitable, and if so, how much money it made [1]. It answers the question of whether earnings exceeded costs [2].
    • Components:
    • Revenues are what a company earns from its business activities [3]. Examples include sales revenue, tuition revenue, and rent revenue [3]. Revenues are considered “earned” [3].
    • Expenses are the costs of earning revenue [3]. Examples include salary expense, utilities expense, and maintenance expense [3].
    • Net Income or Profit is calculated by subtracting total expenses from total revenues [1].
    • Format:
    • A proper income statement title includes three lines: the company’s name, the name of the statement, and the date [4].
    • The date must specify the time period the statement covers (e.g., “for the year ended”) [4].
    • Revenues are listed first, followed by expenses [5].
    • A total for expenses is shown [5].
    • The net income is double-underlined [6].
    • Dollar signs are placed at the top of each column and beside any double-underlined number [6].
    • Gross Profit: In a retail business, the income statement includes the cost of goods sold (COGS). Sales revenue minus sales returns and allowances equals net sales. Net sales minus COGS equals gross profit [7, 8].
    • A gross profit percentage can be calculated by dividing gross profit by net sales [9].
    • Operating Income: The income statement lists operating expenses, which, when subtracted from gross profit, gives the operating income or profit [8, 9].
    • Non-operating Items: The income statement may include non-operating expenses, such as interest and income tax [10, 11].
    • Usefulness: An income statement is typically one of the first places an analyst will look to assess a company’s performance [2].

    It is important to note that the income statement should be compared to prior periods to assess whether a company’s profit is trending up or down [6]. An analyst may also compare the income statement to those of other companies [4].
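The multi-step flow described above (net sales → gross profit → operating income) is just successive subtraction. A sketch with hypothetical amounts:

```python
# Multi-step income statement flow (all amounts hypothetical).
sales_revenue = 200_000
sales_returns_and_allowances = 10_000
cogs = 110_000
operating_expenses = 50_000

net_sales = sales_revenue - sales_returns_and_allowances  # 190,000
gross_profit = net_sales - cogs                           # 80,000
operating_income = gross_profit - operating_expenses      # 30,000
gross_profit_pct = gross_profit / net_sales * 100

print(operating_income)            # 30000
print(round(gross_profit_pct, 1))  # 42.1
```

Interest and income tax, being non-operating items, would be deducted after operating income to reach net income.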

    Statement of Changes in Equity

    A statement of changes in equity summarizes how a company’s equity accounts changed over a period of time [1, 2]. The statement details the changes in the owner’s stake in the company [1, 3].

    Key aspects of the statement of changes in equity, according to the sources:

    • Purpose: The statement shows the changes in equity accounts over a period [2]. It summarizes what happened to the shareholders’ equity accounts during the year [1].
    • Components:
    • Beginning Balance: The statement begins with the balances of each equity account at the start of the period [2]. For example, the beginning balance of common shares and retained earnings on January 1st [2].
    • Changes During the Period: The statement then shows how each equity account changed during the period.
    • For common shares, this may include increases from issuing new shares or decreases from repurchasing shares [3, 4].
    • For retained earnings, this includes increases from net income, and decreases from dividends [3, 4].
    • Ending Balance: The statement ends with the balance of each equity account at the end of the period [4].
    • Key Accounts: The main equity accounts that are tracked are:
    • Common shares [1, 3] (also called share capital [3]) which represents the basic ownership of the company [3].
    • Retained earnings [1, 3] which represents the accumulated profits of the company that have not been distributed to shareholders [3].
    • Preferred shares, which are a class of shares that have preferential rights over common shares, such as a preference for dividends [5].
    • Dividends:
    • Dividends represent the distribution of profits to shareholders [6].
    • Cash dividends reduce retained earnings and shareholders’ equity [3].
    • A stock dividend involves issuing new shares to existing shareholders [7]. This does not affect the total value of shareholders’ equity [8].
    • Format:
    • The statement includes a three-line title: company name, the name of the statement, and the date [2].
    • The date specifies the period the statement covers (e.g., “for the year ended”) [2].
    • Each equity account is listed as a column heading [2].
    • Dollar signs are placed at the top of each column and beside any double-underlined numbers [4].
    • Relationship to Other Statements: The net income from the income statement is used to calculate the change in retained earnings [4, 9].
    • The ending balances of the equity accounts are carried over to the balance sheet [10].
    • The changes in retained earnings shown on the statement of changes in equity are captured in the closing journal entries [9].

    In summary, the statement of changes in equity provides a detailed view of how the owners’ stake in the company has changed over time, linking the income statement and the balance sheet [1].
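The retained-earnings column of this statement follows a simple roll-forward, which can be sketched with hypothetical figures:

```python
# Retained-earnings roll-forward on the statement of changes in equity:
# ending balance = beginning balance + net income - dividends.
beginning_retained_earnings = 40_000  # hypothetical
net_income = 25_000                   # from the income statement
dividends = 5_000                     # distributions to shareholders

ending_retained_earnings = beginning_retained_earnings + net_income - dividends
print(ending_retained_earnings)  # 60000
```

The ending balance computed here is the figure carried over to the balance sheet's equity section.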

    Understanding the Balance Sheet

    A balance sheet, also called the statement of financial position, is a financial statement that presents a company’s assets, liabilities, and shareholders’ equity at a specific point in time [1, 2]. The balance sheet is based on the fundamental accounting equation: Assets = Liabilities + Shareholders’ Equity [3].

    Key aspects of the balance sheet, according to the sources:

    • Purpose: To provide a snapshot of what a company owns (assets), what it owes (liabilities), and the owners’ stake in the company (equity) at a specific date. It shows the financial position of the company at that moment in time [2].
    • Components:
    • Assets: These are things a company owns or controls that have value [4, 5]. They are resources with future economic benefits [5]. Assets are listed in order of liquidity, from most to least liquid [6].
    • Current assets are expected to be converted to cash or used up within one year [7]. Examples include cash, accounts receivable, inventory, and office supplies [5, 7, 8].
    • Long-term assets, also called property, plant, and equipment (PP&E), are assets that are not expected to be converted to cash or used up within one year. Examples include buildings, land, and equipment [5].
    • Assets are recorded at their net book value, which is the original cost minus any accumulated depreciation [9].
    • Liabilities: These are obligations of the company to others, or debts that must be repaid in the future [10]. They represent future economic obligations [10]. Liabilities are also categorized as either current or long-term.
    • Current liabilities are obligations due within one year [7]. Examples include accounts payable, wages payable, and notes payable [10].
    • Long-term liabilities are obligations due in more than one year. Examples include bank loans and mortgages [10].
    • Shareholders’ Equity: This represents the owners’ stake in the company, and is the residual interest in the assets of the company after deducting liabilities [3].
    • Key accounts include common shares (or share capital) and retained earnings [11].
    • Retained earnings are the accumulated profits that have not been distributed to shareholders [11].
    • Format:
    • The balance sheet has a three-line title: company name, the name of the statement, and the date [2].
    • Unlike the income statement or statement of changes in equity, the balance sheet is dated for a specific point in time, not for a period (e.g., “December 31, 2024,” not “for the year ended”) [2].
    • Assets are typically listed on the left side, and liabilities and shareholders’ equity are on the right side [6].
    • Assets are listed in order of liquidity, from the most current to the least [6].
    • Dollar signs are placed at the top of each column and beside any double-underlined numbers [12, 13].
    • Relationship to other Statements:
    • The ending balances of the equity accounts are taken from the statement of changes in equity [14].
    • The balance sheet provides information for the statement of cash flows, particularly for noncash assets and liabilities [15].
    • Balancing: The balance sheet must always balance, meaning that total assets must equal total liabilities plus total shareholders’ equity [1, 6].

    In summary, the balance sheet provides a fundamental overview of a company’s financial position at a specific point in time, showing the resources it controls, its obligations, and the owners’ stake in the company [2].

    Financial Ratio Analysis

    Financial ratios are calculations that use data from financial statements to provide insights into a company’s performance and financial health [1]. They are used to analyze and compare a company’s performance over time or against its competitors [1-3].

    Here’s a breakdown of key financial ratios discussed in the sources, categorized by the aspects of a company they assess:

    I. Liquidity Ratios These ratios measure a company’s ability to meet its short-term obligations [4, 5].

    • Current Ratio: Calculated as current assets divided by current liabilities [4, 6]. It indicates whether a company has enough short-term assets to cover its short-term debts [4, 6].
    • A general rule of thumb is that a current ratio above 1.5 is considered safe [5]. However, this may not apply to all companies [5].
    • A higher ratio generally indicates better liquidity [5].
    • Acid-Test Ratio (or Quick Ratio): Calculated as (cash + short-term investments + net current receivables) divided by current liabilities [7, 8]. This ratio is a stricter measure of liquidity, focusing on the most liquid assets.
    • A general rule of thumb is that an acid-test ratio of 0.9 to 1 is desirable [7].
    • It excludes inventory and prepaid expenses from current assets [7, 8].
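Both liquidity ratios can be computed directly from the balance-sheet lines listed above. A sketch with hypothetical figures, showing how the acid-test ratio drops the less-liquid items:

```python
# Liquidity ratios from hypothetical balance-sheet figures.
cash = 20_000
short_term_investments = 10_000
net_current_receivables = 25_000
inventory = 30_000
prepaid_expenses = 5_000
current_liabilities = 50_000

current_assets = (cash + short_term_investments + net_current_receivables
                  + inventory + prepaid_expenses)

current_ratio = current_assets / current_liabilities
# Acid-test (quick) ratio excludes inventory and prepaid expenses.
acid_test = (cash + short_term_investments
             + net_current_receivables) / current_liabilities

print(current_ratio)  # 1.8
print(acid_test)      # 1.1
```

Here both ratios clear their rules of thumb (1.5 and 0.9–1.0 respectively), suggesting the hypothetical company can cover its short-term debts.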

    II. Turnover (Efficiency) Ratios These ratios measure how efficiently a company is using its assets [8].

    • Inventory Turnover: Calculated as cost of goods sold (COGS) divided by average inventory [8]. It measures how many times a company sells and replaces its inventory during a period [8].
    • A higher turnover indicates better efficiency [9].
    • Receivables Turnover: Calculated as net sales divided by average net accounts receivable [9]. It measures how many times a company collects its average accounts receivable during a period [9].
    • A higher turnover indicates a company is more effective in collecting its debts [9].
    • Days to Collect Receivables: Calculated as 365 divided by receivables turnover [9]. It measures the average number of days it takes a company to collect payment from its customers [9].
    • A lower number is generally better, as it indicates a company is collecting payments more quickly [9].
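The two receivables metrics above are directly linked: days to collect is just 365 divided by the turnover. A sketch with hypothetical figures:

```python
# Receivables turnover and days to collect (hypothetical figures).
net_sales = 730_000
average_net_receivables = 73_000

receivables_turnover = net_sales / average_net_receivables  # 10.0 times per year
days_to_collect = 365 / receivables_turnover                # 36.5 days on average

print(receivables_turnover, days_to_collect)  # 10.0 36.5
```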

    III. Long-Term Debt-Paying Ability Ratios These ratios assess a company’s ability to meet its long-term obligations and its leverage [9].

    • Debt Ratio: Calculated as total liabilities divided by total assets [9]. It indicates the proportion of a company’s assets that are financed by debt [9].
    • A lower debt ratio is generally considered safer, as it indicates less reliance on debt financing [9, 10].
    • Times Interest Earned: Calculated as operating income divided by interest expense [10]. It measures a company’s ability to cover its interest expense with its operating income [10].
    • A higher ratio indicates a greater ability to pay interest [10].

    IV. Profitability Ratios These ratios measure a company’s ability to generate profits from its operations [10].

    • Gross Profit Percentage: Calculated as gross profit divided by net sales [11]. It measures a company’s profitability after accounting for the cost of goods sold [11].
    • A higher percentage indicates a better ability to generate profit from sales [11].
    • Return on Sales: Calculated as net income divided by net sales [11]. It measures how much profit a company generates for each dollar of sales [11].
    • A higher percentage indicates better profitability [11].
    • Return on Assets (ROA): Calculated as (net income + interest expense) divided by average total assets [11]. It measures how effectively a company is using its assets to generate profit [11].
    • A higher ROA indicates better asset utilization and profitability [12].
    • Return on Equity (ROE): Calculated as (net income – preferred dividends) divided by average common shareholders’ equity [12]. It measures how much profit a company generates for each dollar of shareholders’ equity [12].
    • A higher ROE indicates better returns for shareholders [12].
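ROA and ROE as defined above differ in both numerator and denominator; a sketch with hypothetical figures makes the two adjustments explicit:

```python
# Return on assets and return on equity (hypothetical figures).
net_income = 45_000
interest_expense = 5_000        # added back for ROA
average_total_assets = 500_000
preferred_dividends = 5_000     # deducted for ROE
average_common_equity = 200_000

roa = (net_income + interest_expense) / average_total_assets      # 0.10
roe = (net_income - preferred_dividends) / average_common_equity  # 0.20

print(roa, roe)  # 0.1 0.2
```

ROA adds back interest so that asset productivity is measured independently of how the assets were financed; ROE strips out preferred dividends so the return is attributed to common shareholders only.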

    V. Stock Market Performance Ratios These ratios assess a company’s performance from the perspective of stock market investors [13].

    • Price-Earnings Ratio (P/E Ratio): Calculated as market price per share divided by earnings per share [13]. It indicates how much investors are willing to pay for each dollar of a company’s earnings [13].
    • A higher P/E ratio may indicate that a stock is overvalued [1, 13].
    • Dividend Yield: Calculated as dividends per share divided by market price per share [13]. It indicates the percentage of the stock price that is returned to shareholders as dividends [13].
    • A higher yield can be attractive to income-focused investors [13].

    Additional Notes:

    • Horizontal Analysis compares financial data over different time periods (e.g. year over year) [14].
    • Vertical Analysis (or Common-Size Analysis) expresses each item in a financial statement as a percentage of a base number, such as net sales for the income statement or total assets for the balance sheet [3]. This helps compare companies of different sizes [3].
    • When analyzing ratios, it is important to compare them to industry averages or to a company’s historical performance to assess if the ratio is considered good or bad [1, 2].
    • It is important to note that a ratio may be interpreted differently depending on the company and industry [5, 10].
    • Many companies will focus on gross profit percentages, and will be especially interested if costs of goods sold are outpacing sales, impacting margins [2].
    • Analysts are typically interested in seeing positive and growing operating cash flows from the statement of cash flows [15].
    • A company’s cash flow statement and ratios are often used to determine if the company has enough cash on hand to meet its short-term obligations [16].

    Bank Reconciliation: A Comprehensive Guide

    A bank reconciliation is a process that compares a company’s cash balance as per its own records (book balance) with the corresponding cash balance reported by its bank (bank balance) [1]. The goal is to identify and explain any differences between these two balances and to correct any errors or omissions [1].

    Here are key points about bank reconciliations based on the sources:

    • Purpose:
    • To identify discrepancies between the bank’s record of cash and the company’s record of cash [1].
    • To ensure that a company’s cash records are accurate and up to date.
    • To identify errors made by either the company or the bank and make corrections to those errors [1, 2].
    • To detect fraud or theft by identifying unauthorized transactions [1, 2].
    • To provide better internal control of cash [1].
    • Timing: Bank reconciliations are typically prepared monthly [1].
    • Format: A bank reconciliation typically starts with the ending balance per bank statement and the ending balance per the company’s books [2].
    • It includes adjustments to each of these balances to arrive at an adjusted or reconciled cash balance [2].
    • The format of a bank reconciliation resembles a balance sheet, where the left side pertains to the bank’s perspective and the right side pertains to the company’s perspective [3].
    • Items Causing Differences:
    • Bank side adjustments: These are items the company has already recorded in its books but that the bank has not yet processed, so they adjust the bank balance.
    • Deposits in transit: Deposits made by the company but not yet recorded by the bank [3].
    • Outstanding checks: Checks written by the company but not yet cashed by the recipients, and thus not yet deducted from the bank balance [3, 4].
    • Book side adjustments: These are items the bank has already recorded but that the company does not know about until it receives the bank statement, so they adjust the book balance [5].
    • Non-sufficient funds (NSF) checks: Checks received from customers that have bounced due to insufficient funds in the customer’s account [6].
    • Bank collections: Amounts collected by the bank on the company’s behalf, such as notes receivable [6].
    • Electronic funds transfers (EFT): Payments or collections made electronically that may not yet be recorded by the company [6].
    • Bank service charges: Fees charged by the bank [6].
    • Interest earned: Interest credited to the company’s account by the bank [6].
    • Errors: Mistakes in recording transactions by either the bank or the company [2].
    • For example, the company may have recorded a check for an incorrect amount [2]. If a check was recorded for too much, cash needs to be debited by the difference, and vice versa [6, 7].
    • Steps in Preparing a Bank Reconciliation:
    1. Start with the ending cash balance per the bank statement and the ending cash balance per the company’s books [3].
    2. Identify and list all the deposits in transit and outstanding checks, and make the necessary additions to or subtractions from the bank balance [3, 4].
    3. Identify and list all items that need to be adjusted on the book side, such as NSF checks, bank collections, electronic funds transfers, bank service charges, and errors [5-7].
    4. Make the necessary additions to or subtractions from the book balance [5-7].
    5. Calculate the adjusted or reconciled cash balance on both the bank and book sides [5, 7]. These adjusted balances should be the same if the reconciliation is done correctly.
    • Journal Entries:
    • Journal entries are required for the adjustments made to the company’s book balance [7].
    • These entries are made to correct the company’s cash account for items that the company did not know about, as well as any errors discovered during the bank reconciliation process.
    • All of these entries will involve the cash account [7, 8].
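    The steps above amount to simple arithmetic on the two balances. The following sketch uses hypothetical account names and amounts to show both sides reconciling to the same adjusted cash balance; it is an illustration of the procedure, not a real accounting system.

    ```python
    # Illustrative bank reconciliation (all amounts hypothetical).
    def reconcile(bank_balance, book_balance,
                  deposits_in_transit, outstanding_checks,
                  bank_collections, interest_earned,
                  nsf_checks, service_charges):
        # Bank side: add deposits in transit, subtract outstanding checks.
        adjusted_bank = (bank_balance
                         + sum(deposits_in_transit)
                         - sum(outstanding_checks))
        # Book side: add collections and interest the bank recorded for us,
        # subtract NSF checks and bank service charges.
        adjusted_book = (book_balance
                         + sum(bank_collections) + interest_earned
                         - sum(nsf_checks) - service_charges)
        return adjusted_bank, adjusted_book

    bank, book = reconcile(
        bank_balance=10_450.00, book_balance=10_320.00,
        deposits_in_transit=[500.00], outstanding_checks=[800.00],
        bank_collections=[200.00], interest_earned=5.00,
        nsf_checks=[350.00], service_charges=25.00)
    # If the reconciliation is done correctly, both sides agree.
    assert bank == book
    ```

    Only the book-side adjustments (collections, interest, NSF checks, service charges) then require journal entries against the cash account, as noted above; the bank-side items correct themselves once the bank processes them.
    
    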

    In summary, a bank reconciliation is a critical control activity that ensures the accuracy of a company’s cash records. It involves comparing the bank’s records to the company’s records, identifying any discrepancies, and making necessary adjustments to both sets of records. The process helps maintain accurate financial statements and protect the company from errors and fraud [1].

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Al Riyadh Newspaper: March 04, 2025 Charitable Campaigns, Diverse Insights, Efforts to Mediate

    Al Riyadh Newspaper: March 04, 2025 Charitable Campaigns, Diverse Insights, Efforts to Mediate

    The provided texts offer diverse insights into recent events and ongoing initiatives in Saudi Arabia and the wider world. Several articles focus on Saudi Arabia’s domestic affairs, highlighting cultural events, economic developments, and charitable campaigns, including those associated with Ramadan. A key focus is on Saudi Arabia’s foreign relations and its role in regional diplomacy, including the Lebanese president’s visit and efforts to mediate in the Russia-Ukraine conflict. Furthermore, some items discuss economic issues such as oil prices, stock market performance, and trade relations, both within the Kingdom and on a global scale. There is also coverage on events surrounding the Israeli-Palestinian conflict, and the impacts of geopolitical issues on oil prices. Finally, one article focuses on how brands such as Bata incorporate sustainable practices in order to foster a good public image.

    Study Guide: Saudi Arabia in 2025 – A Deep Dive

    Quiz (Short Answer)

    1. According to the text, what are the main benefits of the agreement between the US and Ukraine on rare minerals?
    2. What role does the Crown Prince of Saudi Arabia play in supporting charitable work, as mentioned in the text?
    3. How does the Saudi Vision 2030 relate to the economic cooperation between Saudi Arabia and Lebanon, according to the text?
    4. What are some of the consequences of Netanyahu’s actions in Gaza, according to the Israeli analyst Harel?
    5. What is the significance of the “Year of Handicrafts 2025” initiative in Saudi Arabia, as presented in the text?
    6. What is the “Markaz,” and what is its significance during Ramadan in Jeddah?
    7. What is the Saudi Forum for Media intended to do for the Kingdom?
    8. What are the two sides hoping to get out of the Saudi Pro League?
    9. What is the main impact of the Russian-Ukrainian war for European allies?
    10. According to the source, how does Bata integrate sustainable practices in its company culture?

    Quiz Answer Key

    1. The agreement allows the US to invest in rare minerals needed for industries like AI and renewable energy and provides Ukraine with security assurances amid the war. It would give the US investment rights to 50% of Ukraine’s rare minerals.
    2. The Crown Prince actively supports charitable initiatives with significant funding, demonstrating the Kingdom’s commitment to improving citizens’ lives and promoting social solidarity. He donated 150 million riyals.
    3. Saudi Vision 2030 aligns with efforts to boost Saudi-Lebanese economic cooperation, particularly in sectors like technology and energy, fostering investment and development in Lebanon.
    4. Harel suggests that Netanyahu’s policies lead to resumed conflict, increased Israeli casualties, and the potential sacrifice of hostages due to Hamas’ strengthened defensive capabilities.
    5. It aims to boost the cultural tourism sector by promoting traditional crafts, supporting local artisans, and raising awareness of Saudi Arabia’s heritage, while integrating them into the new digital economy.
    6. The “Markaz” is a traditional gathering place, especially in historic Jeddah neighborhoods, where people gather to socialize, share Ramadan drinks, and enjoy traditional sweets, strengthening social bonds during the holy month.
    7. The Saudi Forum for Media intends to promote Saudi Arabia’s interests, enhance its positive image, and foster communication with global media organizations, highlighting the Kingdom’s progress and development in various sectors.
    8. Both teams hope to overcome their challenges and push for victory and the Asian Championship as they work toward the elimination stage.
    9. The Russian-Ukrainian war has encouraged allies like those in Europe to increase their defense spending in light of the dangers associated with war.
    10. Bata strategically integrates sustainability by launching internal awareness campaigns, publishing reports on sustainability initiatives, participating in dialogues with stakeholders, and engaging with media to improve its sustainable initiatives.

    Essay Format Questions

    1. Analyze the evolving relationship between Saudi Arabia and Lebanon as depicted in the article, considering historical ties and future economic prospects.
    2. Discuss the implications of the Israeli-Palestinian conflict, focusing on the role of international actors and the humanitarian consequences, as presented in the provided document.
    3. Evaluate the significance of Saudi Arabia’s Vision 2030 in the context of its domestic social and economic reforms and its role in regional and international affairs.
    4. Examine the impact of the Russian-Ukrainian war on global energy markets and the diplomatic efforts to resolve the conflict, as presented in the provided news excerpts.
    5. Assess the role of media and cultural initiatives in promoting Saudi Arabia’s national identity and its engagement with the international community, using examples from the document.

    Glossary of Key Terms

    • ولي العهد (Wali al-Ahd): Crown Prince. The designated successor to the throne in a monarchy.
    • رؤية 2030 (Ru’yah 2030): Vision 2030. Saudi Arabia’s strategic framework to reduce its dependence on oil, diversify its economy, and develop public service sectors.
    • مبادرات (Mubadarat): Initiatives. New programs or plans, often related to development or reform.
    • التحديات (Al-Tahadiyat): Challenges. Difficulties or obstacles that need to be overcome.
    • السلام (As-Salam): Peace. The condition marked by the absence of war or conflict.
    • القضايا المستدامة (Al-Qadaya al-Mustadama): Sustainable issues. Problems or challenges that deal with responsible resource management and environmental consciousness.
    • المنظمة الصحية (Al-Munazzama as-Sihiyya): Health Organization. Typically refers to an entity focused on health, like a public health ministry.
    • المسجد النبوي (Al-Masjid an-Nabawi): The Prophet’s Mosque. One of the holiest sites in Islam, located in Medina.
    • القطاع الصحي (Al-Qita’ as-Sihi): The Health Sector. Encompasses all health-related services and institutions.
    • الإعلام (Al-I’lam): Media. Various means of communication, such as newspapers, television, and the internet.
    • الحرف اليدوية (Al-Hiraf al-Yadawiyya): Handicrafts. Objects made by hand, often representing traditional culture.
    • دبلوماسية (Diblomasiya): Diplomacy. The art and practice of conducting negotiations between representatives of states.
    • الإصلاحات (Al-Islahat): Reforms. Changes to improve a system or institution.

    Al Riyadh Newspaper Analysis: Themes and Key Ideas


    Briefing Document: Analysis of Themes and Key Ideas

    Date: October 26, 2023

    Source: Excerpts from “20705.pdf,” a daily newspaper published by Al Yamamah Press Foundation (Issue: 20705, dated March 4, 2025)

    Executive Summary:

    This document summarizes the main themes and key ideas found in the provided excerpts from the Al Riyadh newspaper. The articles cover a range of topics, including Saudi Arabia’s charitable initiatives, diplomatic relations with Lebanon and Bulgaria, internal matters such as the appointment of new governors, global economics (US-China Trade), and the Russian-Ukraine war, among others. The texts highlight Saudi Arabia’s commitment to domestic and international development, humanitarian aid, and regional stability, while also addressing pressing global issues and their impact on the Kingdom.

    Key Themes and Ideas:

    1. Saudi Arabia’s Commitment to Charity and Humanitarian Aid:
    • The excerpts emphasize the Kingdom’s dedication to charitable work, both domestically and internationally. This is portrayed as a long-standing tradition, particularly amplified during Ramadan.
    • Quote (translated from Arabic): “The Kingdom is accustomed to quality initiatives, upgrading its programs, and activating charity, creating an impact, with a systematic approach in its projects so that their fruits reach the largest number of beneficiaries. These programs continue throughout the year and are intensified during the blessed month of Ramadan, when individuals and state institutions compete in doing good in pursuit of reward from God, encouraged by the rulers, who spare no effort in supporting all initiatives.”
    • Specific initiatives, such as the King Salman Center for Relief and Humanitarian Aid, are highlighted as models for effective aid delivery.
    • Quote (translated from Arabic): “If the ‘Joud’ campaign is a model for doing good within the Kingdom’s regions, the King Salman Center for Relief, with its diverse programs and projects, remains the most prominent model for doing good beyond its borders, confirming the Kingdom’s leadership in devising and implementing noble humanitarian initiatives.”
    • The “Joud” campaign is mentioned as a model, along with King Salman Center which also operates beyond the Kingdom’s borders.
    • Prince Faisal bin Salman launched the “Ajer Ghair Mamnoon” (“Unfailing Reward”) campaign to support charitable giving, together with the “Al Shefa” health waqf fund.
    2. Saudi-Lebanese Relations:
    • The visit of the Lebanese President to Saudi Arabia is framed as a turning point in relations, marking a new phase of cooperation after a period of challenges.
    • Quote (translated from Arabic): “In a step that reflects the depth of the historical ties between Lebanon and the Kingdom of Saudi Arabia, Lebanese President Joseph Aoun arrived in Riyadh as a guest, in response to a generous invitation from His Royal Highness Prince Mohammed bin Salman, Crown Prince and Prime Minister. This visit represents a turning point in the course of relations between the two countries, closing an era that witnessed many challenges and marking the beginning of a new phase of cooperation.”
    • Saudi Arabia’s “Vision 2030” is mentioned as an opportunity for Lebanon, and the Crown Prince’s efforts to support Lebanese stability are highlighted.
    • The historical support of Saudi Arabia for Lebanon, dating back to the reign of King Abdulaziz Al Saud and its role in the Taif Agreement, are also referenced.
    • The visit renewed hopes for economic ties, particularly in agricultural, industrial, and electronic products.
    • Investment opportunities are opening in technology, energy aligning with Vision 2030
    3. Saudi-Bulgarian Relations:
    • The Saudi leadership sent congratulatory messages to the President of Bulgaria on the occasion of the country’s National Day, indicating positive diplomatic engagement.
    4. Local events:
    • Princess Fahda bint Falah Al Hithlain sponsored the awarding of the King Salman Prize for memorizing the Quran
    • Prince Faisal bin Mishaal visits judges and sheikhs
    5. Israeli-Palestinian Conflict:
    • The article cites an Israeli analyst who criticizes Netanyahu’s government for taking “adventurous” steps in the Palestinian territories, Syria, and Gaza.
    • Quote (translated from Arabic): “Israeli analysts warned of the consequences of the steps taken by the head of the Israeli government, Benjamin Netanyahu, in recent days with the support of the administration of US President Donald Trump, and pointed out that Trump’s current positions may change according to his interests.”
    • The analyst alleges violations of agreements, expansion of settlements, and restrictions on humanitarian aid to Gaza.
    • The article touches on potential international criticism and accusations of “deliberate starvation” against Israel.
    • The capture of 187 Palestinians
    6. Global Economic Issues:
    • The article addresses the potential impact of a trade war between the US and China, as well as the US with Canada and Mexico.
    • Quote (translated from Arabic): “Given the American president’s insistence, the extent of the difficulties Ukraine faces is evident; the American administration sees Kiev as indebted to it through this agreement, in return for American financial support.”
    • The potential for a “minerals agreement” with Ukraine is discussed in the context of the Russia-Ukraine war and US support.
    • The article touches on the need for international partnerships and cooperation in the face of complex economic challenges.
    7. Russia-Ukraine War:
    • The article discusses French President Macron’s proposal for a truce in Ukraine during the Olympics, focusing on energy infrastructure.
    • Concerns about the war’s impact on global oil supplies and prices are mentioned.
    • There is an analysis of the conflict’s impact on the world economy.
    • It was reported that Zelensky offered his resignation in exchange for Ukraine’s NATO membership.
    8. Oil Market Dynamics:
    • The impact of the Trump-Zelensky “shouting match” on the global oil market.
    • Reports on attacks on Russian refineries impacting exports.
    9. Financial Market Activity:
    • Increase in gold prices as a safe haven
    • Increase in the Saudi stock market index (TASI)
    10. Cultural Initiatives:
    • Three films supported by the Red Sea Film Foundation win awards at the Berlin Film Festival
    • The “Abu Samel” family returns in “Jack Al-Alam 2”
    • Highlight on the “Year of Handicrafts 2025”
    • Importance of traditional medicine
    11. Other domestic events:
    • Arar’s Ramadan traditions
    • The inauguration of three free bus stations
    12. Sports:
    • Al-Nassr ties with Al-Istiqlal
    • Al-Ahli faces Al-Rayyan
    • Al-Hilal faces Pakhtakor
    • Al-Ittihad fails to beat Al-Akhdoud
    • The Saudi national weightlifting team travels to Turkey for preparations

    Quotes and specific article titles:

    • “The Kingdom and Lebanon Close the Page on ‘Challenges’”
    • “Saudi Arabia and Lebanon are turning a new page on economic relations. 100,000 Lebanese residents welcome the return of momentum to relations between the two countries”
    • “Israeli Analyst: Netanyahu’s Government Behaves Adventurously on All Fronts”
    • “Clash between Trump and Zelensky.. Disrupts Oil Markets”
    • “Russian-Ukrainian War and the Expected Riyadh Summit”
    • “Metals of Ukraine… and the American-Chinese Separation”
    • “‘Deaf with Health’… Awareness Efforts for Quality of Life”
    • “90% of residents of the Jenin camp displaced”

    Potential Implications:

    • The Saudi focus on charity and aid reinforces its image as a responsible global actor and leader in the Islamic world.
    • Improved Saudi-Lebanese relations could lead to increased economic cooperation and regional stability.
    • The concerns raised about Israeli policies may reflect a desire for a more balanced approach to the Israeli-Palestinian conflict.
    • Economic analysis suggests a cautious approach to global trade tensions, with a focus on diversification and partnerships.
    • Coverage of the Russia-Ukraine war highlights the need for diplomatic solutions and mitigation of economic consequences.

    Further Research:

    • Investigate the specific details of the “Joud” campaign and other Saudi charitable initiatives.
    • Analyze the economic impact of renewed Saudi-Lebanese cooperation.
    • Examine Saudi Arabia’s position on the Israeli-Palestinian conflict in greater depth.
    • Assess the potential consequences of a US-China trade war on the Saudi economy.


    Global Affairs and Saudi Arabia’s Initiatives

    What is the significance of the Saudi campaign to promote good deeds during Ramadan?

    The campaign, supported by Saudi leadership, encourages charitable acts, highlights Islamic values, and fosters social solidarity. It provides support to citizens in need and aims to ensure adequate housing and promote unity. It emphasizes innovative initiatives, and aims to serve as an example for global relief efforts, reinforcing Saudi Arabia’s leadership in noble endeavors.

    What does the visit of the Lebanese President to Saudi Arabia signify?

    The visit signifies a renewal of economic ties and reflects the deep historical relations between Lebanon and Saudi Arabia. It marks a turning point in the relationship and the beginning of a new phase of cooperation, with the Saudi Vision 2030 offering opportunities for Lebanon’s development. It is also hoped to boost financial stability between the two countries.

    How is the Israeli government behaving, according to analysts, and what are the potential consequences?

    According to Israeli analysts, the Netanyahu government is acting recklessly on all fronts, with the support of the U.S. administration. This includes violating agreements, seizing Syrian territory, threatening intervention in Syria, and restricting aid to Gaza. These actions risk reigniting conflict and sacrificing the well-being of hostages, as well as potentially further destabilizing the region.

    What is the “Ajer Ghair Mamnoon” campaign and what are its goals?

    The “Ajer Ghair Mamnoon” campaign, launched by Prince Faisal bin Salman, aims to promote charitable giving during Ramadan. It encourages individuals, organizations, and donors to contribute to the Waqf Fund, which supports healthcare initiatives and provides assistance to beneficiaries in Medina and Mecca. The campaign reflects Islamic values and fosters social cohesion.

    What are the economic implications of the tension between President Trump and President Zelensky?

    The tension between Presidents Trump and Zelensky has broader economic implications, potentially disrupting oil markets and global trade. Trump’s trade policies, including tariffs on goods from China, Mexico, and Canada, could harm the American economy and lead to increased inflation. The article also mentions the importance of stable relationships with oil exporting countries like Russia and Iraq.

    How are the arts and cultural programs in Saudi Arabia being promoted?

    Saudi Arabia actively promotes arts and culture through initiatives like the Red Sea International Film Festival, which supports local filmmakers and showcases Saudi talent on the global stage. Additionally, the Ministry of Culture’s initiative to recognize 2025 as the “Year of Handicrafts” aims to preserve and promote traditional crafts as a vital part of Saudi cultural heritage and tourism.

    How has the conflict between Russia and Ukraine impacted global oil prices and what factors might contribute to price stabilization?

    The Russia-Ukraine war disrupted global oil supplies, leading to price volatility. Attacks on Russian refineries further exacerbated concerns about exports of refined products. However, potential factors that could stabilize prices include increased oil production by OPEC+, a potential peace agreement between Russia and Ukraine, and increased U.S. pressure on Iraq to resume exports from the Kurdistan region.

    What are the diplomatic efforts aiming to address the Russia-Ukraine conflict, and what are the challenges involved?

    Diplomatic efforts include proposals for ceasefires during specific periods, but challenges persist, including Ukraine’s desire to regain territory and concerns over Russia’s territorial control. Negotiations are underway, with the United States playing a key role, but reaching a resolution that satisfies all parties remains difficult. The importance of effective diplomacy to mitigate conflict and promote sustainable solutions is emphasized.

    Saudi Arabia: Culture, Diplomacy, and Humanitarian Efforts

    Saudi Arabia’s internal and external efforts, plus some traditions, are mentioned throughout the sources:

    • Leadership & Philanthropy: Saudi Arabia is recognized for initiating programs, improving them, and activating charitable projects to benefit a large number of people. The state’s institutions compete in doing good, encouraged by the government.
    • Humanitarian Aid: The Kingdom has several humanitarian initiatives. The King Salman Center serves as a model for relief efforts inside the Kingdom. It is also considered a leading example for charitable work outside its borders through its diverse programs and projects.
    • Relationship with Lebanon: Saudi Arabia plays a pivotal role in supporting Lebanon, with deep-rooted historical ties dating back to the era of King Abdulaziz Al Saud. The Kingdom is portrayed not just as an economic gateway but also as a political partner to Lebanon. The Saudi market is a main source of Lebanese exports.
    • Cultural and Religious Significance: مكة (Mecca) is recognized as a central religious hub for Muslims. المدينة (Medina) is a destination for pilgrims. The country emphasizes the values of Islamic authenticity, societal cohesion, and sustained giving.
    • Economic Development: Saudi Arabia aims to achieve sustainable development goals by fostering a conducive environment for all citizens.
    • Modernization & Vision 2030: The Kingdom’s Vision 2030 aims to boost high-quality linguistic initiatives, strengthen identity, and enrich Arabic content.
    • رمضان (Ramadan) Celebrations & Traditions:
    • Many people decorate the facades of houses in Jeddah with illuminated lanterns and Ramadan decorations that reflect the spiritual atmosphere of the month of Ramadan.
    • In Jazan, presenting “الماء المبخر” (mubakhkhar water, water perfumed with incense) is a tradition that symbolizes hospitality and generosity.
    • In the northern region, نقش الحناء (henna designs) are used to encourage young girls to fast.
    • Global Diplomacy: Saudi Arabia is emerging as a crucial player in the changing geopolitical landscape, tackling challenges through dialogue. The upcoming summit hosted by Saudi Arabia, with the participation of the Crown Prince, is portrayed as a vital opportunity to stop losses and work towards a fair and lasting peace.

    Russia-Ukraine War: Negotiations, Global Impact, and Key Players

    Here’s what the sources say about the Russia-Ukraine war:

    • Saudi Arabia’s Role: Saudi Arabia is seen as a key regional player in resolving the Russia-Ukraine war.
    • US-Ukraine Relations: The US administration sees an agreement with Ukraine as a way to address challenges. However, conflicting reports suggest that the US president and the President of Ukraine had a heated meeting, and that the Ukrainian President left without an anticipated agreement concerning sharing rights to Ukrainian metals.
    • Negotiations & Peace Talks:
    • European leaders are proposing a month-long truce in Ukraine.
    • There are increasing doubts about a US-brokered peace agreement between Russia and Ukraine.
    • The Ukrainian President’s advisor was critical of the US for trying to end the war while, in their view, the Ukrainian President has different goals.
    • Global Impact:
    • The potential disruption of a peace agreement between Russia and Ukraine is causing instability.
    • The war is contributing to uncertainty and fluctuations in global markets, including oil prices.
    • The conflict is a significant concern for the United States and European countries.
    • Turkey’s Role: Turkey halted the pipeline carrying oil exports from the Kurdistan region in March 2023 and is ready to resume its operation.

    Saudi Football League: Competition, Teams, and Players

    Here’s what the sources say about football leagues:

    • Saudi League Importance: The Saudi League garners significant attention from followers and football enthusiasts.
    • Competition and Excitement: The presence of approximately eight teams vying to avoid relegation to the First Division enhances the competitiveness and excitement of the league matches. These teams strive to win to secure their position among the top teams and remain in what is considered one of the greatest Arab leagues.
    • Increased Competition: The current situation in the league makes the competition stronger among all competing teams.
    • Team Efforts and Fan Expectations: Team coaches are doing everything they can, but more effort is needed to realize the dreams of sports fans.
    • Potential for Upsets: There is an acknowledgement that the final weeks of the league could be unpredictable, with potential shifts in team positions.
    • Al-Nassr’s Position: If Al-Nassr does not climb to the top position, its current standing is threatened. The team is described as affected by arrogance and poor luck.
    • Al-Ahli’s Performance: If Al-Ahli had been in good form from the start, they might have been a strong contender for the title.
    • Al-Hilal’s Performance: Al-Hilal is described as facing pressure with potential injuries, absences and exhaustion affecting the team.
    • Saudi Teams in the AFC Champions League:
    • Al-Nassr tied a game against Esteghlal of Iran in the AFC Champions League.
    • Al-Ahli is preparing to play Al Rayyan of Qatar in the AFC Champions League.
    • Al-Hilal is set to face Pakhtakor in the AFC Champions League.
    • AFC Champions League Details:
    • The final stages of the AFC Champions League Elite will be held in Jeddah, Saudi Arabia.
    • Matches are significant in the Saudi League.
    • Player Spotlight:
    • Sami Al-Khabrani, despite being a distinguished player, has not been selected for the national team, prompting questions about the selection criteria.
    • There is hope that Al-Khabrani will get an opportunity to prove himself.
    • Salem Al-Dawsari shined in the league stage, tying for the top scorer position.
    • Riyad Mahrez is recognized as a standout player in Al-Ahli.

    Ramadan Traditions, Preparations, and Health Initiatives

    Here’s what the sources say about Ramadan events:

    • General Atmosphere: Ramadan is characterized by a spiritual atmosphere.
    • Traditions: There are several traditions associated with welcoming Ramadan:
    • Decorating homes with lights and Ramadan ornaments in areas like Jeddah.
    • Presenting “الماء المبخر” (mubakhkhar water, water perfumed with incense) in the Jazan region, as a symbol of hospitality and generosity.
    • نقش الحناء (henna designs) are used to encourage young girls to fast in the northern region.
    • Efforts to help people observe Ramadan:
    • “حافلات المدينة” (Hafilat Al-Madinah) announced the development of 3 free subsidiary stations to facilitate access to the Prophet’s Mosque.
    • قطار الحرمين (Haramain Train) is raising its operating capacity to 1.6 million seats.
    • Health Initiatives: There are efforts to promote healthy habits during Ramadan. A campaign titled “صم بصحة” (Sum bi-Sihha, “Fast with Health”) aims to promote a healthy lifestyle through Ramadan. It includes awareness of healthy eating, hydration, physical activity, and consultation with doctors to control chronic diseases.

    Media’s Influence: Shaping Opinion, Policy, and Global Diplomacy

    The sources discuss media power in the context of diplomacy, public perception, and cultural influence:

    • Influence on Public Opinion: The media has become a powerful force: no longer just a means of conveying news, but a tool for directing and reshaping public opinion, influencing policies, and affecting countries.
    • Media as a Battleground: The presence of journalists can be like a battle, especially when public statements are used to create doubt about something.
    • Impact on Political Leaders: The media can affect the standing of a political leader, influence public opinion, and even save or hurt them. The coverage can influence domestic and foreign public opinion.
    • Agenda Setting: Governments and leaders use the media to promote their agendas.
    • American Media’s Influence: The American media is a political and economic force that extends its influence beyond the United States. America uses its media as a tool to send specific messages to countries, using news channels and newspapers to shape how the global audience views events.
    • Examples of Media Influence: The meeting between President Trump and Ukrainian President Zelensky revealed the media’s role in shaping political discourse. The media can turn an event into a political tool and raise questions about the importance and danger of media on the international stage.
    • Need for Media Awareness: Because of the power of media, there is a need to be aware of its influence. The modern media is a force that can build or destroy alliances and promote or undermine leaders.
    • Sports Media: Media related to sports receives great attention from followers and those interested in the sport.
    • Communication strategies: Effective communication strategies include conveying specific messages, promoting interaction with the public, and building trust and transparency.
    • Cultural Dissemination: The “Literary Partner” initiative uses cafes to spread culture and literature, raising cultural awareness. The initiative contributes to opening new channels of communication between authors and society through the cultural sector.

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • True Crime: British Killers – A Prequel by Jason Neal

    True Crime: British Killers – A Prequel by Jason Neal

    This excerpt from True Crime: British Killers – A Prequel: Six Disturbing Stories of Some of the UK’s Most Brutal Killers explores the lives and crimes of several notorious British murderers. The book presents detailed accounts of each killer’s background, motives, and methods, and follows the resulting investigations and trials. Among those profiled are Anthony Hardy, known as the Camden Ripper; Peter Bryan, the London Cannibal; John George Haigh, the Acid Bath Killer; Dena Thompson, the Black Widow; Levi Bellfield, the Bus Stop Killer; and Steven Wright, the Suffolk Strangler. The text also examines the impact of the crimes on the victims, their families, and society, including potential healthcare failures.

    True Crime: British Killers – A Prequel Study Guide

    Key Figures and Cases

    Chapter 1: The Camden Ripper (Anthony Hardy)

    • Anthony Hardy: The “Camden Ripper,” a British serial killer who murdered three women in Camden, London. He was obsessed with Jack the Ripper and struggled with mental illness and violent tendencies.
    • Sally Rose White: A developmentally challenged prostitute murdered by Anthony Hardy.
    • Elizabeth Valad & Brigette MacClennan: The other two women murdered by Anthony Hardy, whose body parts were found in garbage bags near Hardy’s flat.
    • Freddy Patel: The pathologist who initially determined that Sally Rose White died of natural causes.

    Chapter 2: The London Cannibal (Peter Bryan)

    • Peter Bryan: The “London Cannibal,” a man with a history of mental illness who killed three people and engaged in cannibalism.
    • Nisha Sheth: A shop assistant Bryan murdered after she rejected his advances.
    • Brian Cherry: A friend of Bryan’s whom he murdered and cannibalized after being transferred to a low-support accommodation.
    • Richard Loudwell: An inmate at Broadmoor who was strangled by Bryan.
    • Giles Forrester: During the trial, this judge stated, “You killed on these last two occasions because it gave you a thrill and a feeling of power when you ate flesh.”

    Chapter 3: The Acid Bath Killer (John George Haigh)

    • John George Haigh: Known as the “Acid Bath Killer,” he murdered multiple people and dissolved their bodies in sulfuric acid.
    • William McSwan: Haigh’s first victim, whom he killed for financial gain.
    • Amy & Donald McSwan: William’s parents, who were also murdered by Haigh.
    • Archibald and Rose Henderson: Another wealthy couple murdered by Haigh.
    • Olivia Durand-Deacon: A wealthy widow and Haigh’s final victim.
    • Dr. Keith Simpson: The forensic pathologist who found traces of Mrs. Durand-Deacon, leading to Haigh’s arrest.

    Chapter 4: The Black Widow (Dena Thompson)

    • Dena Thompson: A con artist and attempted murderer known as the “Black Widow” for her manipulative relationships and schemes.
    • Lee Wyatt: Dena’s first husband, whom she defrauded and falsely accused.
    • Julian Webb: Dena’s second husband, whom she murdered with an overdose of drugs.
    • Robert Waite: A lover of Dena Thompson who was drugged while on vacation with her.
    • Richard Thompson: Dena’s third husband, whom she attempted to murder with a baseball bat.
    • Stoyan Kostavj: A Bulgarian native who was in a relationship with Dena Thompson and was later reported missing.

    Chapter 5: The Bus Stop Killer (Levi Bellfield)

    • Levi Bellfield: The “Bus Stop Killer,” convicted of murdering Milly Dowler, Marsha McDonnell, and Amélie Delagrange, and attempting to murder Kate Sheedy.
    • Milly Dowler: A thirteen-year-old girl who was abducted and murdered by Bellfield.
    • Marsha McDonnell: A nineteen-year-old woman murdered by Bellfield.
    • Amélie Delagrange: A twenty-two-year-old French student murdered by Bellfield.
    • Kate Sheedy: A young woman who survived an attempted murder by Bellfield.
    • Johanna Collins: Bellfield’s ex-partner who provided crucial information to the police.
    • Yusuf Rahim: Levi Bellfield’s name after converting to Islam.

    Chapter 7: A Tragic December (Vincent Tabak)

    • Joanna Yeates: A landscape architect murdered by her neighbor, Vincent Tabak.
    • Greg Reardon: Joanna Yeates’ boyfriend.
    • Christopher Jeffries: Joanna Yeates’ landlord, who was initially vilified by the media.
    • Vincent Tabak: Joanna Yeates’ murderer, who initially claimed the death was accidental.

    Quiz

    1. What was Anthony Hardy’s initial job after graduating from Imperial College London, and how did his career progress before his life spiraled downwards?
    2. Describe the events that led to Anthony Hardy’s arrest for the murder of Sally Rose White, focusing on the key pieces of evidence and his initial explanations to the police.
    3. Explain the circumstances surrounding Peter Bryan’s first murder and why he was originally charged with manslaughter on the grounds of diminished responsibility.
    4. What was Vincent Tabak’s defense to the killing of Joanna Yeates?
    5. Describe John George Haigh’s method for disposing of bodies and his reasoning behind this approach.
    6. Explain how John George Haigh was ultimately caught, despite his efforts to destroy all evidence.
    7. Describe Dena Thompson’s elaborate scheme to convince her first husband, Lee Wyatt, that he needed to go into hiding.
    8. Explain how Julian Webb died.
    9. Describe the key evidence that linked Levi Bellfield to the murder of Milly Dowler.
    10. Why was Christopher Jeffries initially suspected of the death of Joanna Yeates, and what details were the media focusing on?

    Quiz Answer Key

    1. Anthony Hardy landed a high-paying job with British Sugar and quickly moved up the corporate ranks. However, a severe economic downturn in the mid-1970s led to him losing his job and suffering from depression, ultimately leading to deviant behavior.
    2. After a dispute with his upstairs neighbor, Hardy vandalized her door with graffiti and battery acid, leaving a trail of footprints that led back to his flat. When police searched his apartment, they found the naked body of Sally Rose White in a locked bedroom; Hardy claimed it was his roommate’s room, but police found the key in his coat pocket.
    3. Peter Bryan murdered Nisha Sheth after she rejected his advances and he was fired from his job due to theft. He struck her repeatedly with a claw hammer. He pleaded guilty to manslaughter on the grounds of diminished responsibility and was sentenced to a psychiatric unit.
    4. Vincent Tabak claimed that he was waving back at Joanna Yeates when she came to her kitchen window. He said that he went inside to chat with her, and when he tried to kiss her, she screamed, so he put his hands around her throat. He said it was not premeditated.
    5. John George Haigh dissolved his victims’ bodies in sulfuric acid, believing that if there was no body, there could be no murder conviction. He gained access to sulfuric acid while working in the tinsmith factory in Lincoln Prison.
    6. Despite Haigh’s efforts to destroy all evidence, police found traces of Mrs. Durand-Deacon in the sludge in the yard, including gallstones and false teeth. Additionally, bloodstains were found inside the workshop, leading to his arrest.
    7. Dena Thompson concocted an elaborate story involving a deal with Disney that supposedly went wrong and involved the mafia. She convinced Lee that the mafia was after him and would eliminate him, so he needed to go into hiding to protect himself.
    8. Julian Webb died from an overdose of dothiepin, an antidepressant, and aspirin in his curry, which Dena Thompson had spiked with a massive dose of the drugs. The coroner recorded an “Open Verdict” because there was insufficient evidence that Julian had died by suicide.
    9. The only evidence police had in the murder was CCTV footage of a red Daewoo Nexia pulling out of Collingwood Place just ten minutes after Milly was last seen. When police realized that Bellfield’s partner owned the Daewoo Nexia, it became clear that he was responsible for that murder as well.
    10. Christopher Jeffries was vilified by the tabloid press because he was the landlord of the building. Since there was no sign of forced entry, investigators believed that Joanna had been murdered by someone she knew or someone that had access to the flat; Jeffries had access to the flat.

    Essay Questions

    1. Discuss the role of mental illness in the cases of Anthony Hardy and Peter Bryan. To what extent did their mental states contribute to their crimes, and how did the legal system address this factor?
    2. Compare and contrast the methods used by John George Haigh and Anthony Hardy to attempt to evade detection. What made Haigh’s plan ultimately fail, and what similarities can be drawn between the two cases?
    3. Analyze the character of Dena Thompson. What were her primary motivations, and how did she exploit the vulnerabilities of others to achieve her goals?
    4. Examine the police investigation of Levi Bellfield. How did they eventually link him to the murders of Milly Dowler, Marsha McDonnell, and Amélie Delagrange, and what role did CCTV footage play in the investigation?
    5. Critically evaluate the media coverage of the Joanna Yeates case, focusing on the initial portrayal of Christopher Jeffries. How did the media contribute to public perception, and what were the consequences of their reporting?

    Glossary of Key Terms

    • Serial Killer: An individual who murders three or more people over a period of more than 30 days, with a “cooling off” period between each murder, and whose motives are often psychological.
    • Postmortem Examination (Autopsy): A surgical procedure consisting of a thorough examination of a corpse to determine the cause and manner of death and to evaluate any disease or injury that may be present.
    • CCTV: Closed-circuit television, a television system in which signals are not publicly distributed but are monitored, primarily for surveillance and security purposes.
    • Forensic Science: The application of scientific methods and techniques to matters of law and criminal justice.
    • Sulfuric Acid: A highly corrosive strong mineral acid with the molecular formula H2SO4; John George Haigh used this to dissolve the bodies of his victims.
    • Diminished Responsibility: A legal defense that argues a defendant’s mental capacity was impaired, reducing the severity of the charge.
    • Red-Light District: A specific area in a city where prostitution and other sexual activities are concentrated.
    • Luminol: A chemical that exhibits chemiluminescence, with a striking blue glow, when mixed with an oxidizing agent. It is used by forensic investigators to detect traces of blood, even if it has been cleaned or removed.
    • Curfew: A regulation requiring people to remain indoors between specified hours, typically at night.
    • Parole: The release of a prisoner temporarily (for a special purpose) or permanently before the completion of their sentence, on the promise of good behavior.

    True Crime: British Killers – A Prequel: Six Disturbing Stories

    Briefing Document: “True Crime: British Killers – A Prequel”

    Overall Theme: The book appears to be a collection of true crime stories focusing on various British serial killers and other criminals, exploring their backgrounds, crimes, and the investigations that led to their capture or conviction. It also touches upon the failures and shortcomings of healthcare and justice systems.

    Chapter 1: The Camden Ripper (Anthony Hardy)

    • Background: Tony Hardy, born in 1952, grew up in a lower-middle-class family. He was driven by a desire for greatness and saw himself as intellectually superior. He attended Imperial College London and eventually became a mechanical engineer.
    • Obsession and Decline: He developed an obsession with Jack the Ripper, admiring his ability to evade police. His marriage deteriorated due to his extreme sexual desires, and losing his job in a severe economic downturn left him depressed and prone to violent outbursts. He was diagnosed as bipolar.
    • Criminal Behavior: He attempted to murder his wife but was only charged with domestic violence and spent time in a mental hospital. After release, he stalked his ex-wife and hired prostitutes, eventually killing one (Sally Rose White). He was also found guilty of the murders of Elizabeth Valad and Brigitte MacClennan.
    • Key Points: Hardy believed he was too intelligent to be caught, mirroring his fascination with Jack the Ripper. Despite his mental illness, he was deemed fit for release from a mental hospital, only to commit murder shortly after.
    • Quote: A friend recounted, “Anthony was obsessed with serial killers and we talked about them on several occasions. We had long discussions about Jack the Ripper, and Anthony thought he had a brilliant mind. He reckoned Jack the Ripper was a very clever bloke because he murdered all those prostitutes and never got caught.”
    • Forensic Issues: Despite the bizarre staging of Sally Rose White’s body, the initial postmortem examination ruled that she died of natural causes. This highlights potential issues with the initial investigation.
    • Outcome: Hardy received three life sentences and was given a whole life tariff in 2012, meaning he will never be released from prison.

    Chapter 2: The London Cannibal (Peter Bryan)

    • Background: Peter Bryan had a troubled upbringing.
    • Crimes: He committed manslaughter and was sent to a psychiatric unit. Eventually, he was moved to a low-security facility and allowed to leave the building unsupervised. He murdered Brian Cherry, dismembering his body and reportedly eating parts of it. He also strangled Richard Loudwell at Broadmoor.
    • Key Points: Bryan’s case exemplifies failures in the mental healthcare system. Despite a history of violence and mental health issues, he was repeatedly moved to less secure facilities and given unsupervised access to the community.
    • Quote: Bryan said, “I ate his brain with butter. It was really nice.” This shows a lack of remorse and demonstrates his disturbing actions.
    • Failures in the System: Reports from the National Health Services point to extreme failures in the healthcare system at every level.
    • Outcome: Bryan was sentenced to two life terms and is unlikely to ever be released.

    Chapter 3: The Acid Bath Killer (John George Haigh)

    • Background: John George Haigh had a strict upbringing and was drawn to crime early on.
    • Crimes: He murdered William McSwan and then William’s parents, Amy and Donald McSwan, disposing of their bodies by fully dissolving them in sulfuric acid. He then murdered Archibald and Rose Henderson and Olivia Durand-Deacon, again attempting to dissolve their bodies in acid.
    • Key Points: Haigh believed that if there was no body, there could be no murder conviction.
    • Quote: Haigh said, “Mrs. Durand-Deacon no longer exists. I have destroyed her with acid. You will find the sludge which remains at Leopold Road. Every trace of her body has gone. How can you prove a murder if there is no body?”
    • Forensic Triumph: Haigh was mistaken, and the police were able to convict him using traces of the victims found in the sludge that remained.
    • Outcome: Haigh was found guilty of the murder of Mrs. Durand-Deacon and was hanged at Wandsworth prison.

    Chapter 4: The Black Widow (Dena Thompson)

    • Deception: Dena Thompson manipulated and deceived multiple men for financial gain. She defrauded her first husband Lee Wyatt, and she poisoned her second husband, Julian Webb.
    • Crimes: She was found guilty and sentenced to life in prison with a minimum sentence of sixteen years for the murder of her second husband. She attempted to murder her third husband but was acquitted of the attempted murder charges.
    • Parole: After Dena Thompson’s conviction, investigators teamed with Interpol to look into all of her past lovers. She was later granted parole and released from prison.
    • Quote: Her third husband said upon news of her parole, “She definitely tried to kill me, and they proved that she murdered her second husband. She would have been a serial killer if she had been successful. God knows what else she has done.”

    Chapter 5: The Bus Stop Killer (Levi Bellfield)

    • Crimes: Levi Bellfield was convicted of the murders of Amélie Delagrange, Marsha McDonnell, and the attempted murder of Kate Sheedy. He was later found guilty of the murder of Milly Dowler.
    • Vehicle Link: A key piece of evidence was grainy CCTV footage of a red Daewoo Nexia pulling out of Collingwood Place just ten minutes after Milly Dowler was last seen. The car was owned by Bellfield’s girlfriend.
    • Motive and Patterns: Bellfield had an extreme hatred for young blonde women.

    Chapter 6: The Suffolk Strangler (Steven Wright)

    • Victims: Within a matter of six weeks, five young women were murdered: Gemma Adams, Tania Nicol, Anneli Alderton, Paula Clennell, and Annette Nicholls.
    • CCTV and Forensic Evidence: The key to the case was the large amount of CCTV footage that showed Wright in the area of the crimes and the forensic evidence that linked Wright to the victims.
    • Quote: During the trial, the prosecutor asked Wright about the coincidences, to which Wright replied “It would seem so, yes.”
    • Outcome: Wright was sentenced to life imprisonment with a recommendation of no parole.

    Chapter 7: A Tragic December (Vincent Tabak)

    • Victim: Joanna Yeates was murdered in December.
    • Circumstantial Evidence: Vincent Tabak, the neighbor, was eventually arrested after Joanna’s body was found. Despite Tabak’s attempt to give himself an alibi, detectives discovered that he had viewed the precise location on Longwood Lane where Joanna’s body was found in Google Street View just days before it was discovered there. Blood found in the trunk of his car matched Joanna’s, and the DNA found on Joanna’s body matched his own.
    • Confession and Conviction: Tabak confessed to a prison chaplain that he had killed Joanna. Vincent Tabak was given a life sentence with a minimum term of twenty years in prison.

    True Crime Case Studies

    • What was Tony Hardy’s early life and background?

    Tony Hardy was born in 1952 into a lower-middle-class family in Staffordshire, England. His father worked in the gypsum mines, and Tony was expected to follow in his footsteps. However, from a young age, Tony felt destined for greatness and desired a life beyond that of a laborer.

    • How did Tony Hardy’s obsession with Jack the Ripper manifest itself?

    While attending Imperial College, Tony developed a fascination with Jack the Ripper, reading every book he could find about the notorious killer. He admired the Ripper’s ability to evade police and considered him highly intelligent. He discussed his obsession with Jack the Ripper often with his friends and family, and spoke of him as a “brilliant bloke”. After attempting to murder his wife in Tasmania and subsequently being deported back to the United Kingdom, he told his friends that it had all been an act to avoid jail time. He believed he could outwit everyone, just like Jack the Ripper.

    • What were the circumstances surrounding the murder of Sally Rose White and how was Tony Hardy involved?

    Sally Rose White, who was developmentally challenged and worked as a prostitute, was found dead in Tony Hardy’s apartment. The scene was staged with disturbing elements like a rubber Satan mask, crucifixes, and photo equipment. Initially, a pathologist determined she died of natural causes, but investigators were suspicious due to the staged scene and blood evidence. After further investigation, Tony was arrested for the murder.

    • What was John George Haigh’s method for disposing of his victims, and why did he believe it would lead to acquittal?

    Haigh used sulfuric acid to dissolve the bodies of his victims. He believed that if there was no body, there could be no murder conviction, operating under a misunderstanding of the Latin legal term “corpus delicti.”

    • How did Dena Thompson manage to deceive her husbands and lovers, and what were her motives?

    Dena Thompson was a master manipulator who wove elaborate lies to deceive her husbands and lovers. Her motives were primarily financial, as she sought to enrich herself through insurance money, pension funds, and property. She created false narratives involving the mafia, forged documents, and even convinced one husband to go into hiding, all to maintain her deceit.

    • What were some of the key pieces of evidence that linked Levi Bellfield to his crimes?

    Key evidence included security camera footage placing his vehicles near the scenes of the crimes, his ex-partner’s testimony about his hatred of blonde women and his ownership of a white Ford cargo van, and DNA evidence linking him to the victims. Fiber analysis also connected carpet fibers from his van to the hair of one of the victims.

    • What role did CCTV play in the investigation into Levi Bellfield?

    CCTV was a critical component of the investigation into Levi Bellfield. Police used it to track Bellfield’s movements and identify vehicles of interest.

    • How was Joanna Yeates’s body discovered, and what was the cause of death?

    Joanna Yeates’s body was found on Christmas Day by a couple walking their dog. Her body was discovered in a snow-covered mound, and the cause of death was determined to be manual strangulation. She had been missing for eight days and was found with forty-three cuts and bruises on her body.

    UK Serial Killer Cases

    The source discusses several serial killer cases in the United Kingdom:

    • Anthony Hardy, also known as the Camden Ripper, was responsible for multiple murders of prostitutes in the Camden area of London. He had an obsession with Jack the Ripper and a history of mental illness and violent behavior. In 2012, Hardy received a whole life tariff, meaning he will never be released from prison.
    • Peter Bryan, known as the London Cannibal, was convicted of manslaughter for killing a girl with a hammer. Bryan was transferred to a low-security facility and later killed his friend. He was sentenced to two life terms and is unlikely to ever be released.
    • John George Haigh, also known as the Acid Bath Killer, murdered multiple victims and disposed of their bodies using sulfuric acid. He was found guilty and hanged in 1949.
    • Dena Thompson, known as the Black Widow, was convicted of deception and the murder of her second husband. On May 23, 2022 Dena Thompson was granted parole and subsequently released from prison.
    • Levi Bellfield, known as the Bus Stop Killer, was found guilty of the murders of Amélie Delagrange, Marsha McDonnell, and Milly Dowler. He was sentenced to a whole-life tariff.
    • Steven Wright, known as the Suffolk Strangler, was convicted of murdering five prostitutes in Ipswich. Wright was sentenced to life imprisonment with no parole.
    • Vincent Tabak was found guilty of the murder of Joanna Yeates and was given a life sentence with a minimum of twenty years in prison.

    British True Crime Cases: Notorious Killers

    The source provides details of several true crime cases involving British Killers.

    • Anthony Hardy: Also known as the Camden Ripper, Hardy murdered prostitutes in London. He was obsessed with Jack the Ripper and had mental health issues. He received a life sentence in 2012.
    • Peter Bryan: Known as the London Cannibal, Bryan was convicted of manslaughter for killing a girl with a hammer. While in a low-security facility, he killed his friend. Bryan received two life sentences.
    • John George Haigh: Also known as the Acid Bath Killer, Haigh murdered victims and disposed of their bodies with sulfuric acid. He was found guilty and hanged in 1949.
    • Dena Thompson: Known as the Black Widow, Thompson was convicted of deception and murdering her second husband. She was granted parole on May 23, 2022.
    • Levi Bellfield: Known as the Bus Stop Killer, Bellfield was found guilty of murdering Amélie Delagrange, Marsha McDonnell, and Milly Dowler. He received a whole-life tariff.
    • Steven Wright: Known as the Suffolk Strangler, Wright was convicted of murdering five prostitutes in Ipswich and received a life sentence with no parole.
    • Vincent Tabak: Tabak was found guilty of murdering Joanna Yeates and received a life sentence with a minimum of twenty years.

    British Serial Killer Investigations: Case Details

    The source provides details about the criminal investigations into several British serial killer cases:

    • Anthony Hardy: In December 2002, the police followed a trail of battery acid to Hardy’s door after he vandalized a neighbor’s property. Upon entering his apartment, they found the naked body of Sally Rose White, along with evidence suggesting a simulated rape. Later, investigators found dismembered body parts in garbage bags that Hardy had deposited using a loyalty card from a local Sainsbury’s grocery store.
    • John George Haigh: Police became suspicious of Haigh after Mrs. Lane reported Mrs. Durand-Deacon missing. They discovered Haigh had a history of fraud and forgery. A search of his workshop in Crawley revealed tools, chemicals, a gas mask, and a rubber apron with stains. Although Haigh claimed he had destroyed Mrs. Durand-Deacon with acid, police found traces of her remains, including bloodstains, gallstones, and false teeth.
    • Levi Bellfield: Police examined security camera footage and identified a silver Vauxhall Corsa stalking Marsha McDonnell. After another attack, police realized they were looking for a serial killer. They found a white Ford cargo van that had driven the route at the time of another murder. Bellfield’s ex-partner identified him as the owner of the van. Police put Bellfield under surveillance and then arrested him.
    • Steven Wright: Police discovered that Wright had a prior offense on his record and that his DNA was in the national DNA database. Detectives examined over 10,000 hours of security camera footage to map Wright’s movements. They found footage of Wright’s car in the areas where the victims disappeared. Forensic scientists found DNA from the victims in Wright’s car and home.
    • Vincent Tabak: Security cameras showed Tabak driving to a supermarket, going inside, leaving without buying anything, and then returning to buy items. Tabak had searched Google Street View for the location where Joanna Yeates’ body was discovered. Blood was found in the trunk of Tabak’s car, and his DNA matched the DNA on Joanna’s body.

    Forensic Investigations: Hardy, Haigh, Bellfield, Wright, and Tabak

    The source details how police forensics played a role in the investigations of several cases:

    • Anthony Hardy: Police used Luminol to find traces of blood in Hardy’s apartment, even after attempts to clean it. The police were able to connect Hardy to dismembered body parts found in garbage bags by using security camera footage that showed him depositing the bags using his Sainsbury’s loyalty card.
    • John George Haigh: Although Haigh tried to dissolve the bodies of his victims using acid, forensic evidence was used to convict him of murder. Despite Haigh’s belief that a murder was unprovable without a body, forensic pathologist Dr. Keith Simpson found traces of Mrs. Durand-Deacon in the sludge remaining at Leopold Road. Police found bloodstains inside the workshop. Additionally, police found gallstones and Mrs. Durand-Deacon’s false teeth that were not dissolved by the acid.
    • Levi Bellfield: Pollen experts analyzed foliage from Milly Dowler’s remains.
    • Steven Wright: Forensic scientists were able to duplicate DNA samples and develop a full DNA profile. Microscopic comparison of a nylon fiber found in Tania Nicol’s hair matched the carpet in Wright’s car. Forensic tests on work gloves found in Wright’s home revealed DNA from three of the girls. Fabric from Wright’s home was found on four of the five bodies.
    • Vincent Tabak: Tabak had searched Google Street View for the precise location on Longwood Lane where Joanna’s body was found. Blood found in the trunk of his car matched Joanna’s DNA.

    British Killers and Their Crimes

    The source and conversation history provide details on several British killers and their crimes:

    • Anthony Hardy, known as the Camden Ripper, murdered prostitutes in London and was obsessed with Jack the Ripper. In December 2002, police found the naked body of Sally Rose White in his apartment and later discovered dismembered body parts in garbage bags linked to Hardy via his Sainsbury’s loyalty card. Despite a pathologist’s initial assessment of natural causes, investigators found the scene suspicious. Hardy received a life sentence in 2012.
    • Peter Bryan, the London Cannibal, was convicted of manslaughter for killing a girl with a hammer. He killed his friend while in a low-security facility. Bryan received two life sentences.
    • John George Haigh, the Acid Bath Killer, murdered victims and disposed of their bodies using sulfuric acid. Despite his attempts to destroy the evidence, traces of his victim Mrs. Durand-Deacon were found in the sludge at his workshop, including bloodstains, gallstones, and false teeth. Haigh was found guilty and hanged in 1949.
    • Dena Thompson, the Black Widow, was convicted of deception and the murder of her second husband. She was granted parole on May 23, 2022.
    • Levi Bellfield, the Bus Stop Killer, was found guilty of the murders of Amélie Delagrange, Marsha McDonnell, and Milly Dowler. Security camera footage showed a silver Vauxhall Corsa stalking Marsha McDonnell, and later, a white Ford cargo van was identified as being at the scene of another murder. Bellfield’s ex-partner identified him as the van’s owner, leading to his arrest. He received a whole-life tariff.
    • Steven Wright, the Suffolk Strangler, was convicted of murdering five prostitutes in Ipswich. His DNA was in the national DNA database due to a prior offense. Police used security camera footage to map his movements and found victim DNA in his car and home. Wright received a life sentence with no parole.
    • Vincent Tabak was found guilty of the murder of Joanna Yeates and received a life sentence with a minimum of twenty years. He searched Google Street View for the location where her body was discovered. Blood matching Joanna’s DNA was found in his car.

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Modern SQL Data Warehouse Project: A Comprehensive Guide

    Modern SQL Data Warehouse Project: A Comprehensive Guide

    This source details the creation of a modern data warehouse project using SQL. It presents a practical guide to designing data architecture, writing code for data transformation and loading, and creating data models. The project emphasizes real-world implementation, focusing on organizing and preparing data for analysis. The resource covers the ETL process, data quality, and documentation while building bronze, silver, and gold layers. It provides a comprehensive approach to data warehousing, from understanding requirements to creating a professional portfolio project.

    Modern SQL Data Warehouse Project Study Guide

    Quiz:

    1. What is the primary purpose of data warehousing projects? Data warehousing projects focus on organizing, structuring, and preparing data for data analysis, forming the foundation for any data analytics initiatives.
    2. Briefly explain the ETL/ELT process in SQL data warehousing. ETL/ELT in SQL involves extracting data from various sources, transforming it to fit the data warehouse schema (cleaning, standardizing), and loading it into the data warehouse for analysis and reporting.
    3. According to Bill Inmon’s definition, what are the four key characteristics of a data warehouse? According to Bill Inmon’s definition, the four key characteristics of a data warehouse are subject-oriented, integrated, time-variant, and non-volatile.
    4. Why is creating a project plan crucial for data warehouse projects, according to the source? Creating a project plan is crucial for data warehouse projects because they are complex, and a clear plan improves the chances of success by providing organization and direction, reducing the risk of failure.
    5. What is the “separation of concerns” principle in data architecture, and why is it important? The “separation of concerns” principle involves breaking down a complex system into smaller, independent parts, each responsible for a specific task, to avoid mixing everything and to maintain a clear and efficient architecture.
    6. Explain the purpose of the bronze, silver, and gold layers in a data warehouse architecture. The bronze layer stores raw, unprocessed data directly from the source systems, the silver layer contains cleaned and standardized data, and the gold layer holds business-ready data transformed and aggregated for reporting and analysis.
    7. What are metadata columns, and why are they useful in a data warehouse? Metadata columns are additional columns added to tables by data engineers to provide extra information about each record, such as create date or source system, aiding in data tracking and troubleshooting.
    8. What is a surrogate key, and why is it used in data modeling? A surrogate key is a system-generated unique identifier assigned to each record to make the record unique. It provides more control over the data model without dependence on source system keys.
    9. Describe the star schema data model, including the roles of fact and dimension tables. The star schema is a data modeling approach with a central fact table surrounded by dimension tables. Fact tables contain events or transactions, while dimension tables hold descriptive attributes, related via foreign keys.
    10. Explain the importance of clear documentation for end users of a data warehouse, as highlighted in the source.

    Clear documentation is essential for end users to understand the data model and use the data warehouse effectively.

    Quiz Answer Key:

    1. Data warehousing projects focus on organizing, structuring, and preparing data for data analysis, forming the foundation for any data analytics initiatives.
    2. ETL/ELT in SQL involves extracting data from various sources, transforming it to fit the data warehouse schema (cleaning, standardizing), and loading it into the data warehouse for analysis and reporting.
    3. According to Bill Inmon’s definition, the four key characteristics of a data warehouse are subject-oriented, integrated, time-variant, and non-volatile.
    4. Creating a project plan is crucial for data warehouse projects because they are complex, and a clear plan improves the chances of success by providing organization and direction, reducing the risk of failure.
    5. The “separation of concerns” principle involves breaking down a complex system into smaller, independent parts, each responsible for a specific task, to avoid mixing everything and to maintain a clear and efficient architecture.
    6. The bronze layer stores raw, unprocessed data directly from the source systems, the silver layer contains cleaned and standardized data, and the gold layer holds business-ready data transformed and aggregated for reporting and analysis.
    7. Metadata columns are additional columns added to tables by data engineers to provide extra information about each record, such as create date or source system, aiding in data tracking and troubleshooting.
    8. A surrogate key is a system-generated unique identifier assigned to each record to make the record unique. It provides more control over the data model without dependence on source system keys.
    9. The star schema is a data modeling approach with a central fact table surrounded by dimension tables. Fact tables contain events or transactions, while dimension tables hold descriptive attributes, related via foreign keys.
    10. Clear documentation is essential for end users to understand the data model and use the data warehouse effectively.

    Essay Questions:

    1. Discuss the importance of data quality in a modern SQL data warehouse project. Explain the role of the bronze and silver layers in ensuring high data quality, and provide examples of data transformations that might be performed in the silver layer.
    2. Describe the Medallion architecture and how it’s implemented using bronze, silver, and gold layers. Discuss the advantages of this architecture, including separation of concerns and data quality management, and explain how data flows through each layer.
    3. Explain the process of creating a detailed project plan for a data warehouse project using a tool like Notion. Describe the key phases and stages involved, the importance of defining epics and tasks, and how this plan contributes to project success.
    4. Explain the importance of source system analysis in a data warehouse project, and describe the key questions that should be asked when connecting to a new source system.
    5. Compare and contrast the star schema with other data modeling approaches, such as snowflake and data vault. Discuss the advantages and disadvantages of the star schema for reporting and analytics, and explain the roles of fact and dimension tables in this model.

    Glossary of Key Terms:

    • Data Warehouse: A subject-oriented, integrated, time-variant, and non-volatile collection of data designed to support management’s decision-making process.
    • ETL (Extract, Transform, Load): A process in data warehousing where data is extracted from various sources, transformed into a suitable format, and loaded into the data warehouse.
    • ELT (Extract, Load, Transform): A process similar to ETL, but the transformation step occurs after the data has been loaded into the data warehouse.
    • Data Architecture: The overall structure and design of data systems, including databases, data warehouses, and data lakes.
    • Data Integration: The process of combining data from different sources into a unified view.
    • Data Modeling: The process of creating a visual representation of data structures and relationships.
    • Bronze Layer: The first layer in a data warehouse architecture, containing raw, unprocessed data from source systems.
    • Silver Layer: The second layer in a data warehouse architecture, containing cleaned and standardized data ready for transformation.
    • Gold Layer: The third layer in a data warehouse architecture, containing business-ready data transformed and aggregated for reporting and analysis.
    • Subject-Oriented: Focused on a specific business area, such as sales, customers, or finance.
    • Integrated: Combines data from multiple source systems into a unified view.
    • Time-Variant: Keeps historical data for analysis over time.
    • Non-Volatile: Data is not deleted or modified once it enters the data warehouse.
    • Project Epic: A large task or stage in a project that requires significant effort to complete.
    • Separation of Concerns: A design principle that breaks down complex systems into smaller, independent parts, each responsible for a specific task.
    • Data Cleansing: The process of correcting or removing inaccurate, incomplete, or irrelevant data.
    • Data Standardization: The process of converting data into a consistent format or standard.
    • Metadata Columns: Additional columns added to tables to provide extra information about each record, such as creation date or source system.
    • Surrogate Key: A system-generated unique identifier assigned to each record, used to connect data models and avoid dependence on source system keys.
    • Star Schema: A data modeling approach with a central fact table surrounded by dimension tables.
    • Fact Table: A table in a data warehouse that contains events or transactions, along with foreign keys to dimension tables.
    • Dimension Table: A table in a data warehouse that contains descriptive attributes or categories related to the data in fact tables.
    • Data Lineage: Tracking the origin and movement of data from its source to its final destination.
    • Stored Procedure: A precompiled collection of SQL statements stored under a name and executed as a single unit.
    • Data Normalization: The process of organizing data to reduce redundancy and improve data integrity.
    • Data Lookup: Joining tables to retrieve specific data, such as surrogate keys, from related dimensions.
    • Data Flow Diagram: A visual representation of how data moves through a system.

    Modern SQL Data Warehouse Project Guide

    Okay, here’s a detailed briefing document summarizing the main themes and ideas from the provided text excerpts.

    Briefing Document: Modern SQL Data Warehouse Project

    Overview:

    This document summarizes the key concepts and practical steps outlined in a guide for building a modern SQL data warehouse. The guide, presented by Bar Zini, aims to equip data architects, data engineers, and data modelers with real-world skills by walking them through the creation of a data warehouse project using SQL Server (though adaptable to other SQL databases). The project emphasizes best practices and provides a professional portfolio piece upon completion.

    Main Themes and Key Ideas:

    1. Data Warehousing Fundamentals:
    • Definition: The project begins by defining a data warehouse using Bill Inmon’s classic definition: “A data warehouse is a subject-oriented, integrated, time-variant, and nonvolatile collection of data designed to support management’s decision-making process.”
    • Subject Oriented: Focused on business areas (e.g., sales, customers, finance).
    • Integrated: Combines data from multiple source systems.
    • Time Variant: Stores historical data.
    • Nonvolatile: Data is not deleted or modified once entered.
    • Purpose: To address the inefficiencies of data analysts extracting and transforming data directly from operational systems, replacing it with an organized and structured data system as a foundation for data analytics projects.
    • SQL Data Warehousing in Relation to Other Types of Data Analytics Projects: The guide notes that SQL data warehousing is the foundation of any data analytics project, and the first step before exploratory data analysis (EDA) and advanced analytics projects can be done.
    2. Project Structure and Skills Developed:
    • Roles: The project is designed to provide experience in three key roles: data architect, data engineer, and data modeler.
    • Skills: Participants will learn:
    • ETL/ELT processing using SQL.
    • Data architecture design.
    • Data integration (merging multiple sources).
    • Data loading and data modeling.
    • Portfolio Building: The guide emphasizes the project’s value as a portfolio piece for demonstrating skills on platforms like LinkedIn.
    3. Project Setup and Planning (Using Notion):
    • Importance of Planning: The guide stresses that “creating a project plan is the key to success.” This is particularly important for data warehouse projects, where a high failure rate (over 50%, according to Gartner reports) is attributed to complexity.
    • Iterative Planning: The planning process is described as iterative. An initial “rough project plan” is created, which is then refined as understanding of the data architecture evolves.
    • Project Epics (Main Phases): The initial project phases identified are:
    • Requirements analysis.
    • Designing the data architecture.
    • Project initialization.
    • Task Breakdown: The project uses Notion (a free tool) to organize the project into epics and subtasks, enabling a structured approach.
    • The guide also mentions the importance of icons for adding a personal style to the project and keeping it organized.
    • Project success: one important element is keeping the whole picture of the project visible by closing small chunks of work and tasks, which gives a sense of motivation and accomplishment.
    4. Data Architecture Design (Using Draw.io):
    • Medallion Architecture: The guide advocates for a “Medallion architecture” (Bronze, Silver, Gold layers) within the data warehouse.
    • Separation of Concerns: A core architectural principle is “separation of concerns.” This means breaking down the complex system into independent parts, each responsible for a specific task, with no duplication of components. “A good data architect follows this principle.”
    • Layer Responsibilities:
    • Bronze Layer (Raw Data): Contains raw data, with no transformations. “In the bronze layer it’s going to be the raw data.”
    • Silver Layer (Cleaned and Standardized Data): Focuses on data cleansing and standardization. “In the silver you have clean, standard data.”
    • Gold Layer (Business-Ready Data): Contains business-transformed data ready for analysis. “For the gold we can say business-ready data.”
    • Data Flow Diagram: The project utilizes Draw.io (a free diagramming tool) to visualize the data architecture and data lineage.
    • Naming Conventions: A naming convention is created to ensure clarity and consistency, creating specific naming rules for tables and columns. Examples include fact_sales for a fact table and dim_customers for a dimension. It is recommended to create clear documentation about each rule and to add examples so that there is a general consensus about how to proceed.
    5. Project Initialization and Tools:
    • Software: The project uses SQL Server Express (database server) and SQL Server Management Studio (client for interacting with the database). Other tools include GitHub and Draw.io. Notion is used for project management.
    • Initial Database Setup: The guide outlines the creation of a new database and schemas (Bronze, Silver, Gold) within SQL Server.
    • Git Repository: The project emphasizes the importance of using Git for version control and collaboration. A repository structure is established with folders for data sets, documents, scripts, and tests.
    • README: It is important to create a README file at the root of the repo that specifies the main characteristics and goal of the project, so that other developers have a better understanding of it when collaborating.
    6. Building the Bronze Layer
    • The bronze layer is built by first analyzing the data to be loaded. The goal of this first step is to interview source-system experts and identify where the data comes from, the volume of data to be processed, the load the extraction places on the source system (so its performance is not affected), and the authentication/authorization required, such as access tokens, keys, and passwords.
    • The project then takes a step-by-step approach, from creating all the required queries and stored procedures to loading the data efficiently. This step includes tests that the tables contain no unexpected nulls and that the separator used matches the data.
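
    The load-and-test pattern described above can be sketched in a few lines. The following Python sketch uses SQLite in place of SQL Server, with made-up table and column names: it loads a CSV extract into a bronze table with truncate-and-insert, then runs two of the basic checks mentioned (column count against the expected separator, and nulls in the key column).

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in the project this would be a CSV file from a source system.
RAW_CSV = """cst_id,cst_firstname,cst_lastname
1001,Alice,Smith
1002,Bob,
1003,Carol,Jones
"""

def load_bronze(conn, csv_text):
    """Full load into a bronze table: truncate, then insert rows exactly as received."""
    conn.execute("CREATE TABLE IF NOT EXISTS bronze_crm_cust_info "
                 "(cst_id TEXT, cst_firstname TEXT, cst_lastname TEXT)")
    conn.execute("DELETE FROM bronze_crm_cust_info")  # truncate-and-insert pattern
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    # Separator check: a wrong delimiter would collapse everything into one column.
    assert len(header) == 3, "separator mismatch: unexpected column count"
    rows = [tuple(r) for r in reader]
    conn.executemany("INSERT INTO bronze_crm_cust_info VALUES (?, ?, ?)", rows)
    return len(rows)

conn = sqlite3.connect(":memory:")
loaded = load_bronze(conn, RAW_CSV)

# Quality check after the load: no empty/null values in the key column.
null_keys = conn.execute(
    "SELECT COUNT(*) FROM bronze_crm_cust_info WHERE cst_id IS NULL OR cst_id = ''"
).fetchone()[0]
print(loaded, null_keys)
```

    In the actual project these checks would be SQL queries run inside a `load_bronze` stored procedure rather than Python assertions.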
    7. Building the Silver Layer
    • The goal of the silver layer is clean and standardized data, held in tables built inside the silver layer. The data is loaded from the bronze layer using a full load (truncating, then inserting), after which extensive data transformations are applied.
    • In the silver layer, we implement metadata columns, which store information that does not come directly from the source system: for example, create and update dates, the source system, and the file location the data came from. These columns help track down corrupted data and reveal gaps in the imported data.
    8. Building the Gold Layer

    The gold layer is very focused on business goals and should be easy to consume for business reports; that is why we create a data model for our business area. A data model contains two types of tables: fact tables and dimension tables. Dimension tables are descriptive and give context to the data; a product dimension, for example, holds the product name, category, and subcategory. Fact tables are events, such as transactions, that contain IDs from dimensions. The question that decides whether a table should be a fact or a dimension is:

    • How much and how many: fact table
    • Who, what, and where: dimension table

    9. General Data Cleaning
    • In the project we build data transformations and cleansing by writing insert statements with functions that transform and clean the data. This includes checks on primary keys, handling unwanted spaces, identifying inconsistencies in cardinality (the number of distinct values in a column), replacing null values, and fixing the dates and values of the sales orders.
    • During the data cleaning process, one tool to check the quality of our data is quality checks: we select the data that is incorrect and then apply a quick fix. Any numerical column is best validated against negative numbers and null values, and its data type checked so that it can be converted into the right format.
    • In the silver layer, some techniques have to be applied to data that is too old; such records are removed or flagged. For birthdays, dates lying in the future are filtered out.
    • To find errors in SQL, it is possible to wrap code blocks in TRY...CATCH and then print error messages, numbers, and states, making errors easier to find and handle.
    • A lot of data has missing values; the code includes techniques to fill missing values and then to normalize the data.
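
    A few of these quality checks, written as small standalone Python functions; the column semantics are invented for illustration, and in the project these would be SQL queries rather than Python.

```python
from datetime import date

def check_numeric(values):
    """Return indices of values that are null or negative — candidates for a quick fix."""
    return [i for i, v in enumerate(values) if v is None or v < 0]

def fix_birthdates(birthdates, today=date(2024, 1, 1)):
    """Birthdays cannot lie in the future: replace them with None to be flagged or filled later."""
    return [d if d is None or d <= today else None for d in birthdates]

# A numeric column with a null and a negative value at positions 1 and 2.
bad = check_numeric([10.0, None, -5.0, 42.0])

# A birthdate column containing an impossible future date.
cleaned = fix_birthdates([date(1990, 5, 1), date(2030, 1, 1)])
print(bad, cleaned)
```
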

    In summary, this guide provides a comprehensive, practical approach to building a modern SQL data warehouse, emphasizing structured planning, sound architectural principles, and hands-on coding experience. The emphasis on building a portfolio project makes it particularly valuable for those seeking to demonstrate their data warehousing skills.

    SQL Data Warehouse Fundamentals

    # What is a modern SQL data warehouse?

    A modern SQL data warehouse is a subject-oriented, integrated, time-variant, and non-volatile collection of data designed to support management’s decision-making process. It consolidates data from multiple source systems, organizes it around business subjects (like sales, customers, or finance), retains historical data, and ensures that the data is not deleted or modified once loaded.

    # What are the key roles involved in building a data warehouse project?

    According to the source, building a data warehouse involves different roles, including:

    * **Data Architect:** Designs the overall data architecture following best practices.

    * **Data Engineer:** Writes code to clean, transform, load, and prepare data.

    * **Data Modeler:** Creates the data model for analysis.

    # What are the three types of data analytics projects that can be done using SQL?

    The three types of data analytics projects, according to the source, are:

    * **Data Warehousing:** Focuses on organizing, structuring, and preparing data for analysis, which is foundational for other analytics projects.

    * **Exploratory Data Analysis (EDA):** Involves understanding and uncovering insights from datasets by asking the right questions and finding answers using basic SQL skills.

    * **Advanced Analytics Projects:** Uses advanced SQL techniques to answer business questions, such as identifying trends, comparing performance, segmenting data, and generating reports.

    # What is the Medallion architecture and why is it relevant to designing a data warehouse?

    The Medallion architecture is a layered approach to data warehousing composed of:

    * **Bronze Layer:** Raw data “as is” from source systems.

    * **Silver Layer:** Cleaned and standardized data.

    * **Gold Layer:** Business-ready data with transformed and aggregated information.

    The Medallion architecture enables separation of concerns, allowing unique sets of tasks for each layer, and helps organize and manage the complexity of data warehousing. It provides a structured approach to data processing, ensuring data quality and consistency.

    # What tools are commonly used in data warehouse projects, and why is creating a project plan important?

    Common tools used in data warehouse projects include:

    * **SQL Server Express:** A local server for the database.

    * **SQL Server Management Studio (SSMS):** A client to interact with the database and run queries.

    * **GitHub:** For version control and collaboration.

    * **draw.io:** A tool for creating diagrams, data models, data architectures and data lineage.

    * **Notion:** A tool for project management, planning, and organizing resources.

    Creating a project plan is essential for success due to the complexity of data warehouse projects. A clear plan helps organize tasks, manage resources, and track progress.

    # What is data lineage, and why is it important in a data warehouse environment?

    Data lineage refers to the data’s journey from its origin in source systems, through various transformations, to its final destination in the data warehouse. It provides visibility into the data’s history, transformations, and dependencies. Data lineage is crucial for troubleshooting data quality issues, understanding data flows, ensuring compliance, and auditing data processes.

    # What are surrogate keys, and why are they used in data modeling?

    Surrogate keys are system-generated unique identifiers assigned to each record in a dimension table. They are used to ensure uniqueness, simplify data relationships, and insulate the data warehouse from changes in source system keys. Surrogate keys provide control over the data model and facilitate efficient data integration and querying.
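
    In SQL Server this is typically done with `ROW_NUMBER()` or an identity column; the idea can be sketched in Python as simply numbering records in load order (the record layout here is invented for illustration):

```python
# Source records arrive with keys from different source systems; the warehouse
# assigns its own surrogate key instead of depending on either of them.
source_customers = [
    {"crm_id": "AW-0007", "name": "Alice"},  # source key from system A
    {"erp_id": 9134,      "name": "Bob"},    # source key from system B
]

dim_customers = [
    {"customer_key": i, **rec}               # customer_key is the surrogate key
    for i, rec in enumerate(source_customers, start=1)
]

keys = [row["customer_key"] for row in dim_customers]
print(keys)
```

    Because `customer_key` is generated by the warehouse, a renumbering in the CRM or ERP system never invalidates joins between fact and dimension tables.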

    # What are some essential naming conventions for data warehouse projects, and why are they important?

    Essential naming conventions help ensure consistency and clarity across the data warehouse. Examples include:

    * Using prefixes to indicate the type of table (e.g., `dim_` for dimension, `fact_` for fact).

    * Consistent naming of columns (e.g., surrogate keys ending with `_key`, technical columns starting with `dw_`).

    * Standardized naming for stored procedures (e.g., `load_bronze` for bronze layer loading).

    These conventions improve collaboration, code readability, and maintenance, enabling efficient data management and analysis.
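
    As an illustration, conventions like these can even be checked mechanically. This Python sketch encodes a few of the rules as regular expressions; the exact patterns are assumptions, since each project defines its own rules.

```python
import re

# Illustrative naming rules: dim_/fact_ table prefixes, _key suffix or dw_ prefix
# for columns, and load_<layer> for stored procedures.
RULES = {
    "table":  re.compile(r"^(dim|fact)_[a-z_]+$"),
    "column": re.compile(r"^([a-z_]+_key|dw_[a-z_]+|[a-z_]+)$"),
    "proc":   re.compile(r"^load_(bronze|silver|gold)$"),
}

def follows_convention(kind, name):
    """True if the given object name matches the rule for its kind."""
    return bool(RULES[kind].fullmatch(name))

results = [
    follows_convention("table", "dim_customers"),  # conforms
    follows_convention("table", "Customers"),      # no prefix, uppercase
    follows_convention("proc",  "load_bronze"),    # conforms
]
print(results)
```
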

    Data Warehousing: Architectures, Models, and Key Concepts

    Data warehousing involves organizing, structuring, and preparing data for analysis and is the foundation for any data analytics project. It focuses on how to consolidate data from various sources into a centralized repository for reporting and analysis.

    Key aspects of data warehousing:

    • A data warehouse is subject-oriented, integrated, time-variant, and a nonvolatile collection of data designed to support management’s decision-making process.
    • Subject-oriented: Focuses on specific business areas like sales, customers, or finance.
    • Integrated: Integrates data from multiple source systems.
    • Time-variant: Keeps historical data.
    • Nonvolatile: Data is not deleted or modified once it’s in the warehouse.
    • ETL (Extract, Transform, Load): A process to extract data from sources, transform it, and load it into the data warehouse, which then becomes the single source of truth for analysis and reporting.
    • Benefits of a data warehouse:
    • Organized data: A data warehouse helps organize data so that the data team is not fighting with the data.
    • Single point of truth: Serves as a single point of truth for analyses and reporting.
    • Automation: Automates the data collection and transformation process, reducing manual errors and processing time.
    • Historical data: Enables access to historical data for trend analysis.
    • Data integration: Integrates data from various sources, making it easier to create integrated reports.
    • Improved decision-making: Provides fresh and reliable reports for making informed decisions.
    • Data Management: Data management is important for making real and good decisions.
    • Data Modeling: Data modeling is creating a new data model for analyses.

    Different Approaches to Data Warehouse Architecture:

    • Inmon Model: Uses a three-layer approach (staging, enterprise data warehouse, and data marts) to organize and model data.
    • Kimball Model: Focuses on quickly building data marts, which may lead to inconsistencies over time.
    • Data Vault: Adds more standards and rules to the central data warehouse layer by splitting it into raw and business vaults.
    • Medallion Architecture: Uses three layers: bronze (raw data), silver (cleaned and standardized data), and gold (business-ready data).

    The Medallion architecture consists of the following:

    • Bronze Layer: Stores raw, unprocessed data directly from the sources for traceability and debugging.
    • Data is not transformed in this layer.
    • Typically uses tables as object types.
    • Full load method is applied.
    • Access restricted to data engineers only.
    • Silver Layer: Stores clean and standardized data with basic transformations.
    • Focuses on data cleansing, standardization, and normalization.
    • Uses tables as object types.
    • Full load method is applied.
    • Accessible to data engineers, data analysts, and data scientists.
    • Gold Layer: Contains business-ready data for consumption by business users and analysts.
    • Applies business rules, data integration, and aggregation.
    • Uses views as object types for dynamic access.
    • Suitable for data analysts and business users.

    The ETL Process: Extract, Transform, and Load

    The ETL (Extract, Transform, Load) process is a critical component of data warehousing used to extract data from various sources, transform it into a usable format, and load it into a data warehouse. The data warehouse then becomes the single point of truth for analyses and reporting.

    The ETL process consists of three key stages:

    • Extract: Involves identifying and extracting data from source systems without changing it. The goal is to pull out a subset of data from the source in order to prepare it and load it to the target. This step focuses solely on data retrieval, maintaining a one-to-one correspondence with the source system.
    • Transform: Manipulates and transforms the extracted data into a format suitable for analysis and reporting. This stage may include data cleansing, integration, formatting, and normalization to reshape the data into the required format.
    • Load: Inserts the transformed data into the target data warehouse. The prepared data from the transformation step is moved into its final destination, such as a data warehouse.

    In real-world projects, the data architecture may have multiple layers, and the ETL process can vary between these layers. Depending on the data architecture’s design, it is not always necessary to use the complete ETL process to move data from a source to a target. For example, data can be loaded directly to a layer without transformations or undergo only transformation or loading steps between layers.
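
    The three stages can be sketched end to end in a few lines of Python; this is a toy pipeline with invented record fields, not the project’s actual code.

```python
def extract(source):
    """Extract: pull records out of the source unchanged (one-to-one with the source)."""
    return list(source)

def transform(records):
    """Transform: cleanse and reshape — trim spaces, drop rows without an id, cast amounts."""
    out = []
    for r in records:
        if not r.get("id"):
            continue                        # handling missing keys
        out.append({
            "id": int(r["id"]),
            "name": r["name"].strip(),      # removing unwanted spaces
            "amount": float(r["amount"]),   # casting data types
        })
    return out

def load(target, records):
    """Load: insert the prepared records into the target (here, just a list)."""
    target.extend(records)
    return target

source = [
    {"id": "1", "name": "  Alice ", "amount": "9.99"},
    {"id": "",  "name": "ghost",    "amount": "0"},   # dropped during transform
]
warehouse = load([], transform(extract(source)))
print(warehouse)
```

    Skipping the `transform` call between `extract` and `load` would be the EL-only movement described above, e.g. when filling a bronze layer.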

    Different techniques and methods exist within each stage of the ETL process:

    Extraction:

    • Methods:
    • Pull: The data warehouse pulls data from the source system.
    • Push: The source system pushes data to the data warehouse.
    • Types:
    • Full Extraction: All records from the source tables are extracted.
    • Incremental Extraction: Only new or changed data is extracted.
    • Techniques:
    • Manual extraction
    • Querying a database
    • Parsing a file
    • Connecting to an API
    • Event-based streaming
    • Change data capture (CDC)
    • Web scraping
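
    The difference between full and incremental extraction can be shown with a simple high-watermark sketch in Python; the `id`-based watermark is one common approach, assumed here for illustration.

```python
# An in-memory stand-in for a source table.
source_table = [
    {"id": 1, "value": "a"},
    {"id": 2, "value": "b"},
    {"id": 3, "value": "c"},
]

def full_extraction(table):
    """Full extraction: every record is pulled on every run."""
    return list(table)

def incremental_extraction(table, last_loaded_id):
    """Incremental extraction: only records newer than the last load are pulled."""
    return [row for row in table if row["id"] > last_loaded_id]

full = full_extraction(source_table)
delta = incremental_extraction(source_table, last_loaded_id=2)
print(len(full), delta)
```

    In practice the watermark could also be a timestamp column or a change-data-capture log position rather than a numeric id.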

    Transformation:

    • Data enrichment
    • Data integration
    • Deriving new columns
    • Data normalization
    • Applying business rules and logic
    • Data aggregation
    • Data cleansing:
    • Removing duplicates
    • Data filtering
    • Handling missing data
    • Handling invalid values
    • Removing unwanted spaces
    • Casting data types
    • Detecting outliers
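
    Two of the cleansing techniques from the list, removing duplicates and detecting outliers, sketched as standalone Python functions; the IQR rule used for outliers is one common heuristic, chosen here purely for illustration.

```python
def remove_duplicates(rows, key):
    """Keep only the first occurrence of each key value."""
    seen, out = set(), []
    for row in rows:
        if row[key] not in seen:
            seen.add(row[key])
            out.append(row)
    return out

def detect_outliers(values, factor=1.5):
    """Flag values outside [Q1 - factor*IQR, Q3 + factor*IQR] (crude quartiles by index)."""
    s = sorted(values)
    q1, q3 = s[len(s) // 4], s[(3 * len(s)) // 4]
    iqr = q3 - q1
    return [v for v in values if v < q1 - factor * iqr or v > q3 + factor * iqr]

deduped = remove_duplicates([{"id": 1}, {"id": 2}, {"id": 1}], key="id")
outliers = detect_outliers([10, 11, 12, 11, 10, 12, 11, 500])
print(deduped, outliers)
```
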

    Load:

    • Processing Types:
    • Batch Processing: Loading the data warehouse in one large batch of data.
    • Stream Processing: Processing changes as soon as they occur in the source system.
    • Methods:
    • Full Load:
    • Truncate and insert
    • Upsert (update and insert)
    • Drop, create, and insert
    • Incremental Load:
    • Upsert
    • Insert (append data)
    • Merge (update, insert, delete)
    • Slowly Changing Dimensions (SCD):
    • SCD0: No historization; no changes are tracked.
    • SCD1: Overwrite; records are updated with new information, losing history.
    • SCD2: Add historization by inserting new records for each change and inactivating old records.
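
    SCD2 historization can be sketched with a small in-memory dimension in Python; field names such as `is_current`, `start_date`, and `end_date` are conventional but assumed here.

```python
from datetime import date

# A dimension with one current record for customer C1.
dim_customers = [
    {"customer_key": 1, "customer_id": "C1", "city": "London",
     "start_date": date(2020, 1, 1), "end_date": None, "is_current": True},
]

def scd2_update(dim, customer_id, new_city, change_date):
    """SCD2: inactivate the old record, then insert a new one for the change."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            row["end_date"] = change_date      # close out the old version
            row["is_current"] = False
    dim.append({
        "customer_key": max(r["customer_key"] for r in dim) + 1,  # new surrogate key
        "customer_id": customer_id, "city": new_city,
        "start_date": change_date, "end_date": None, "is_current": True,
    })

scd2_update(dim_customers, "C1", "Leeds", date(2023, 6, 1))
current = [r["city"] for r in dim_customers if r["is_current"]]
print(len(dim_customers), current)
```

    Compare with SCD1, which would simply overwrite `city` in place and lose the London history.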

    Data Modeling for Warehousing and Business Intelligence

    Data modeling is the process of organizing and structuring raw data into a meaningful way that is easy to understand. In data modeling, data is put into new, friendly, and easy-to-understand formats like customers, orders, and products. Each format is focused on specific information, and the relationships between those objects are described. The goal is to create a logical data model.

    For analytics, especially in data warehousing and business intelligence, data models should be optimized for reporting, flexible, scalable, and easy to understand.

    Different Stages of Data Modeling:

    • Conceptual Data Model: Focuses on identifying the main entities (e.g., customers, orders, products) and their relationships without specifying details like columns or attributes.
    • Logical Data Model: Specifies columns, attributes, and primary keys for each entity and defines the relationships between entities.
    • Physical Data Model: Includes technical details like data types, lengths, and database-specific configurations for implementing the data model in a database.

    Data Models for Data Warehousing and Business Intelligence:

    • Star Schema: Features a central fact table surrounded by dimension tables. The fact table contains events or transactions, while dimensions contain descriptive information. The relationship between fact and dimension tables forms a star shape.
    • Snowflake Schema: Similar to the star schema but breaks down dimensions into smaller sub-dimensions, creating a more complex, snowflake-like structure.
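    A star schema can be sketched in SQL DDL. The table and column names below are illustrative, not taken from the project:

    ```sql
    -- Dimension tables hold descriptive context ("who", "what", "where").
    CREATE TABLE dim_customers (
        customer_key INT PRIMARY KEY,   -- surrogate key
        customer_name VARCHAR(100),
        country VARCHAR(50)
    );

    CREATE TABLE dim_products (
        product_key INT PRIMARY KEY,
        product_name VARCHAR(100),
        category VARCHAR(50)
    );

    -- The fact table sits in the middle and references every dimension.
    CREATE TABLE fact_sales (
        order_number VARCHAR(20),
        customer_key INT REFERENCES dim_customers (customer_key),
        product_key  INT REFERENCES dim_products (product_key),
        order_date DATE,
        sales_amount DECIMAL(10, 2),    -- measure: "how much"
        quantity INT                    -- measure: "how many"
    );
    ```

    In a snowflake schema, dim_products would itself reference a separate dim_categories table instead of carrying the category column directly.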

    Comparison of Star and Snowflake Schemas:

    • Star Schema:
        • Easier to understand and query.
        • Suitable for reporting and analytics.
        • May contain duplicate data in dimensions.
    • Snowflake Schema:
        • More complex and requires more knowledge to query.
        • Optimizes storage by reducing data redundancy through normalization.

    In practice, the star schema is the more commonly used of the two and is well suited for reporting.

    Types of Tables:

    • Fact Tables: Contain events or transactions and include IDs from multiple dimensions, dates, and measures. They answer questions about “how much” or “how many”.
    • Dimension Tables: Provide descriptive information and context about the data, answering questions about “who,” “what,” and “where”.
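    This division of labor shows up directly in queries: dimensions supply the grouping context, while the fact table supplies the measures. A sketch, assuming a hypothetical fact_sales table with dim_customers and dim_products dimensions:

    ```sql
    -- Dimensions answer "who/what/where"; fact measures answer "how much/how many".
    SELECT
        c.country,
        p.category,
        SUM(f.sales_amount) AS total_sales,
        SUM(f.quantity)     AS total_quantity
    FROM fact_sales AS f
    JOIN dim_customers AS c ON c.customer_key = f.customer_key
    JOIN dim_products  AS p ON p.product_key  = f.product_key
    GROUP BY c.country, p.category;
    ```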

    In the gold layer, data modeling involves creating new structures that are easy to consume for business reporting and analyses.
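    As a rough illustration of gold-layer modeling, a view can integrate cleaned silver tables into a business-ready dimension without physically copying the data. The schema, table, and column names here are placeholders, not the project's exact objects:

    ```sql
    -- Gold-layer view: join silver-layer CRM and ERP tables into one dimension.
    CREATE VIEW gold.dim_customers AS
    SELECT
        ROW_NUMBER() OVER (ORDER BY ci.customer_id) AS customer_key,  -- surrogate key
        ci.customer_id,
        ci.first_name,
        ci.last_name,
        lo.country
    FROM silver.crm_customer_info AS ci
    LEFT JOIN silver.erp_customer_location AS lo
        ON ci.customer_id = lo.customer_id;
    ```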

    Data Transformation: ETL Process and Techniques

    Data transformation is a key stage in the ETL (Extract, Transform, Load) process where extracted data is manipulated and converted into a format that is suitable for analysis and reporting. It occurs after data has been extracted from its source and before it is loaded into the target data warehouse. This process is essential for ensuring data quality, consistency, and relevance in the data warehouse.

    Here’s a detailed breakdown of data transformation, drawing from the sources:

    Purpose and Importance

    • Data transformation changes the shape of the original data.
    • It is a work-intensive process that can include data cleansing, data integration, and various formatting and normalization techniques.
    • The goal is to reshape and reformat original data to meet specific analytical and reporting needs.

    Types of Transformations There are various types of transformations that can be performed:

    • Data Cleansing:
    • Removing duplicates to ensure each primary key has only one record.
    • Filtering data to retain relevant information.
    • Handling missing data by filling in blanks with default values.
    • Handling invalid values to ensure data accuracy.
    • Removing unwanted spaces or characters to ensure consistency.
    • Casting data types to ensure compatibility and correctness.
    • Detecting outliers to identify and manage anomalous data points.
    • Data Enrichment: Adding value to data sets by including relevant information.
    • Data Integration: Bringing multiple sources together into a unified data model.
    • Deriving New Columns: Creating new columns based on calculations or transformations of existing ones.
    • Data Normalization: Mapping coded values to user-friendly descriptions.
    • Applying Business Rules and Logic: Implementing criteria to build new columns based on business requirements.
    • Data Aggregation: Aggregating data to different granularities.
    • Data Type Casting: Converting data from one data type to another.
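    Several of these transformations can be combined in a single query. The sketch below assumes a hypothetical bronze-layer table (bronze.crm_cust_info) and illustrative column names:

    ```sql
    -- One cleansing pass: trim spaces, normalize coded values, cast types,
    -- handle missing data, and remove duplicates.
    SELECT
        cst_id,
        TRIM(cst_firstname) AS first_name,            -- remove unwanted spaces
        CASE UPPER(TRIM(cst_gndr))                    -- normalize coded values
            WHEN 'M' THEN 'Male'
            WHEN 'F' THEN 'Female'
            ELSE 'n/a'                                -- handle missing data
        END AS gender,
        CAST(cst_create_date AS DATE) AS create_date  -- cast data types
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY cst_id
                                  ORDER BY cst_create_date DESC) AS rn
        FROM bronze.crm_cust_info
    ) AS t
    WHERE rn = 1;                                     -- keep latest row per key
    ```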

    Data Transformation in the Medallion Architecture In the Medallion architecture, data transformation is strategically applied across different layers:

    • Bronze Layer: No transformations are applied. The data remains in its raw, unprocessed state.
    • Silver Layer: Focuses on basic transformations to clean and standardize data. This includes data cleansing, standardization, and normalization.
    • Gold Layer: Focuses on business-related transformations needed for the consumers, such as data integration, data aggregation, and the application of business logic and rules. The goal is to provide business-ready data that can be used for reporting and analytics.

    SQL Server for Data Warehousing

    The sources mention SQL Server as a tool used for building data warehouses. It is a database platform that can run locally on a PC and host the project's databases.

    Here’s what the sources indicate about using SQL Server in the context of data warehousing:

    • Building a data warehouse: SQL Server can be used to develop a modern data warehouse.
    • Project platform: In at least one of the projects described in the sources, the data warehouse was built completely in SQL Server.
    • Data loading: SQL Server is used to load data from source files, such as CSV files, into database tables. The BULK INSERT command is used to load data quickly from a file into a table.
    • Database and schema creation: SQL scripts are used to create a database and schemas within SQL Server to organize data.
    • SQL Server Management Studio: SQL Server Management Studio is a client tool used to interact with the database and run queries.
    • Three-layer architecture: The SQL Server database is organized into three schemas corresponding to the bronze, silver, and gold layers of a data warehouse.
    • DDL scripts: DDL (Data Definition Language) scripts are created and executed in SQL Server to define the structure of tables in each layer of the data warehouse.
    • Stored procedures: Stored procedures are created in SQL Server to encapsulate ETL processes, such as loading data from CSV files into the bronze layer.
    • Data quality checks: SQL queries are written and executed in SQL Server to validate data quality, such as checking for duplicates or null values.
    • Views in the gold layer: Views are created in the gold layer of the data warehouse within SQL Server to provide a business-ready, integrated view of the data.
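    A minimal sketch of such a bronze-layer stored procedure, combining truncate-and-insert with BULK INSERT, followed by a simple data quality check. The procedure name, table, and file path are placeholders for your own setup:

    ```sql
    -- Full load of the bronze layer: empty the table, then bulk-load the CSV.
    CREATE OR ALTER PROCEDURE bronze.load_crm_customers AS
    BEGIN
        TRUNCATE TABLE bronze.crm_cust_info;

        BULK INSERT bronze.crm_cust_info
        FROM 'C:\datasets\source_crm\cust_info.csv'
        WITH (
            FIRSTROW = 2,           -- skip the header row
            FIELDTERMINATOR = ',',  -- CSV delimiter
            TABLOCK                 -- table lock for faster bulk loading
        );
    END;

    -- Quality check: the primary key should have no duplicates or NULLs.
    SELECT cst_id, COUNT(*) AS cnt
    FROM bronze.crm_cust_info
    GROUP BY cst_id
    HAVING COUNT(*) > 1 OR cst_id IS NULL;
    ```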

    SQL Data Warehouse from Scratch | Full Hands-On Data Engineering Project

    The Original Text

    Hey friends, today we are diving into something very exciting: building together a modern SQL data warehouse project. But this is not just any project, this one is special. Not only will you learn how to build a modern data warehouse from scratch, but you will also learn how I implement these kinds of projects in real-world companies. I'm Baraa Salkini, and I have built more than five successful data warehouse projects in different companies, and right now I'm leading big data and BI projects at Mercedes-Benz. So that's me; I'm sharing with you real skills and real knowledge from complex projects. Here's what you will get out of this project: as a data architect, we will be designing a modern data architecture following best practices; as a data engineer, you will be writing code to clean, transform, load, and prepare the data for analysis; and as a data modeler, you will learn the basics of data modeling, and we will be creating a new data model for analysis from scratch. My friends, by the end of this project you will have a professional portfolio project to showcase your new skills, for example on LinkedIn. So feel free to take the project, modify it, and share it with others; it would mean the world to me if you share my content. And guess what, everything is for free, so there are no hidden costs at all. In this project we will be using SQL Server, but if you prefer other databases, like MySQL or PostgreSQL, don't worry, you can follow along just fine. All right my friends, if you want to do data analytics projects using SQL, we have three different types. The first type of project you can do is data warehousing: it's all about how to organize, structure, and prepare your data for data analysis; it is the foundation of any data analytics project. In the next step you can do exploratory data analysis (EDA), where all you have to do is understand and uncover insights about your data sets. In this kind of project you learn how to ask the right questions
    and how to find the answers using SQL with just basic SQL skills. Moving on to the last stage, you can do advanced analytics projects, where you use advanced SQL techniques to answer business questions like finding trends over time, comparing performance, segmenting your data into different sections, and generating reports for your stakeholders. Here you will be solving real business questions using advanced SQL techniques. Now, we're going to start with the first type of project, SQL data warehousing, where you will gain the following skills: first, you will learn how to do ETL/ELT processing using SQL in order to prepare the data; you will also learn how to build a data architecture, how to do data integration where we merge multiple sources together, and how to do data loads and data modeling. So if I got you interested, grab your coffee and let's jump into the project. All right my friends, before we deep dive into the tools and the cool stuff, we first have to get a good understanding of what exactly a data warehouse is and why companies build such a data management system. So the question is: what is a data warehouse? I will use the definition of the father of the data warehouse, Bill Inmon: a data warehouse is a subject-oriented, integrated, time-variant, and nonvolatile collection of data designed to support management's decision-making process. Okay, I know that might be confusing. Subject-oriented means the data warehouse is always focused on a business area, like sales, customers, finance, and so on. Integrated, because it integrates multiple source systems; usually you build a warehouse not for one source only but for multiple sources. Time-variant means you can keep historical data inside the data warehouse. Nonvolatile means once data enters the data warehouse, it is not deleted or modified. So this is how Bill Inmon defined the data warehouse. Okay, now I'm
    going to show you the scenario where your company doesn't have real data management. Let's say you have one system, and one data analyst has to go to this system and start collecting and extracting the data, and then he spends days, and sometimes weeks, transforming the raw data into something meaningful. Once he has the report, he goes and shares it, and this data analyst is sharing the report using Excel. Then you have another source of data and another data analyst who is doing maybe the same steps: collecting the data, spending a lot of time transforming the data, and then sharing a report at the end, and this time she is sharing the data using PowerPoint. And a third system, same story, but this time he is sharing the data using maybe Power BI. Now if the company works like this, there are a lot of issues. First, this process takes way too long; I saw a lot of scenarios where it takes weeks and even months until the employees manually generate those reports. And of course, what happens for the users? They are consuming multiple reports with multiple states of the data: one report is 40 days old, another one 10 days, and a third one maybe 5 days, so it's going to be really hard to make a real decision based on this structure. A manual process is always slow and stressful, and the more employees you involve in the process, the more you open the door to human errors, and errors in reports of course lead to bad decisions. Another issue is handling big data: if one of your sources generates a massive amount of data, the data analyst is going to struggle collecting it, and in some scenarios it will no longer be possible to get the data at all, so the whole process can break and you cannot generate fresh data for specific reports anymore. And one last very big issue: if one of your stakeholders asks for an integrated report from multiple
    sources, well, good luck with that, because merging all that data manually is very chaotic, time-consuming, and full of risk. So this is the picture when a company works without proper data management, without a data lake, data warehouse, or data lakehouse. In order to make real and good decisions, you need data management. So now let's talk about the scenario with a data warehouse. The first thing that changes is that your data team will no longer collect the data manually; you're going to have a very important component called ETL. ETL stands for extract, transform, and load. It is a process you run in order to extract the data from the sources, apply multiple transformations to those sources, and at the end load the data into the data warehouse, which is going to be the single point of truth for analysis and reporting. So what happens now? All your reports are going to consume this single point of truth. With that you create your multiple reports, and you can also create integrated reports from multiple sources, not only from one single source. Looking at the right side, it already looks organized, right? And the whole process is completely automated; there are no more manual steps, which of course reduces human error, and it is also pretty fast. Usually you can load the data from the sources all the way to the reports in a matter of hours, or sometimes minutes, so there is no need to wait weeks and months to refresh anything. And of course the big advantage is that the data warehouse itself is completely integrated, which means it brings all those sources together in one place, making reporting much easier. And not only integration: you can also build history into the data warehouse, so we now have the possibility to access historical data. What is also amazing is that all those reports have the same data status, so all those reports can have the same state, maybe
    sometimes one day old or so. And of course, if you have a modern data warehouse on a cloud platform, you can easily handle any big data sources, so there is no need to panic if one of your sources delivers a massive amount of data. In order to build the data warehouse, you need different types of developers. Usually the one who builds the ETL component and the data warehouse is the data engineer: they are the ones accessing the sources, scripting the ETLs, and building the database for the data warehouse. For the other part, the one responsible is the data analyst: they are the ones consuming the data warehouse, building different data models and reports, and sharing them with the stakeholders. They are usually contacting the stakeholders, understanding the requirements, and building multiple reports on top of the data warehouse. So if you look at those two scenarios, this is exactly why we need data management: your data team is not wasting time fighting with the data, they are now more organized and more focused, and with a data warehouse you are delivering professional and fresh reports that your company can count on in order to make good and fast decisions. So this is why you need data management like a data warehouse. Think about a data warehouse as a busy restaurant. Every day different suppliers bring in fresh ingredients: vegetables, spices, meat, you name it. They don't just use it all immediately and throw everything in one pot, right? They clean it, chop it, organize everything, and store each ingredient in the right place, fridge or freezer. This is the preparation phase. And when an order comes in, they quickly grab the prepared ingredients, create a perfect dish, and serve it to the customers of the restaurant. This process is exactly like the data warehouse process: it is like the kitchen where the raw ingredients, your data, are cleaned, sorted, and stored, and when you need a report or
    analysis, it is ready to serve up exactly what you need. Okay, now we're going to zoom in and focus on the ETL component. If you are building such a project, you're going to spend almost 90% of the time just building this component, the ETL, so it is the core element of the data warehouse, and I want you to have a clear understanding of what exactly an ETL is. Our data exists in a source system, and what we want to do is get our data from the source and move it to the target; source and target could be, for example, database tables. The first step is to specify which data we have to load from the source. Of course, we can say that we want to load everything, but let's say that we are doing incremental loads; then we're going to specify a subset of the data from the source in order to prepare it and load it later into the target. This step in the ETL process is called extract: we are just identifying the data that we need and pulling it out, and we don't change anything; it stays one-to-one like the source system. So the extract has only one task: to identify the data that you have to pull out from the source, and not to change anything. We will not manipulate the data at all; it can stay as it is. This is the first step in the ETL process, the extract. Moving on to stage number two, we're going to take this extracted data and do some manipulations and transformations; we're going to change the shape of the data. This process is really work-heavy: we can do a lot of stuff, like data cleansing, data integration, and a lot of formatting and data normalization. So this is the second step in the ETL process, the transformation: we take the original data and reshape and reformat it into exactly the format and shapes that we need for analysis and reporting. Finally we get to the last step in the ETL process: the load. In this step we're going to take
this new data and we’re going to insert it into the targets so it is very simple we’re going to take this prepared data from the transformation step and we’re going to move it into its final destination the target like for example data warehouse so that’s ETL in the nutshell first extract the row data then transform it into something meaningful and finally load it to a Target where it’s going to make a difference so that’s that’s it this is what we mean with the ETL process now in real projects we don’t have like only source and targets our thata architecture going to have like multiple layers depend on your design whether you are building a warehouse or a data lake or a data warehouse and usually there are like different ways on how to load the data between all those layers and in order now to load the data from one layer to another one there are like multiple ways on how to use the ATL process so usually if you are loading the data from the source to the layer number one like only the data from the source and load it directly to the layer number one without doing any Transformations because I want to see the data as it is in the first layer and now between the layer number one and the layer number two you might go and use the full ETL so we’re going to extract from the layer one transform it and then load it to the layer number two so with that we are using the whole process the ATL and now between Layer Two and layer three we can do only transformation and then load so we don’t have to deal with how to extract the data because it is maybe using the same technology and we are taking all data from Layer Two to layer three so we transform the whole layer two and then load it to layer three and now between three and four you can use only the L so maybe it’s something like duplicating and replicating the data and then you are doing the transformation so you load to the new layer and then transform it of course this is not a real scenario I’m just showing you that in 
    order to move from a source to a target, you don't always have to use a complete ETL; depending on the design of your data architecture, you might use only a few components of the ETL. Okay, so this is how ETL looks in real projects. Now I would like to show you an overview of the different techniques and methods in ETL. We have a wide range of possibilities, where you have to decide which ones you want to apply to your project. Let's start with the extraction. The first thing I want to show you is that we have different methods of extraction: either you are going to the source system and pulling the data from the source, or the source system is pushing the data to the data warehouse. Those are the two main methods of extracting data. Then within extraction we have two types. We have a full extraction, where every day we load all the records from the tables into the data warehouse; or we do something smarter, an incremental extraction, where every day we identify only the new and changed data, so we don't have to load the whole thing; we extract only the new data and load it into the data warehouse. And in data extraction we have different techniques. The first one is manual, where someone has to access a source system and extract the data by hand. Or we connect to a database and use a query in order to extract the data. Or we have a file that we have to parse into the data warehouse. Another technique is to connect to an API and make the calls in order to extract the data. Or, if the data is available as a stream, like in Kafka, we can do event-based streaming to extract the data. Another way is to use change data capture; CDC is something very similar to streaming. And another way is web scraping, where you have code that runs and extracts all the information from the web. So those are the different techniques and types
    that we have in the extraction. Now if we talk about the transformation, there is a wide range of different transformations that we can apply to our data: for example, data enrichment, where we add value to our data sets; or data integration, where we have multiple sources and bring everything into one data model; or deriving new columns based on already existing ones. Another type of data transformation is data normalization: the sources have values that are like codes, and you map them to friendlier values for the analysts, which are easier to understand and use. Another transformation is business rules and logic: depending on the business, you can define different criteria in order to build new columns. Also belonging to transformations is data aggregation, where we aggregate the data to a different granularity. And then we have a type of transformation called data cleansing. There are many different ways to clean our data, for example removing duplicates, data filtering, handling missing data, handling invalid values, removing unwanted spaces, casting data types, detecting outliers, and many more. So we have different types of data cleansing that we can do in our data warehouse, and this is a very important transformation. As you can see, there are many different types of transformations we can do in our data warehouse. Now moving on to the load: what do we have here? We have different processing types: either we are doing batch processing or stream processing. Batch processing means we are loading the data warehouse in one big batch of data; it is a one-time job that runs to refresh the content of the data warehouse, and with it the reports. That means we schedule the data warehouse to load once or twice a day. The other type is stream processing: this means that if there is a change in
    the source system, we are going to process this change as soon as possible; we process it through all the layers of the data warehouse as soon as something changes in the source system. So we are streaming the data in order to have a real-time data warehouse, which is a very challenging thing to do in data warehousing. And if we talk about the loads, we have two methods: either we are doing a full load or an incremental load; it's the same idea as in extraction, right? For the full load in databases there are different methods. For example, we truncate and then insert, which means we make the table completely empty and then insert everything from scratch. Another one is doing an update plus insert, which we call an upsert: we go and update the existing records and then insert the new ones. Another way is drop, create, and insert: we drop the whole table, create it from scratch, and then insert the data. It is very similar to the truncate, but here we are also removing and dropping the whole table. Those are the different methods for full loads. For the incremental load, we can also use upserts, so update and insert statements on our tables. Or, if the source is something like a log, we can do only inserts: we append the data to the table without having to update anything. Another way to do an incremental load is a merge, which is very similar to the upsert but also includes a delete: update, insert, delete. So those are the different methods for loading the data into your tables. One more thing: in data warehousing we have something called slowly changing dimensions. Here it's all about the historization of your table, and there are many different ways to handle the history in your table. The first type is SCD0: we say there is no historization, and nothing should be changed at all, which means you are not going to update anything. The second one, which is more
    common, is SCD1: you do an overwrite, which means you update the records with the new information from the source system by overwriting the old values. So we are doing something like the upsert, update and insert, but of course you are losing history. Then we have SCD2, where you want to add historization to your table. What do we do? For each change that we get from the source system, we insert new records, and we do not overwrite or delete the old data; we just mark it as inactive, and the new record becomes the active one. So there are different methods for historization as well while you are loading the data into the data warehouse. All right, those are the different types and techniques that you might encounter in data management projects. Now I'm going to show you quickly which of those types we will be using in our project. If we talk about the extraction, we will be doing a pull extraction; regarding full or incremental, it's going to be a full extraction; and regarding the technique, we are going to be parsing files into the data warehouse. About the data transformations: we will cover everything; all those types of transformations I'm showing you are going to be part of the project, because I believe in every data project you will face those transformations. Now if we look at the load, our project is going to use batch processing, and for the load method we will be doing a full load, since we have a full extraction, and it's going to be truncate and insert. About the historization, we will be doing SCD1, which means we will be updating the content of the data warehouse. So those are the different techniques and types that we will be using in our ETL process for this project. All right, with that we now have a clear understanding of what a data warehouse is, and we are done with the theory part. The next step: we're
    going to start with the project. The first thing you have to do is prepare your environment to develop the project, so let's start with that. All right, now go to the link in the description, and from there go to the downloads. Here you can find all the materials for all courses and projects, but the one that we need now is the SQL data warehouse project. Let's go to the link; here we have a bunch of links that we need for the project, but the most important one, to get all the data and files, is this one: download all project files. Let's go and do that. After you do that, you're going to get a zip file with a lot of stuff in it, so let's extract it. Inside it, if you look here, you will find the repository structure from Git, and the most important part here is the datasets: you have two sources, the CRM and the ERP, and in each one of them there are three CSV files. Those are the data sets for the project. Don't worry about the other stuff; we will be explaining that during the project. So go and get the data and put it somewhere on your PC where you won't lose it. Okay, what else do we have? We have here a link to the Git repository; this is the link to my repository that I created for the project, so you can go and access it, but don't worry about it, we're going to explain the whole structure during the project, and you will be creating your own repository. We also have the link to Notion; here we do the project management, and you're going to find the main steps, the main phases of the SQL project, as well as all the tasks that we will be doing together during the project. And now we have links to the project tools. If you don't have it already, go and download SQL Server Express; it's a server that is going to run locally on your PC, where your database is going to live. Another one that you have to download is SQL Server Management Studio; it is
    just a client in order to interact with the database, where we're going to run all our queries. Then there's a link to GitHub, and also a link to draw.io; if you don't have it already, go and download it, it is a free and amazing tool for drawing diagrams. Throughout the project we will be drawing data models, the data architecture, and a data lineage, so we'll be doing a lot of stuff using this tool; go and download it. And the last thing, which is nice to have: you have a link to Notion, where you can of course create a free account if you want to build the project plan and follow me by creating the project steps and the project tasks. Okay, that's it; those are all the links for the project, so go and download all that stuff, create the accounts, and once you are ready, we continue with the project. All right, I hope that you have downloaded all the tools and created the accounts. Now it's time to move to a very important step that almost everyone skips when doing projects, and that is creating the project plan. For that we will be using the tool Notion. Notion is of course a free tool, and it can help you organize your ideas, your plans, and your resources all in one place. I use it very intensively for my private projects, for example creating this course, and I can tell you: creating a project plan is the key to success. A data warehouse project is usually very complex, and according to Gartner reports, over 50% of data warehouse projects fail. In my opinion, for any complex project the key to success is to have a clear project plan. At this phase of the project we're going to create a rough project plan, because at the moment we don't yet have a clear understanding of the data architecture. So let's go. Okay, let's create a new page and call it "data warehouse project". The first thing is that we have to go and create the main phases and stages of the project, and for that we need a table. In order
to do that hit slash and then type database in line and then let’s go and call it something like data warehouse epic and we’re going to go and hide it because I don’t like it and then on the table we can go and rename it like for example project epics something like that and now what we’re going to do we’re going to go and list all the big task of the projects so an epic is usually like a large task that needs a lot of efforts in order to solve it so you can call it epics stages faces of the project whatever you want so we’re going to go and list our project steps so it start with the requirements analyzes and then designing data architecture and another one we have the project initialization so those are the three big task in the project first and now what do we need we need another table for the small chunks of the tasks the subtasks and we’re going to do the same thing so we’re going to go and hit slash and we’re going to search for the table in line and we’re going to do the same thing so first we’re going to call it data warehouse tasks and then we’re going to hide it and over here we’re going to rename it and say this is the project tasks so now what we’re going to do we’re going to go to the plus icon over here and then search for relation this one over here with the arrow and now we’re going to search for the name of the first table so we called it data warehouse iix so let’s go and click it and we’re going to say as well two-way relation so let’s go and add the relation so with that we got a fi in the new table called Data Warehouse iix this comes from this table and as well we have here data warehouse tasks that comes from from the below table so as you can see we have linked them together now what I’m going to do I’m going to take this to the left side and then what we’re going to do we’re going to go and select one of those epics like for example let’s take design the data architecture and now what we’re going to do we’re going to go and break down this 
For example: choose the data management approach. Then we have another task in the same epic; maybe the next step is to brainstorm and design the layers. Then let's go to another epic, for example the project initialization, and say: create the Git repo and prepare the structure. We can make another one in the same epic: create the database and the schemas. As you can see, I'm just defining the subtasks of those epics.

Next we're going to add a checkbox to track whether we have done a task or not. Go to the plus, search for "checkbox", and make the column really small. With that, each time we finish a task we click the checkbox to mark it as done. Now there is one more thing that doesn't work nicely yet: we're going to have a long list of tasks here, and it's really annoying. So we go to the plus again, search for "rollup", and select it. We have to select the relationship, which is "Data Warehouse Tasks", and then set the property to the checkbox. Now the first table shows how many tasks are closed, but I don't want to show it like this, so we go to the calculation, choose percent, and then "percent checked". With that we can see the progress of our project, and instead of the numbers we can show a really nice bar. We can also give the column a name, like "Progress". That's it; we can hide the "Data Warehouse Tasks" column, and now we have a really nice progress bar for each epic, and if we close all the tasks of an epic we can see that we have reached 100%.
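As a side note, the "percent checked" rollup that Notion computes per epic is easy to mimic in a few lines. This is only a sketch of the calculation; the epic names mirror the plan built above, and the checkbox states are made up:

```python
# Minimal sketch of Notion's "percent checked" rollup for each epic.
# Each epic maps to a list of task checkboxes (True = done). The
# epic names mirror the plan above; the flags are invented.
tasks = {
    "Requirements Analysis": [True],
    "Design Data Architecture": [True, False, False],
    "Project Initialization": [False, False],
}

def progress(done_flags):
    """Percent of checked tasks, like the 'percent checked' calculation."""
    return 100 * sum(done_flags) // len(done_flags)

for epic, flags in tasks.items():
    bar = "#" * (progress(flags) // 10)  # crude progress bar
    print(f"{epic:28s} {progress(flags):3d}% {bar}")
```

Closing a task's checkbox and re-running would move the bar, which is exactly the feedback loop the rollup gives you in Notion.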
So this is the main structure. Now we can add some cosmetics and rename stuff to make things look nicer. For example, I can rename the tasks table to "Tasks" and change its icon, and if you'd like an icon for the epics, go to an epic like "Design Data Architecture", hover over the title, click "Add icon", and pick any icon you want; it then shows at the top and in the table below as well. One more thing we can do for the project tasks is to group them by the epics: go to the three dots, then "Group", and group by the epics. Now we have a section for each epic, and you can sort the epics if you want (go to "Sort", then "Manual", and arrange them as you like), and you can expand and collapse each section if you don't always want to see all tasks in one go.

This is a really nice way to build lightweight project management for your projects. Of course, in companies we use professional tools like Jira, but for private projects I always do it like this, and I really recommend it, not only for this project but for any project you are doing. If you see the whole project in one go you can see the big picture, and closing tasks like this, these small things, keeps you satisfied and motivated to finish the whole project, and makes you proud. Okay friends, so I just went and added a few icons, renamed stuff, and added more tasks for each epic. This is going to be our starting point in the project, and once we have more information we're going to add more details on how exactly we're going to build the data warehouse.
At the start we're going to analyze and understand the requirements, and only after that we start designing the data architecture, where we have three tasks: first we choose the data management approach, then we brainstorm and design the layers of the data warehouse, and at the end we draw the data architecture. With that we have a clear understanding of how the data architecture looks, and then we go to the next epic, where we start preparing our project. Once the data architecture is clear, the first task there is to create detailed project tasks, so we will add more epics and more tasks. Once that is done, we create the naming conventions for the project, to make sure we have rules and standards across the whole project. Next we create a repository in Git and prepare the structure of the repository, so that we always commit our work there, and then we can start with the first script, where we create the database and the schemas. So my friends, this is the initial plan for the project.

Now let's start with the first epic: the requirements analysis. Analyzing the requirements is very important in order to understand which type of data warehouse you're going to build, because there is not only one standard way to build one, and if you go blindly implementing a data warehouse you might do a lot of work that is totally unnecessary and burn a lot of time. That's why you have to sit with the stakeholders and the departments and understand exactly what you have to build, and depending on the requirements you design the shape of the data warehouse. So now let's analyze the requirements of this project.

The whole project is split into two main sections. In the first section we have to build a data warehouse; this is a data engineering task where we will develop ETLs and the data warehouse itself. Once that is done, we have to build analytics and reporting, the business intelligence part, where we do data analysis. For now we will be focusing on the first part: building the data warehouse. The statement is very simple: "Develop a modern data warehouse using SQL Server to consolidate sales data, enabling analytical reporting and informed decision-making." That is the main statement, and then we have the specifications. The first one is about the data sources: import data from two source systems, ERP and CRM, provided as CSV files. The second one is about data quality: we have to cleanse and fix data quality issues before we do the data analysis, because let's be real, there is no raw data that is perfect; something is always missing, and we have to clean that up. The next one is about integration: we have to combine both sources into one single, user-friendly data model that is designed for analytics and reporting; that means we have to merge the two sources into a single data model. Another specification says to focus on the latest dataset only; there is no need for historization, so we don't have to build histories in the database. And the final requirement is about documentation: provide clear documentation of the data model to support both business users and the analytics teams; that means we have to produce a manual that helps the users and makes life easier for the consumers of our data.

So as you can see, maybe these are very generic requirements, but they already carry a lot of information: we have to use SQL Server as the platform, we have two source systems delivering CSV files, it sounds like we really have bad data quality in the sources, we are asked to build a completely new data model designed for reporting, we don't have to do historization, and it is expected that we produce documentation of the system. These are the requirements for the data engineering part, where we will build a data warehouse that fulfills them.

All right, with that we have analyzed the requirements, and we have closed the first and easiest epic. Let's mark it as done and open the next one: designing the data architecture, where the first task is to choose the data management approach. Designing a data architecture is exactly like building a house. Before construction starts, an architect designs a plan, a blueprint for the house: how the rooms will be connected, how to make the house functional, safe, and wonderful. Without this blueprint from the architect, the builders might create something unstable, inefficient, or maybe unlivable. The same goes for data projects: a data architect is like a house architect; they design how your data will flow, integrate, and be accessed. As data architects, we make sure that the data warehouse is not only functioning but also scalable and easy to maintain, and this is exactly what we will do now: we will play the role of the data architect and start brainstorming and designing the architecture of the data warehouse. I'm going to show you a sketch in order to understand the different approaches to designing a data architecture. This phase of a project is usually very exciting for me, because this is my main role in data projects: I am a data architect, and I discuss a lot of different projects where we try to find the best design. All right, so let's go.
The first step of building a data architecture is to make a very important decision: choosing between four major types. The first approach is to build a data warehouse. It is very suitable if you have only structured data and your business wants a solid foundation for reporting and business intelligence. Another approach is to build a data lake. This one is way more flexible than a data warehouse: you can store not only structured data but also semi-structured and unstructured data. We usually use this approach if you have mixed types of data, like database tables, logs, images, and videos, and your business wants to focus not only on reporting but also on advanced analytics or machine learning. But it is not as organized as a data warehouse, and a data lake that is too unorganized can turn into a data swamp. That is where the next approach comes in: the data lakehouse. It is like a mix between a data warehouse and a data lake: you get the flexibility of having different types of data from the data lake, but you still structure and organize your data like we do in a data warehouse. You mix those two worlds into one; it is a very modern way to build a data architecture and is currently my favorite way of building a data management system. The last and most recent approach is the data mesh. This one is a little bit different: instead of having a centralized data management system, the idea is to make it decentralized, because centralized always means a bottleneck. Instead, you have multiple departments and domains, where each one builds a data product and shares it with the others. So now you have to pick one of those approaches, and in this project we will be focusing on the data warehouse.

Now the question is how to build the data warehouse, and there are, as well, four different approaches. The first one is the Inmon approach. You have your sources, and the first layer is the staging layer, where the raw data lands. In the next layer you organize your data into something called the enterprise data warehouse, where you model the data using the third normal form; it is about how to structure and normalize your tables, so you are building a new, integrated data model from the multiple sources. Then we go to the third layer, the data marts, where you take a small subset of the data warehouse and design it in a way that is ready to be consumed for reporting, focusing on only one topic, for example customers, sales, or products. After that you connect your BI tool, like Power BI or Tableau, to the data marts. So you have three layers to prepare the data before reporting.

Moving on, we have the Kimball approach. Kimball says: building this enterprise data warehouse wastes a lot of time, so we can jump immediately from the staging layer to the final data marts, because building the enterprise data warehouse is a big struggle. He wants you to focus on building the data marts as quickly as possible. It is a faster approach than Inmon, but over time you might get chaos in the data marts, because you are not always focusing on the big picture and you might be repeating the same transformations and integrations in different data marts. So there is a trade-off between speed and a consistent data warehouse.

The third approach is the Data Vault. We still have the staging layer and the data marts, and we still have a central data warehouse in the middle, but this middle layer gets more standards and rules: it is split into two layers, the raw vault and the business vault. In the raw vault you have the original data, and in the business vault you have all the business rules and transformations that prepare the data for the data marts. Data Vault is very similar to Inmon, but it brings more standards and rules to the middle layer.

Now I'm going to add a fourth one, the Medallion architecture, and this one is my favorite because it is very easy to understand and to build. You build three layers: bronze, silver, and gold. The bronze layer is very similar to staging, but we have understood over time that this layer is very important, because having the original data as it is helps a lot with traceability and finding issues. The next layer, the silver layer, is where we do transformations and data cleansing, but we don't apply any business rules yet. The last layer, the gold layer, is very similar to the data marts, but there we can build different types of objects, not only for reporting but also for machine learning, AI, and many other purposes; they are business-ready objects that you want to share as a data product.

So those are the four approaches you can use to build a data warehouse. If you are building a data architecture, you have to specify which approach you want to follow. At the start we said we want to build a data warehouse, then we had to decide between these four approaches, and in this project we will be using the Medallion architecture. This is a very important question you have to answer as the first step of building a data architecture. All right, with that we have decided on the approach, so we can mark the task as done. The next step is to design the layers of the data warehouse. There is no 100% standard set of rules for each layer; what you have to do as a data architect is define exactly what the purpose of each layer is.
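Before going layer by layer, the medallion flow just described can be sketched end to end in a few lines of Python. This is only an illustration of the idea; every record, field name, and rule below is invented, not from the course dataset:

```python
# Toy end-to-end sketch of the medallion flow: bronze keeps raw data
# untouched, silver cleanses it, gold produces a business-ready object.
# All records, column names, and rules here are invented for illustration.

raw_csv_rows = [
    {"cst_id": "1001", "country": "DE ", "sales": "250"},
    {"cst_id": "1002", "country": " us", "sales": "  90"},
]

def bronze(rows):
    """Bronze: land the raw data exactly as delivered, no changes at all."""
    return [dict(r) for r in rows]

def silver(rows):
    """Silver: cleanse and standardize, but apply no business rules yet."""
    return [
        {
            "customer_id": int(r["cst_id"]),
            "country": r["country"].strip().upper(),  # standardize codes
            "sales": int(r["sales"].strip()),
        }
        for r in rows
    ]

def gold(rows):
    """Gold: business-ready object, e.g. total sales per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0) + r["sales"]
    return totals

print(gold(silver(bronze(raw_csv_rows))))  # {'DE': 250, 'US': 90}
```

Note how the bad formatting in the source (stray spaces, inconsistent casing) survives bronze untouched and is only fixed in silver; that is exactly the separation the course commits to.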
We start with the bronze layer: it is going to store raw, unprocessed data, exactly as it comes from the sources. Why are we doing that? It is for traceability and debugging. If you have a layer where you keep the raw data, you can always go back to the bronze layer and investigate the data of a specific source if something goes wrong. The main objective is to have raw, untouched data that helps you as a data engineer to analyze the root cause of issues. Moving on to the silver layer: it is where we store clean and standardized data, and it is the place where we do basic transformations to prepare the data for the final layer. The gold layer contains business-ready data; the main goal here is to provide data that can be consumed by business users and analysts to build reporting and analytics. With that we have defined the main goal of each layer.

Next I would like to define the object types. Since we are talking about a data warehouse in a database, we generally have two types: a table or a view. For the bronze layer and the silver layer we go with tables, but for the gold layer we go with views. Best practice says: make the last layer of your data warehouse virtual, using views. That gives you a lot of flexibility, and of course speed in building it, since we don't need a load process for it. The next step is to define the load method. In this project I have decided to go with a full load, using the method of truncating and inserting; it is just faster and way easier. So for the bronze layer we go with a full load, and you have to specify it for the silver layer as well: also a full load. Of course, for the views we don't need any load process. Each time you decide to go with tables, you have to define the load method: full load, incremental load, and so on.
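The truncate-and-insert full load just chosen is a simple pattern to sketch. The project itself uses SQL Server, but to keep this example self-contained it uses SQLite, which has no `TRUNCATE` statement, so an unqualified `DELETE` stands in for it; the table and column names are invented:

```python
import sqlite3

# Sketch of the full-load (truncate & insert) pattern. SQLite is used
# only for illustration; it lacks TRUNCATE, so DELETE plays that role.
# Table and column names are invented, not from the course.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bronze_crm_customers (customer_id INTEGER, country TEXT)")

def full_load(conn, rows):
    """Empty the bronze table, then reinsert everything from the source."""
    with conn:  # one transaction: the whole load lands, or none of it
        conn.execute("DELETE FROM bronze_crm_customers")
        conn.executemany(
            "INSERT INTO bronze_crm_customers (customer_id, country) VALUES (?, ?)",
            rows,
        )

full_load(conn, [(1001, "DE"), (1002, "US")])
full_load(conn, [(1001, "DE"), (1002, "US"), (1003, "FR")])  # next day's file

count = conn.execute("SELECT COUNT(*) FROM bronze_crm_customers").fetchone()[0]
print(count)  # 3 -- no duplicates, because the table was emptied first
```

Because the table is emptied before every load, rerunning the job is always safe, which is exactly why the full load is the "faster and way easier" option compared to incremental loading.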
Now we come to a very interesting part: the data transformations. For the bronze layer this is the easiest topic, because we don't have any transformations. We have to commit ourselves to not touch the data: do not manipulate it, don't change anything. It stays as it is; if it comes in bad, it stays bad in the bronze layer. Then we come to the silver layer, where we have the heavy lifting. As we committed in the objective, we have to produce clean and standardized data, and for that we have different types of transformations: data cleansing, data standardization, data normalization, deriving new columns, and data enrichment. So there is a whole bunch of transformations we have to do in order to prepare the data. Our focus here is to transform the data to make it clean and follow standards, and to push all business transformations to the next layer. That means in the gold layer we will be focusing on the business transformations that the consumers need for their use cases: we do data integration between source systems, data aggregations, we apply a lot of business logic and rules, and we build a data model that is ready for, for example, business intelligence. So in the gold layer we do a lot of business transformations, and in the silver layer we do basic data transformations. It is really important here to make fine-grained decisions about which type of transformation is done in which layer, and to make sure you commit to those rules.

The next aspect is the data modeling. In the bronze layer and the silver layer we will not break the data model that comes from the source systems: if a source system delivers five tables, we will have five tables, and in the silver layer as well we will not denormalize, normalize, or make something new. We leave it exactly as it comes from the source systems, because we are going to build the data model in the gold layer.
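Coming back to the transformation types defined for the silver layer, here is a small sketch of what cleansing, standardization, and a derived column could look like in practice. Everything here is hypothetical: the field names, the code mapping, and the derivation rule are invented, not taken from the course's actual dataset:

```python
# Illustrative silver-layer transformations: cleansing, standardization,
# and a derived column. All field names and code mappings are invented.

raw = {"cst_key": " AW-1001 ", "gender": "F", "qty": "3", "price": "25"}

def to_silver(rec):
    return {
        # cleansing: trim stray whitespace around keys
        "customer_key": rec["cst_key"].strip(),
        # standardization: expand coded values into readable ones
        "gender": {"M": "Male", "F": "Female"}.get(rec["gender"], "n/a"),
        # derived column: compute a value that is not in the source
        "sales_amount": int(rec["qty"]) * int(rec["price"]),
    }

print(to_silver(raw))
# {'customer_key': 'AW-1001', 'gender': 'Female', 'sales_amount': 75}
```

Note what is deliberately missing: no joins between source systems, no aggregations, no business rules. Those are pushed to the gold layer, as decided above.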
In the gold layer you have to define which data model you want to follow: are you following the star schema, the snowflake schema, or are you just building aggregated objects? You have to make a list of all the data model types you're going to follow in the gold layer. And at the end, what you can specify for each layer is the target audience, and this is of course a very important decision. In the bronze layer you don't want to give access to any end user; it is really important that only data engineers access the bronze layer. It makes no sense for data analysts or data scientists to go to the bad data, because you have a better version of it in the silver layer. In the silver layer, of course, the data engineers have access, and as well the data analysts, the data scientists, and so on, but you still don't give it to business users, who can't deal with the raw data model from the sources, because for the business users you have a better layer: the gold layer. The gold layer is suitable for the data analysts and the business users, because business users usually don't have deep knowledge of the technicalities of the silver layer. So if you are designing multiple layers, you have to discuss all those topics and make a clear decision for each layer.

All right my friends, before we proceed with the design I want to tell you a secret principle, a concept that every data architect must know, and that is the separation of concerns. What is that? As you are designing an architecture, you have to make sure to break the complex system down into smaller, independent parts, where each part is responsible for a specific task. And here comes the magic: the components of your architecture must not be duplicated; you cannot have two parts doing the same thing. The idea is to not mix everything, and mixing everything is one of the biggest mistakes in any big project; I have seen it almost everywhere. A good data architect follows this principle. For example, if you look at our data architecture, we have already done that: we have defined a unique set of tasks for each layer. We said that in the silver layer we do data cleansing, but in the gold layer we do business transformations, and with that you are not allowed to do any business transformations in the silver layer, and the same goes the other way: you don't do any data cleansing in the gold layer. Each layer has its own unique tasks. The same goes for the bronze and silver layers: you are not allowed to load data from the source systems directly into the silver layer, because we have decided that the landing layer, the first layer, is the bronze layer. Otherwise you would have one set of source systems loaded first into the bronze layer and another set skipping it and going straight to the silver layer, and with that you have overlap: you are doing data ingestion in two different layers. So my friends, if you have this mindset of separation of concerns, I promise you, you're going to be a data architect. Think about it.

All right my friends, with that we have designed the layers of the data warehouse and we can close that task. Next we go to draw.io and start drawing the data architecture. There is no single standard for how to draw a data architecture; you can add your own style and do it the way you want. The first thing we have to show in the data architecture is the different layers. The first layer is the source system layer, so let's take a box, make it a little bigger, and style it: I remove the fill, make the line dotted, and change the color to something like this gray. Now we have a container for the first layer, and then we have to add a label on top of it.
So what I'm going to do: I take another box, type "Sources" inside it, and style it, setting the text to maybe 24, removing the lines, making it a little smaller, and putting it on top. This is the first layer; this is where the data comes from. Then the data goes into a data warehouse, so I just duplicate this box: this one is the data warehouse. The third layer is going to be the consumers, who will be consuming this data warehouse, so I put another box and say this is the consume layer. Those are the three containers. Inside the data warehouse we have decided to build the Medallion architecture, so we have three layers inside the warehouse. I take another box and call it the bronze layer, give it this bronze-like color, set the text to maybe 20, shrink it, and put it here, and beneath it we'll have the components. This box is just the title of a container, so I duplicate it, remove the text inside, and remove the fill: that container is for the bronze layer. Let's duplicate it for the next one, the silver layer, change the coloring to gray (because it is silver), adjust the lines, remove the fill, and maybe make the font bold. The third one is the gold layer, so we pick a color for it, something like yellow, and do the same for its container with the fill removed. With that we are showing the different layers inside our data warehouse.

Those containers are empty, so now we go inside each one and start adding content. In the sources, it is very important to make clear which types of source system are connected to the data warehouse, because in a real project there are multiple types: you might have a database, an API, files, Kafka, and it's important to show those different types. In our project we have folders, and inside those folders we have CSV files, so we have to make clear in this layer that the input to our project is CSV files. It really depends how you want to show that: I search for a folder shape and put it inside, then search for "file" under more results, pick one of those icons, make it smaller, and add it on top of the folder. With that, everyone seeing the architecture understands that the source is not a database and not an API; it is files inside a folder. It is also very important to show the source systems involved in the project, so we give them names: we have one source called CRM, and another source called ERP, so I duplicate the first one, put it over here, and rename it to ERP. Now it is clear to everyone that we have two sources for this project, and the technology used is simply files. We can also add some description inside this box to make it clearer: I take a line, because I want to separate the description from the icons, make it gray, and below it add some text, "CSV files", and the next point, "Interface: files in a folder". Of course you can add any specification or explanation about the sources; if it is a database, you can state the type of the database and so on. With that, the data architecture makes clear what the sources of our data warehouse are.

The next step is to design the content of the bronze, silver, and gold containers. I start by adding an icon in each container to show that we are talking about a database: search for "database", go to more results, and pick this icon; make it bigger and change its color, so we have one for the bronze, and as well for the silver and the gold. Then we add some arrows between those layers: search for "arrow", pick one, put it here, give it a color, and adjust it. Now we have a nice arrow between all the layers to show the direction of our architecture, so we can read it from left to right, and the same between the gold layer and the consume layer. Next we add one statement about each layer, the main objective: I grab a text element, put it beneath the database icon, and for the bronze layer it says "raw data" (maybe with a bigger font), for the silver layer "cleansed, standardized data", and for the gold layer "business-ready data". With that, the objective of each layer is clear. Below those icons we add a separator again, colored, and beneath it the most important specifications of the layer, so let's add those separators in each layer.

Now we need text below it. What is the object type of the bronze layer? It is a table. We can add the load method: this is batch processing, since we are not doing streaming; it is a full load, not an incremental load, so we write "truncate & insert". Then one more section about the transformations, where we say "no transformations", and one more about the data model, where we say "none (as-is)". I'll add the same specifications for the silver and the gold layers: the object type, the load process, the transformations, and whether we are reshaping the data model or not. With that, I can say we have a really nice layering of the data warehouse. What we are left with is the consumers: over here you can add the different use cases and tools that can access your data warehouse. For example, I'm adding business intelligence and reporting, maybe using Power BI or Tableau; you can also offer ad-hoc analysis using SQL queries, which is what we're going to focus on in the project after we build the data warehouse; and as well you can offer it for machine learning purposes. Of course it is really nice to add some icons to your architecture; I usually use this nice website called Flaticon, which has really amazing icons you can use. We could keep adding icons and details to explain the data architecture and the system. For example, it is very important to say which tools you are using to build this data warehouse: is it in the cloud, are you using Azure, Databricks, or maybe Snowflake? For our project we add the icon of SQL Server, since we are building this data warehouse completely in SQL Server. For now I'm really happy with it; as you can see, we have a plan.
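To give a concrete taste of the "business-ready" views the gold layer promises in the diagram, here is a hedged sketch. The real project builds views in SQL Server; this example uses SQLite purely so it stays runnable, and every table, column, and view name is invented for illustration:

```python
import sqlite3

# Hypothetical gold-layer view joining cleansed silver tables into a
# business-ready, report-friendly shape. SQLite stands in for SQL Server
# here; all names are invented.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE silver_crm_customers (customer_id INTEGER, country TEXT);
    CREATE TABLE silver_crm_sales (customer_id INTEGER, amount INTEGER);

    INSERT INTO silver_crm_customers VALUES (1001, 'DE'), (1002, 'US');
    INSERT INTO silver_crm_sales VALUES (1001, 250), (1001, 100), (1002, 90);

    -- The gold layer is virtual: a view, so no load process is needed.
    CREATE VIEW gold_sales_by_country AS
    SELECT c.country, SUM(s.amount) AS total_sales
    FROM silver_crm_sales s
    JOIN silver_crm_customers c ON c.customer_id = s.customer_id
    GROUP BY c.country;
""")

rows = conn.execute(
    "SELECT country, total_sales FROM gold_sales_by_country ORDER BY country"
).fetchall()
print(rows)  # [('DE', 350), ('US', 90)]
```

Because the gold object is a view, it always reflects whatever the silver tables currently contain, which is exactly the "dynamic, no load process" advantage described in the layer design.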
that we have designed the data architecture using draw.io, and with that we have done the last step in this epic. We now have a design for the data architecture, and we can say we have closed this epic. Let's go to the next one, where we start preparing our project, and the first task here is to create a detailed project plan. All right my friends, so now it's clear that we have three layers and we have to build them, which means our big epics are going to follow the layers. Here I have added three more epics: build bronze layer, build silver layer, and build gold layer. After that I started defining all the different tasks that we have to follow in the project: at the start we will be analyzing, then coding, after that testing, and once everything is ready we're going to document things, and at the end we have to commit our work in the git repo. All those epics follow the same pattern of tasks, so as you can see, we now have a very detailed project structure, and things are much clearer on how we're going to build the data warehouse. With that we are done with this task, and the next task is to define the naming conventions of the project. All right, so at this phase of a project we usually define the naming conventions. What is that? It is a set of rules that you define for naming everything in the project, whether it is a database, schema, tables, stored procedures, folders, anything. If you don't do that at an early phase of the project, I promise you chaos can happen, because you will have different developers in your project and each of those developers has their own style, of course. One developer might name a table dimension_customers, where everything is lowercase with an underscore between the words, and another developer creates a table called DimensionProducts using camel case, so there is
no separation between the words and the first character of each word is capitalized, and maybe another one uses a prefix, like dim_categories, where "dim" is a shortcut for "dimension". So as you can see, there are different designs and styles, and if you leave the door open, in the middle of the project you will notice that everything looks inconsistent, and you'll have to define a big task to go and rename everything following specific rules. Instead of wasting all that time, at this phase you go and define the naming conventions, so let's do that. We will start with a very important decision: which naming convention we are going to follow in the whole project. You have different cases, like camel case, Pascal case, kebab case, and snake case, and for this project we're going to go with snake case, where all the letters of a word are lowercase and the separator between words is an underscore. For example, a table called customer_info: "customer" is lowercase, "info" is lowercase, and between them an underscore. This is always the first thing you have to decide for your data project. The second thing is to decide the language; for example, I work in Germany, and there is always a decision to make whether we use German or English, so we have to decide which language we're going to use for our project. And a very important general rule: avoid reserved words. Don't use a SQL reserved word as an object name; for example, don't name a table "table". So those are the general rules that you have to follow in the whole project, and they apply to everything: tables, columns, stored procedures, any name you are giving in your scripts. Now, moving on, we have specifications for the table names, and here we have a different set of rules for each layer. The rule here says <sourcesystem>_<entity>, so we are saying all the
tables in the bronze layer should start with the source system name, like CRM or ERP, then an underscore, and at the end the entity name, the table name. For example, take the table name crm_customer_info: this table comes from the source system CRM, and then we have the entity name, customer info. This is the rule we're going to follow for naming all tables in the bronze layer. Moving on to the silver layer: it is exactly like the bronze, because we are not going to rename anything and we are not going to build any new data model, so the naming is going to be one-to-one with the bronze, exactly the same rules. But if we go to the gold layer, since we are building a new data model, we have to rename things, and since we are also integrating multiple sources together, we will not be using the source system name in the tables, because inside one table you could have multiple sources. The rule says all table names must be meaningful, business-aligned names, starting with a category prefix: first the category, then an underscore, and then the entity. Now, what is the category? In the gold layer we have different types of tables: one could be a fact table, another could be a dimension, a third type could be an aggregation or report. We have different types of tables, and we can specify the type as a prefix at the start. For example, here we see fact_sales: the category is "fact" and the table name is "sales". And here I made a table with the different patterns: a dimension starts with dim_, for example dim_customers or dim_products; a fact table starts with fact_; and an aggregated table uses the first three characters "agg", like agg_customers or agg_sales_monthly. So as you can see, as you are creating a naming
convention, you first have to make the rule clear, describe each part of the rule, and give examples, so that it's clear for the whole team which names they should follow. We talked here about the table naming convention; you can also make a naming convention for the columns. For example, in the gold layer we're going to have surrogate keys, and we can define them like this: the surrogate key should be the table name, then an underscore, then "key". For example, customer_key is the surrogate key in the dimension customers. The same goes for technical columns: as data engineers we might add our own columns to the tables, columns that don't come from the source system, and those are the technical columns, or sometimes we call them metadata columns. In order to separate them from the original columns that come from the source system, we can use a prefix. The rule says: if you are adding any technical or metadata column, the column should start with dwh_ and then the column name. For example, for the metadata load date we can have dwh_load_date. With that, if anyone sees a column starting with dwh, we understand this data comes from a data engineer. And we can keep adding rules, for example for the stored procedures: if you are making an ETL script, it should start with the prefix load_ and then the layer. The stored procedure responsible for loading the bronze is going to be called load_bronze, and for the silver, load_silver. Those are currently the rules for the stored procedures, and this is how I usually do it in my projects. All right my friends, so with that we have solid naming conventions for our project; this is done. Next we're going to go to git, create a brand-new repository, and prepare its structure. So let's go.
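To make the conventions above concrete, here is a small sketch of what they could look like in one DDL statement. The column names are hypothetical examples that follow the rules, not objects taken from the course project:

```sql
-- Bronze / silver layers: <sourcesystem>_<entity>, snake_case, e.g.
--   crm_customer_info, erp_product_categories
-- Gold layer: <category>_<entity>, e.g.
--   dim_customers, fact_sales, agg_sales_monthly
CREATE TABLE gold.dim_customers (
    customer_key  INT,           -- surrogate key: <table>_key
    customer_id   INT,           -- original column from the source system
    first_name    NVARCHAR(50),
    dwh_load_date DATE           -- technical/metadata column: dwh_ prefix
);
```

The point of writing the convention as rule + example like this is that any new developer can pattern-match a name without reading a long policy document.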
All right, so now we come to another very important step in any project, and that is creating the git repository. If you are new to git, don't worry about it; it is simpler than it sounds. It's all about having a safe place to put the code you are developing, with the ability to track everything that happens to the code. You can also use it to collaborate with your team, and if something goes wrong you can always roll back. And the best part: once you are done with the project, you can share your repository as part of your portfolio, which is a really amazing thing if you are applying for a job, because you can showcase your skills by showing that you have built a data warehouse with a well-documented git repository. So now let's go and create the repository for the project. We are at the overview of our account, so the first thing to do is go to "Repositories" over here, then go to this green button and click on "New". First we have to give the repository a name, so let's call it "SQL data warehouse project", and then we can give it a description; for example, "Building a modern data warehouse with SQL Server". The next option is whether you want to make it public or private; I'm going to leave it public. Then let's add a README file, and for the license we can go over here and select MIT; the MIT license gives everyone the freedom to use and modify your code. Okay, I think I'm happy with the setup; let's go and create the repository, and with that we have our brand-new repository. The next step I usually take is to create the structure of the repository, and I always follow the same pattern in my projects. We need a few folders to put our files in, so what I usually do is go over here to "Add file", "Create new file", and start creating the structure. The first thing is that we need datasets, then a slash, and with that the repository
understands this is a folder, not a file, and then you can add anything, like a placeholder here, just an empty file; this only exists to help me create the folder. Let's commit the changes, and now if you go back to the main project, you can see we have a folder called datasets. I'm going to keep creating things: I'll create the documents folder with a placeholder, commit the changes, then create the scripts placeholder, and the final one I usually add is the tests folder. With that, as you can see, we now have the main folders of our repository. The next thing I usually do is edit the main README, which you can see over here as well. We go inside the README, click the edit button, and start writing the main information about our project. This really depends on your style; you can add whatever you want, since this is the main page of your repository. As you can see, the file extension here is .md, which stands for Markdown; it is just an easy and friendly format for writing text. If you are writing documentation, it is a really nice format for organizing and structuring it, and it is very friendly. At the start I'm going to give a brief description of the project: we have the main title, then a welcome message explaining what this repository is about. In the next section we can start with the project requirements, and at the end you can say a few words about the licensing and a few words about yourself. As you can see, it's like the homepage of the project and the repository. Once you are done, commit the changes, and now if you go to the main page of the repository, you always see the folders and files at the top, and below them the information from the README. So
again, here we have the welcome statement, then the project requirements, and at the end the licensing and about-me sections. So my friends, that's it: we now have a repository and its main structure, and throughout the project, as we build the data warehouse, we're going to commit all our work to this repository. Nice, right? All right, so with that our repository is ready, and as we go through the project we will keep adding things to it, so this step is done. Now for the last step: finally we're going to go to SQL Server and write our first script, where we create a database and schemas. All right, the first step is to create a brand-new database. To do that, we first have to switch to the master database, like this: USE master; with a semicolon. If you execute it, we are now switched to the master database; it is a system database in SQL Server from which you can create other databases, and you can see from the toolbar that we are now logged into the master database. The next step is to create our new database, so we say CREATE DATABASE, and you can call it whatever you want; I'm going to go with DataWarehouse, then a semicolon. Let's execute it, and with that we have created our database. Let's check it in the Object Explorer: refresh, and you can see our new DataWarehouse; this is our new database. Awesome, right? Now to the next step: we switch to the new database by saying USE DataWarehouse; with a semicolon. Let's switch to it, and you can see we are now logged into the DataWarehouse database and can start building things inside it. The first thing I usually do is start creating the schemas. What is a schema? Think of it like a folder or a container that helps you keep things organized. So now,
as we decided in the architecture, we have three layers (bronze, silver, gold), and we're going to create a schema for each layer. Let's do that, starting with the first one: CREATE SCHEMA bronze, and a semicolon. Let's create the first schema. Nice, we have a new schema. Go to our database, and to check the schemas, go to Security and then Schemas over here; as you can see, we have bronze, and if you don't find it, refresh the schemas and the new schema will appear. Great, so now we have the first schema. Next we create the other two, so I'm just going to duplicate it: the next one is silver and the third one is gold. If we execute those two together, we will get an error, and that's because we don't have a GO in between, so after each command let's add a GO. Now if I highlight the silver and gold statements and execute, it works. GO in SQL is like a separator: it tells SQL to completely execute the first command before moving to the next one; it is just a batch separator. Now let's go to our schemas, refresh, and we can see we have the gold and the silver as well. With this we have a database and the three layers, and we can start developing each layer individually. Okay, now let's commit our work to git. Since it is a script, we go to the scripts folder over here, add a new file, and call it init_database.sql, and we paste our code there. Now, I have made a few modifications; for example, before we create the database, we check whether the database already exists. This is an important step if you are recreating the database, because otherwise you will get an error saying the database already exists.
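To recap the statements from this step, the schema setup might look like the sketch below. The database name DataWarehouse follows the transcript; note that CREATE SCHEMA must be the only statement in its batch, which is exactly why the GO separators are needed:

```sql
USE DataWarehouse;
GO

-- One schema per architecture layer; GO ends each batch,
-- since CREATE SCHEMA must run as the first statement of a batch.
CREATE SCHEMA bronze;
GO
CREATE SCHEMA silver;
GO
CREATE SCHEMA gold;
GO
```

After running this, the three schemas appear under Security > Schemas in Object Explorer, as described above.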
So first the script checks whether the database exists, then it drops it. I have added a few comments, like "create the data warehouse" and "create the schemas", and now we have a very important step: we have to add a header comment at the start of each script. To be honest, three months from now you will not remember all the details of this script, and a comment like this is like a sticky note for you when you revisit the script later. It is also very important for the other developers on the team, because each time you or anyone else opens a script, the first question is going to be: what is the purpose of this script, why are we doing these things? As you can see, here we have a comment saying: this script creates a new data warehouse after checking whether it already exists; if the database exists, it drops it and recreates it; additionally, it creates three schemas: bronze, silver, gold. That gives clarity about what the script does and makes everyone's life easier. The second reason this is very important is that you can add warnings, and especially for this script it is very important to add these notes, because if you run this script, it is going to destroy the whole database. Imagine someone, maybe an admin, opens the script and runs it on your database: everything is going to be destroyed and all the data will be lost, which is going to be a disaster if you don't have any backup. So with that we have a nice header comment, we have added a few comments in our code, and we are ready to commit. Let's commit it, and now our script is in git as well; of course, if you make any modifications, make sure to update the changes in git. Okay my friends, with that we have an empty database and schemas.
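A sketch of what such an init script could look like, with the header comment, the warning, and the existence check. The SINGLE_USER step is my own assumption (it force-closes open connections so the drop cannot be blocked), not something stated in the course:

```sql
/*
===============================================================
Create Database and Schemas
===============================================================
Purpose:
    Creates the 'DataWarehouse' database after checking whether
    it already exists. If it exists, it is dropped and recreated,
    and three schemas are created: bronze, silver, gold.

WARNING:
    Running this script DROPS the entire 'DataWarehouse' database.
    All data in it will be permanently deleted. Make sure you
    have backups before running it.
===============================================================
*/
USE master;
GO

-- Drop and recreate the database if it already exists
IF EXISTS (SELECT 1 FROM sys.databases WHERE name = 'DataWarehouse')
BEGIN
    ALTER DATABASE DataWarehouse SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE DataWarehouse;
END;
GO

CREATE DATABASE DataWarehouse;
GO
```

The header answers "what does this script do?" before anyone has to read the code, and the warning block is exactly the kind of note that protects you from an accidental run in the wrong environment.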
We are done with this task, and with the whole epic as well: we have completed the project initialization. Now we get to the interesting part: we will build the bronze layer, and the first task is to analyze the source systems, so let's go. All right, so the big question is how to build the bronze layer. First things first, we do the analysis: as with developing anything, you don't immediately start writing code. Before we start coding the bronze layer, we have to understand the source system, so what I usually do is interview the source system experts and ask them many, many questions in order to understand the nature of the source system that I'm connecting to the data warehouse. Once you know the source systems, we can start coding, and the main focus here is the data ingestion: we have to find a way to load the data from the source into the data warehouse. It's like building a bridge between the source and our target system, the data warehouse. Once the code is ready, the next step is data validation, and here comes the quality control. It is very important in the bronze layer to check data completeness, which means comparing the number of records between the source system and the bronze layer, just to make sure we are not losing any data in between. Another check we will be doing is the schema check, to make sure the data lands in the right position. And finally, we must not forget documentation and committing our work to git. This is the process we're going to follow to build the bronze layer. All right my friends, before connecting any source system to our data warehouse, we have a very important step: understanding the sources. How I usually do it: I set up a meeting with the source system experts in order to interview them and ask them a lot of things about
the source. Gaining this knowledge is very important, because asking the right questions will help you design the correct scripts to extract the data and avoid a lot of mistakes and challenges. Now I'm going to show you the most common questions I usually ask before connecting anything. We start by understanding the business context and the ownership: I want to understand the story behind the data, who is responsible for it, which IT department, and so on. Then it's good to understand what business process it supports: customer transactions, supply chain logistics, or maybe finance reporting. With that you understand the importance of your data. Then I ask about the system and data documentation; documentation from the source is your learning material about the data, and it's going to save you a lot of time later when you are working on and designing new data models. I also always want to understand the data model of the source system, and if they have descriptions of the columns and tables, it's nice to have the data catalog; this helps me a lot in the data warehouse when deciding how to join the tables together. With that you get a solid foundation in the business context, the processes, and the ownership of the data. In the next step we start talking about the technical side: I want to understand the architecture and the technology stack. The first question I usually ask is how the source system stores the data: is it on-prem, like SQL Server or Oracle, or in the cloud, like Azure or AWS? Once we understand that, we can discuss the integration capabilities: how am I going to get the data? Does the source system offer APIs, maybe Kafka, or do they only have file extractions, or are they going
to give you a direct connection to the database? Once you understand the technology you'll use to extract the data, we dive into more technical questions, figuring out how to extract the data from the source system and load it into the data warehouse. The first thing to discuss with the experts: can we do an incremental load or a full load? After that we discuss the data scope and historization: do we need all the data, or maybe only ten years of it? Is history already kept in the source system, or should we build it in the data warehouse? Then we discuss the expected size of the extracts: are we talking about megabytes, gigabytes, terabytes? This is very important for understanding whether we have the right tools and platform to connect the source system. Then I try to find out whether there are any data volume limitations: older source systems might struggle a lot with performance, so if you have an ETL that extracts a large amount of data, you might bring the source system's performance down. That's why you have to understand whether there are any limitations on your extracts, and any other aspects that might impact the performance of the source system. This is very important if they give you access to the database: you are responsible for not bringing the database's performance down. And of course, a very important question is about authentication and authorization: how are you going to access the data in the source system? Do you need any tokens, keys, passwords, and so on? Those are the questions you have to ask when connecting a new source system to the data warehouse, and once you have the answers, you can proceed with the next steps to connect the sources to the data warehouse.
All right my friends, with that you have learned how to analyze a new source system that you want to connect to your data warehouse. This step is done, and now we go back to coding, where we write scripts to do the data ingestion from the CSV files into the bronze layer. Let's have a quick look again at our bronze layer specifications: we just have to load the data from the sources into the data warehouse; we build tables in the bronze layer; we do a full load, which means truncating and then inserting the data; there will be no data transformations at all in the bronze layer; and we will not be creating any data model. Those are the specifications of the bronze layer. Now, in order to create the DDL script for the bronze layer, creating the bronze tables, we have to understand the metadata, the structure, the schema of the incoming data. Here you either ask the technical experts from the source system for this information, or you explore the incoming data and derive the structure of your tables from it. We start with the first source system, the CRM. Let's go inside it and start with the first table, the customer info. If you open the file and check the data inside, you see we have a header row, and that is very good, because now we have the names of the columns coming from the source, and from the content you can of course derive the data types. So let's do that: first we say CREATE TABLE, then we define the layer, which is bronze, and now, very importantly, we follow the naming convention: we start with the name of the source system, crm, then an underscore, and after that the table name from the source system, cust_info. So bronze.crm_cust_info is the name of our first table in the bronze layer. The next step is to define, of
course, the columns, and again the column names in the bronze layer are one-to-one, exactly like the source system. The first one is the ID, for which I'll use the data type INT; the next one is the key, NVARCHAR with a length of 50; and the last one is the create date, with the data type DATE. With that we have covered all the columns available from the source system; let's check, and yes, the last one is the create date. That's it for the first table, with a semicolon at the end, of course. Let's execute it, then go to the Object Explorer over here, refresh, and we can see the first table inside our data warehouse. Amazing, right? Next, you have to create a DDL statement for each file from those two systems: for the CRM we need three DDLs, and for the other system, the ERP, we also have to create three DDLs for the three files. In the end we're going to have six tables and six DDLs in the bronze layer. So now pause the video and go create those DDLs; I will be doing the same, and we will see you soon. All right, I hope you have created all those DDLs; I'm going to show you what I have just created. The second table in the source CRM is the product information, and the third one is the sales details. Then we go to the second system, and here we make sure we follow the naming convention: first the source system, erp, and then the table name. The second system was really easy: you can see we have only two columns here, only three for the customers, and only four for the categories. All right, after defining all of those, we of course have to execute them, so let's do that, then go to the Object Explorer over here and refresh the tables: you can see we have six empty tables in the bronze layer, all the tables from the two source systems inside our database.
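A sketch of what the first bronze DDL could look like. The exact column names are illustrative guesses; in practice they must be copied one-to-one from the header row of your CSV file, as the transcript says:

```sql
-- Bronze DDL: source system prefix (crm_) + source entity name,
-- columns mirror the CSV header one-to-one, no transformations.
CREATE TABLE bronze.crm_cust_info (
    cst_id          INT,          -- customer id from the CRM export
    cst_key         NVARCHAR(50), -- business key from the source
    cst_create_date DATE          -- record creation date in the source
);
```

The data types come from inspecting the file contents: numeric-looking columns become INT, free text becomes NVARCHAR, and date-formatted values become DATE.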
We still don't have any data, though. And you can see our naming convention is really nice: the first three tables come from the CRM source system and the other three from the ERP, so in the bronze layer things are split nicely and you can quickly identify which table belongs to which source system. Now, there is something else I usually add to the DDL script: checking whether the table exists before creating it. For example, say you are renaming a column or want to change the data type of a specific field; if you just run this script again, you will get an error, because the database will say the table already exists. In other databases you can say CREATE OR REPLACE TABLE, but in SQL Server you have to build T-SQL logic for it. It is very simple: first we check whether the object exists in the database, so we say IF OBJECT_ID, and then we specify the table name. Let's copy the whole name over here and make sure it is exactly the same as the table name (there is a space here, so I'm just going to remove it). Then we define the object type, which is going to be 'U'; it stands for user-defined table. If this is not null, it means the database found the object, so we say: drop the table, the whole name again, and a semicolon. So again: if the table exists in the database (the result is not null), drop the table, and after that create it. Now, if you highlight the whole thing and execute it, it works: first drop the table if it exists, then create the table from scratch. What you have to do now is add this check before creating any table in our database; it's the same for the next table, and so on. I went and added those checks for each table, and if I execute the whole thing, it is going to work.
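The drop-and-recreate check described above might look like this (same hypothetical column names as before):

```sql
-- SQL Server has no CREATE OR REPLACE TABLE, so we check first.
-- 'U' means user-defined table.
IF OBJECT_ID('bronze.crm_cust_info', 'U') IS NOT NULL
    DROP TABLE bronze.crm_cust_info;
GO

CREATE TABLE bronze.crm_cust_info (
    cst_id          INT,
    cst_key         NVARCHAR(50),
    cst_create_date DATE
);
GO
```

As an aside: on SQL Server 2016 and later you can also write `DROP TABLE IF EXISTS bronze.crm_cust_info;`, which is shorthand for the same check, but the OBJECT_ID pattern works on older versions too.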
So with that I'm recreating all the tables in the bronze layer from scratch. Now, the method we're going to use to load the data from the source into the data warehouse is the bulk insert. BULK INSERT is a method for loading massive amounts of data very quickly from files, like CSV or text files, directly into a database. It is not like the classic normal insert, where the data is inserted row by row; instead, the bulk insert is one operation that loads all the data into the database in one go, and that's what makes it very fast. So let's use this method. Okay, let's start writing the script to load the first table in the source CRM: we're going to load the table customer info from the CSV file into the database table. The syntax is very simple: we start by saying BULK INSERT, so SQL understands we are not doing a normal insert but a bulk insert, and then we specify the table name, bronze.crm_cust_info. Now we have to specify the full location of the file we are trying to load into this table, so we go and get the path where the file is stored. I'm going to copy the whole path and add it to the bulk insert, exactly where the data is; for me it is in the SQL data warehouse project folder, under datasets, in the source CRM folder, and then I specify the file name, cust_info.csv. You have to get the path of your file exactly right, otherwise it will not work. After the path we come to the WITH clause: we have to tell SQL Server how to handle our file, and here come the specifications; there is a lot we can define. Let's start with a very important one, the header row. If you check the content of our files, you can see the first row always contains the header information of the file. That information is not actual data, just the column names; the actual data starts from the second row, and we have to tell the database about this. So we say the first row is actually the second row, telling SQL to skip the first row in the file; we don't need to load that information, because we have already defined the structure of our table. That's the first specification. The next one, also very important when loading any CSV file, is the separator between fields, the delimiter. It really depends on the file structure you are getting from the source; as you can see, all these values are separated by a comma, and we call this comma the field separator or delimiter. I have seen a lot of different CSVs: sometimes they use a semicolon, a pipe, or a special character like a hash, so you have to understand how the values are separated. In this file it's a comma, and we have to tell SQL about this, it's very important: we say FIELDTERMINATOR, and then that it is the comma. Basically, those two pieces of information are essential for SQL to be able to read your CSV file. There are many other options you can add, for example TABLOCK: it is an option to improve performance by locking the entire table during the load, so as SQL loads the data into this table, it locks the whole table. That's it for now.
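Putting these options together with the bronze full-load spec from earlier (truncate, then insert), the load script for one table might look like the sketch below. The file path is a placeholder: adjust it to wherever your datasets folder lives:

```sql
-- Full load of one bronze table: empty it, then bulk-load the CSV
-- in a single operation. Re-running this refreshes the table
-- without creating duplicates.
TRUNCATE TABLE bronze.crm_cust_info;

BULK INSERT bronze.crm_cust_info
FROM 'C:\path\to\datasets\source_crm\cust_info.csv'
WITH (
    FIRSTROW = 2,          -- skip the header row
    FIELDTERMINATOR = ',', -- comma-delimited fields
    TABLOCK                -- lock the table for faster loading
);

-- Completeness check: this count should equal the file's row
-- count minus one (the skipped header row).
SELECT COUNT(*) AS row_count
FROM bronze.crm_cust_info;
```

FIRSTROW and FIELDTERMINATOR are the two settings the transcript calls essential; without them SQL Server would try to load the header as data and would not know where one field ends and the next begins.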
to go and add the semicolon, and then let's insert the data from the file into our bronze table. Let's execute it — and now you can see SQL inserted around 18,000 rows into our table. It is working; we just loaded the file into our database. But it is not enough to just write the script: you have to test the quality of your bronze table, especially when you are working with files. Let's do a simple SELECT from our new table and run it. The first thing I check is: do we have data in each column? Yes, as you can see, we have data. The second thing: do we have the data in the correct columns? This is critical when you are loading data from a file into a database. For example, here we have the first name, which of course makes sense, and here we have the last name. But what could happen — and this mistake happens a lot — is that you find the first-name information inside the key, the last name inside the first-name column, and the status inside the last-name column. The data gets shifted, and this data engineering mistake is very common when working with CSV files. There are different reasons why it happens: maybe the definition of your table is wrong, maybe the field separator is wrong (it's not a comma, it's something else), or the separator is a bad choice because sometimes the keys or the first names themselves contain a comma, so SQL cannot split the data correctly — the quality of the CSV file is simply not good. There are many reasons why you might not get the data into the correct columns, but for now everything looks fine for us. The next step is to count the rows in this table, so let's select that: we can see we have 18,490. Now we can open our CSV file and check how many rows it contains. We are almost there — there is
like one extra row inside the file, and that's because of the header: the header row is not loaded into our table, which is why our tables will always have one row less than the original files. So everything looks good, and we have done this step correctly. Now, if I run it again, what happens? We get duplicates inside the bronze layer — we have loaded the file twice into the same table, which is not correct. The method we discussed is to first make the table empty and then load: truncate, then insert. To do that, before the BULK INSERT we say TRUNCATE TABLE, then our table name, and that's it, with a semicolon. So now we first empty the table and then load from scratch — the whole content of the file goes into the table, and this is what we call a full load. Let's mark everything together and execute, and if you check the content of the table again, you can see we have only the 18,000 rows. Run the count on the bronze layer again: still 18,000. Each time you run this script now, we are refreshing the table customer info from the file into the database table — we are refreshing the bronze layer table. That means if there are any changes in the file, they will be loaded into the table. This is how you do a full load in the bronze layer: truncate the table, then do the inserts. Now, of course, what you have to do is pause the video and write the same script for all six files, so let's do that. Okay, back — I hope you have written all those scripts as well. I have three sections to load the first source system and three sections to load the second source system. As you write those scripts, make sure to use the correct paths.
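Putting the options above together, the full-load pattern for one bronze table looks roughly like this sketch (the table name, schema, and file path are illustrative — substitute your own):

```sql
-- Full load pattern for one bronze table: empty the table, then re-insert
-- everything from the source CSV file.
TRUNCATE TABLE bronze.crm_cust_info;

BULK INSERT bronze.crm_cust_info
FROM 'C:\datasets\source_crm\cust_info.csv'  -- exact path to the source file
WITH (
    FIRSTROW = 2,          -- first row holds column names, so skip it
    FIELDTERMINATOR = ',', -- values in this file are separated by commas
    TABLOCK                -- lock the whole table while loading, for speed
);

-- Quick quality checks after loading:
SELECT TOP 1000 * FROM bronze.crm_cust_info;  -- is the data in the right columns?
SELECT COUNT(*) FROM bronze.crm_cust_info;    -- does the count match the file?
```

Running the script repeatedly is now safe: the TRUNCATE resets the table before every load, so no duplicates accumulate.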
Go and change the path to the other folder for the second source system, and don't forget that the table name in the bronze layer is different from the file name, because we always start with the source system name, which the files don't have. Now I think everything is ready, so let's execute the whole thing. Perfect, awesome — everything is working. Let me check the messages: we can see from the output how many rows were inserted into each table, and of course the task now is to go through each table and check the content. So we have a really nice script to load the bronze layer, and we will use it on a daily basis — every day we have to run it to get new content into the data warehouse. As you learned before, if you have a SQL script that is used frequently, we can create a stored procedure from it, so let's do that. It's going to be very simple: we say CREATE OR ALTER PROCEDURE, and now we have to define the name of the stored procedure. I'm going to put it in the schema bronze, because it belongs to the bronze layer, and then follow the naming convention: the stored procedure starts with load_ and then the layer name, bronze. That's it for the name. Then, very importantly, we have to define the BEGIN and the END of our SQL statements: here is the beginning, and at the end we say END. Let's highlight everything in between and give it one push with Tab so it is easier to read. Next, let's execute it to create this stored procedure. If you want to check your stored procedure, go to the database, then to the folder called Programmability, and inside it, Stored Procedures; refresh and you will see our new stored procedure. Let's test it: I'm going to open a new query, and what we're going to do
is write EXECUTE bronze.load_bronze. Let's execute it, and with that we have loaded the complete bronze layer: as you can see, SQL inserted all the data from the files into the bronze layer. It is way easier than running those scripts by hand each time. All right, the next step: as you can see, the output message doesn't really carry much information. The messages of your ETL stored procedure will not be very clear by default, and that's why, if you are writing an ETL script, you should always take care of the messaging in your code. Let me show you a nice design; let's go back to our stored procedure. What we can do is divide the messages up based on our code. We can start with a message over here: PRINT what this stored procedure is doing — we are loading the bronze layer. This is the main message, the most important one, and we can play with separators: PRINT a line of equals signs at the start and at the end, just to create a section. This is just a nice message at the start. Now, looking at our code, we can see it is split into two sections: in the first section we are loading all the tables from the source system CRM, and in the second section the tables from the ERP. So we can split the prints by source system. Let's do that: PRINT 'Loading CRM Tables' for the first section, then add some nice separators — let's take the minus sign — and don't forget a semicolon for each PRINT. Same thing over here: I will copy the whole thing, because we are going to have it at the start and at the end. Let's copy it for the second section; for the ERP it starts over here, and we're going
to call it 'Loading ERP Tables'. With that, in the output we can see a nice separation between loading each source system. Now for the next step, where we add a PRINT for each action. For example, here we are truncating the table, so we PRINT two arrows and say what we are doing — we are truncating the table — and then we add the table name to the message. That's the first action, and we can add another PRINT for inserting the data: 'Inserting Data Into', followed by the table name. With that, in the output we can understand what SQL is doing. Let's repeat this for all the other tables. Okay, I just added all those prints — don't forget the semicolon at the end — so let's execute and check the output. And maybe at the start, just to get a quick output, let's execute our stored procedure like this. Now if you check the output, you can see things are more organized than before: at the start we read that we are loading the bronze layer, then first we load the source system CRM, then the second section for the ERP, and we can see the actions — truncating, inserting, truncating, inserting for each table — and the same for the second source. As you can see, it is nice and cosmetic, but it is very important when you are debugging any errors. And speaking of errors, we have to handle errors in our stored procedure, so let's do that. The first thing we do is say BEGIN TRY, then we go to the end of our script and, before the last END, we say END TRY. Then we have to add the catch: BEGIN CATCH and END CATCH. First, let's organize our code: I'll take the whole code and give it one more push, and the BEGIN TRY as well, so it is more organized. As you know, the TRY...CATCH is going to go
and execute the TRY block, and if there are any errors while executing that script, the second section gets executed — the CATCH runs only if SQL failed to run the TRY. So now we have to define for SQL what to do if there is an error in your code. Here we can do multiple things, like creating a logging table and adding the messages to it, or adding some nice messaging to the output. For example, we can add a section again — some equals signs here, repeated there, with some content in between. We can start with something like 'Error occurred during loading bronze layer', and then we can add many things: for example the error message, by calling the function ERROR_MESSAGE(), and also, for example, the error number, ERROR_NUMBER(). Of course, the output of that is a number, but the error message is text, so we have to change the data type — we do a CAST AS NVARCHAR. And there are many other functions you can add to the output, like ERROR_STATE() and so on; you can design what happens when there is an error in the ETL. Now, something else that is very important in every ETL process is to add the duration of each step. For example, I would like to understand how long it takes to load this table over here, but looking at the output I have no information about how long my tables take to load. This matters because, as you build a big data warehouse, the ETL process is going to take a long time, and you want to understand where the issue is, where the bottleneck is, which table is consuming a lot of time to load. That's why we have to add this information to the output as well, or even log it to a table. So let's add this step too.
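Put together, the stored procedure built so far — section messages, action messages, and error handling — has roughly this shape. Abridged to a single table; the names follow the conventions used above, but treat the details as a sketch rather than the exact course code:

```sql
CREATE OR ALTER PROCEDURE bronze.load_bronze AS
BEGIN
    BEGIN TRY
        PRINT '==============================================';
        PRINT 'Loading Bronze Layer';
        PRINT '==============================================';

        PRINT '----------------------------------------------';
        PRINT 'Loading CRM Tables';
        PRINT '----------------------------------------------';

        PRINT '>> Truncating Table: bronze.crm_cust_info';
        TRUNCATE TABLE bronze.crm_cust_info;

        PRINT '>> Inserting Data Into: bronze.crm_cust_info';
        BULK INSERT bronze.crm_cust_info
        FROM 'C:\datasets\source_crm\cust_info.csv'
        WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', TABLOCK);

        -- ...repeat the truncate/insert pair for the other five tables...
    END TRY
    BEGIN CATCH
        -- Runs only when something inside the TRY block fails.
        PRINT '==============================================';
        PRINT 'ERROR OCCURRED DURING LOADING BRONZE LAYER';
        PRINT 'Error Message: ' + ERROR_MESSAGE();
        PRINT 'Error Number : ' + CAST(ERROR_NUMBER() AS NVARCHAR);
        PRINT '==============================================';
    END CATCH
END
```

ERROR_NUMBER() returns an integer, which is why it is cast to NVARCHAR before being concatenated into the PRINT message.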
First, we go to the start of the script. To calculate the duration, you need the starting time and the end time — we have to know when we started loading and when we finished loading the table. The first thing is to declare the variables: we say DECLARE, then one called @start_time with the data type DATETIME, because I need the exact second when it started, and then another one for the end, @end_time, also DATETIME. With that we have declared the variables, and the next step is to use them. Let's go to the first table, the customer info, and at the start we say SET @start_time = GETDATE(), so we get the exact time when we start loading this table. Then let's copy that and go to the end of the loading: we say SET @end_time = GETDATE() as well. So now we have the values of when we started loading this table and when we completed it, and the next step is to print the duration. Over here we can say PRINT, and use the same design again — two arrows — and say, very simply, 'Load Duration', then a colon and a space. Now we have to calculate the duration, and we can do that with the date and time function DATEDIFF, which finds the interval between two dates. We say plus over here, then use DATEDIFF with three arguments: the first is the unit — you can define second, minute, hour, and so on, and we'll go with second — then the start of the interval, the @start_time, and the last argument is the end of the boundary, the @end_time. Of course, the output of this is a number, which is why we have to cast it: we say CAST
it AS NVARCHAR, then close it like this, and maybe at the end we say plus a space and 'seconds', to have a nice message. So again, what have we done? We declared the two variables; at the start we get the current date and time, and at the end of loading the table we get the current date and time again; then we take the difference between them to get the load duration, and in this case we simply print this information. Now we can of course add a nice separator between each table — I'll do it like this, just a few minus signs, nothing fancy. What we have to do next is add this mechanism for each table, in order to measure the speed of the ETL for each one of them. Okay, I have now added all those configurations for each table, so let's run the whole thing: let's alter the stored procedure and then execute it. As you can see, we now have one more piece of info, the load duration — and everywhere I can see zero seconds. That's because loading this information is super fast: we are doing everything locally on one PC, so loading data from files to the database is mega fast. In real projects, of course, you have different servers with networking between them and millions of rows in the tables, so the duration won't be zero seconds — things will be slower — and then you can easily see how long it takes to load each of your tables. What is also very interesting is to understand how long it takes to load the whole bronze layer, so now your task is to also print, at the end, information about the whole batch: how long it took to load the bronze layer. Okay, I hope you are done. I did it like this: we define two new variables, the batch start time and the batch end time.
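The per-table duration pattern described above might be sketched like this (the surrounding truncate/insert is elided; the variable names are illustrative):

```sql
-- Inside the stored procedure: measure how long one table takes to load.
DECLARE @start_time DATETIME, @end_time DATETIME;

SET @start_time = GETDATE();          -- exact moment loading starts

-- ...TRUNCATE TABLE + BULK INSERT for this table go here...

SET @end_time = GETDATE();            -- exact moment loading ends

-- DATEDIFF returns a number, so it is cast to text for the message.
PRINT '>> Load Duration: '
    + CAST(DATEDIFF(second, @start_time, @end_time) AS NVARCHAR)
    + ' seconds';
```

The same pattern with a second pair of variables wrapped around the whole procedure gives the total batch duration.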
The first step in the stored procedure is to capture the date and time into the first variable, and the very last thing we do in the stored procedure is capture the date and time for the batch end time: we say SET with GETDATE() again. Then all you have to do is print a message — we say loading the bronze layer is completed — and print the total load duration, again with a date difference between the batch start time and the batch end time, calculating the seconds and so on. Now let's execute the whole thing: refresh the definition of the stored procedure and then execute it. In the output we go to the last message, and we can see 'Loading bronze layer is completed', and the total load duration is also 0 seconds, because the execution time is under one second. With that you are getting a feeling for how to build an ETL process: as you can see, data engineering is not only about how to load the data, it's about how to engineer the whole pipeline — how to measure the speed of loading, what happens if there is an error, printing each step of your ETL process, and making everything organized and clear in the output (and maybe in logging), just to make debugging and performance optimization way easier. And there is a lot more we could add — quality measures and so on — many things to add to our ETL scripts to make our data warehouse professional. All right my friends, with that we have developed the code to load the bronze layer and tested it as well. In the next step we go back to the drawing tool, because we want to draw a diagram about the data flow. So let's go. Now, what is a data flow diagram? We are going to draw a simple visual to map the flow of your data: where it comes from and where it ends up. We just want to
make clear how the data flows through the different layers of your project, and that helps us create something called data lineage. This is really nice, especially when you are analyzing an issue: if you have multiple layers and no real data lineage or flow, it's going to be really hard to dig through the scripts to understand the origin of the data, and having this diagram improves the process of finding issues. So let's create one. Okay, back in the drawing tool, we are going to build the flow diagram. We start with the source systems: let's build the layer. I'll remove the fill, make it dotted, and then add a box saying 'Sources'; we put it over here, increase the size to 24, again without any lines. Now, what do we have inside the sources? We have folders and files, so let's search for a folder icon — I'll take this one and label it 'CRM' (we can increase the size as well) — and we have another source, the ERP. Okay, that's the first layer. Now let's add the bronze layer: we grab another box, set the coloring like this, and instead of auto maybe take the hatch style — something like this, whatever you like — rounded, and then we put a title on top of it: 'Bronze Layer', and increase the font size as well. Next, we add boxes for each table we have in the bronze layer. For example, we have the sales details — we can make it a little smaller, maybe 16 and not bold — and we have two other tables from the CRM, the customer info and the product info. Those are the three tables that come from the CRM. Now we connect the source CRM with all three tables: what we're going to do is go to
the folder and start drawing arrows from the folder to the bronze layer, like this, and then we do the same thing for the ERP source. As you can see, the data flow diagram shows us the data lineage between the two layers in one picture: we can easily see that those three tables come from the CRM, and those three tables in the bronze layer come from the ERP. I understand that with a lot of tables it would become a huge mess, but for a small or medium data warehouse, building these diagrams makes it much easier to understand how everything flows from the sources into the different layers of your data warehouse. All right, with that we have the first version of the data flow, so this step is done, and the final step is to commit our code to the Git repo. Okay, let's commit our work. Since these are scripts, we go to the folder scripts; here we are going to have scripts for bronze, silver, and gold, so it makes sense to create a folder for each layer. Let's start by creating the bronze folder: I create a new file, type bronze slash, and then we can have the DDL script of the bronze layer, .sql. Now I paste the DDL code we created — those six tables — and, as usual, at the start we have a comment explaining the purpose of the script: we say this script creates tables in the bronze schema, and by running it you are redefining the DDL structure of the bronze tables. Let's have it like that, and I'll commit the changes. All right, as you can see, inside the scripts we have a folder called bronze, and inside it the DDL script for the bronze layer. In the bronze folder we will also put our stored procedure: we create a new file, let's call it proc_load_bronze.
sql, and then let's paste our script. As usual, I have put an explanation of the stored procedure at the start: we say this stored procedure loads data from the CSV files into the bronze schema — it first truncates the tables and then does a bulk insert — and, about the parameters, this stored procedure does not accept any parameters or return any values; and here is a quick example of how to execute it. All right, I'm happy with that, so let's commit it. All right my friends, with that we have committed our code to Git, and we are done building the bronze layer — the whole thing is done. Now we move to the next layer, which is going to be more advanced than the bronze layer, because there will be a lot of struggle with cleaning the data and so on. We start with the first task, where we analyze and explore the data in the source systems. So let's go. Okay, now we start with the big question: how do we build the silver layer — what is the process? As usual, first things first, we have to analyze. The task, before building anything in the silver layer, is to explore the data in order to understand the content of our sources. Once we have that, we start coding, and the transformation we are doing here is data cleansing. This is usually a process that takes a really long time, and I usually do it in three steps: the first step is to check the data quality issues we have in the bronze layer — before writing any data transformations, we first have to understand what the issues are. Only then do I start writing data transformations to fix all those quality issues from the bronze. And the last step, once I have clean results, is to insert them into the silver layer. Those are the three phases we will go through as we write the code for the silver layer. And the
third step: once we have all the data in the silver layer, we have to make sure the data is now correct and we don't have any quality issues anymore. If you find any issues, of course, you go back to coding, do the data cleansing, and check again — it is a cycle between validating and coding. Once the quality of the silver layer is good, we cannot skip the last phase, where we document and commit our work in Git. Here we are going to build two new pieces of documentation: the data flow diagram and the data integration diagram, after we have understood the relationships between the sources in the first step. This is the process, and this is how we will build the silver layer. All right, so now: exploring the data in the bronze layer. Why is it so important? Because understanding the data is the key to making smart decisions in the silver layer. In the bronze layer the focus was not on understanding the content of the data at all — we focused only on getting the data into the data warehouse. That's why we now have to take a moment to explore and understand the tables, as well as how to connect them: what are the relationships between these tables? And as you are learning about a new source system, it is very important to create some kind of documentation. So let's go and explore the sources, one by one. We can start with the first one from the CRM, the customer info: right-click on it and say 'Select Top 1000 Rows'. This is important if you have a lot of data — don't go and explore millions of rows; always limit your queries. Here we are using the TOP 1000 just to make sure you are not impacting the system with your queries. Now let's look at the content of this table: we can see customer information — we have an ID, a key for the customer, a first
name, a last name, marital status, gender, and the creation date of the customer. Simply put, this is a table of customer information, with a lot of details about the customers. We have two identifiers: one is a technical ID, and the other is more like the customer number, so maybe we can use either the ID or the key to join it with other tables. Now, what I usually do is draw a data model — or let's say an integration model — just to document and visualize what I'm learning, because if you don't, you will forget it after a while. So we search for a shape — let's search for 'table' — and I'll pick this one. We can change the style, for example make it rounded or sketchy and so on, and change the color; I'll make it blue. Then go to the text, select the whole thing, and make it bigger, 26; and for those items I'll select them, go to Arrange, and maybe make it 40, something like this. Now we just put in the table name — this is the one we are learning about — and I'll add the primary key; I won't list all the columns. The primary key was the ID, so I'll remove all the other stuff — I don't need it. Now, the table name is not really friendly, so I'll bring in a text label, put it on top, and say this is the customer information, just to make it friendly and not forget about it, and increase the size to maybe 20, something like this. Okay, with that we have our first table, and we keep exploring. Let's move to the second one, the product information: right-click on it and select the top 1000 rows — I'll just put it below the previous query and run it. Looking at this table, we can see we have
product information: we have a primary key for the product, then a key — or let's say a product number — and after that the full name of the product, the product cost, then the product line, and then a start and an end date. Well, this is interesting: why do we have a start and an end? Look at these three rows, for example: all three have the same key, but different IDs. So it is the same product, but with different costs: for 2011 we have a cost of 12, for 2012 we have 14, and for the last year, 2013, we have 13. It's like a history of the changes — this table holds not only the current information about the product but also historical information, and that's why we have those two dates, start and end. Now let's go back and draw this information: I'll just duplicate the previous shape. The name of this table is going to be the PRD info, and let's give it a short description — 'current and historical product information', something like that — just so we don't forget that we have history in this table. Here we also have the PRD ID, but there is nothing we can use to join these two tables: we don't have a customer ID here, and in the other table we don't have any product ID. Okay, that's it for this table; let's jump to the third and last table in the CRM. Let's select — I made the other queries short as well — and execute. What do we have here? A lot of information about orders and sales, and a lot of measures: the order number, the product key — this is something we can use to join it with the product table — and the customer ID (we don't have the customer key, so here we have an ID and there we have a key: two different ways to join the tables). Then we have dates — the order date, the shipping date, the due date — and then the sales amount,
the quantity, and the price. This is like an event table — a transactional table about orders and sales — and it is a great table for connecting the customers with the products and the orders. Let's document this new information: the table name is the sales details, and we can describe it as 'transactional records about sales and orders'. Now we have to describe how to connect this table to the other two: we are not using the product ID, we are using the product key, and we need a new column over here — you can hold Ctrl+Enter, or go over here and add a new row — and the other row is going to be the customer ID. For the customer ID it is easy: we can grab an arrow to connect those two tables. But for the product key we are not using the ID, so I'll just remove that one and say product key — let's double-check: this is a product key, not a product ID, and if we check the products info table, you can see we are using this key and not the primary key. So we just link it like this, and maybe switch those two tables — I'll put the customer below. Perfect, it looks nice. Okay, let's keep moving, now to the other source system, the ERP. The first one is the ERP customer table, with its cryptic name; let's select the data. It's a small table with only three columns: we have something called CID, then what I think is the birthday, and the gender information — male, female, and so on. So it looks like customer information again, but with extra data about the birthday. Now if you compare it to the customer table from the other source system — let's query it — you can see the new table from the ERP doesn't have IDs; it actually has the customer number, the key.
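The relationships described so far can be checked directly in SQL. A hedged sketch — the table and column names follow the source-system naming convention used in this project, but treat them as illustrative:

```sql
-- Sales details link to customers via the ID and to products via the key.
SELECT TOP 1000
    s.sls_ord_num,      -- order number from the sales table
    c.cst_firstname,    -- customer attribute, joined via the ID
    p.prd_nm            -- product attribute, joined via the key
FROM bronze.crm_sales_details AS s
LEFT JOIN bronze.crm_cust_info AS c
    ON s.sls_cust_id = c.cst_id
LEFT JOIN bronze.crm_prd_info AS p
    ON s.sls_prd_key = p.prd_key;
```

LEFT JOINs keep every sales row even when a match is missing, which also makes unmatched keys easy to spot during exploration.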
So we can join those two tables using the customer key. Let's document this information: I'll copy, paste, and put it on the right side, and change the color, since we are now talking about a different source system. The table name goes here, and the key is called CID. To join this table with the customer info we cannot use the customer ID — we need the customer key — so here we add a new row (Ctrl+Enter) and call it customer key, then draw a nice arrow between those two keys. We give it a description, 'customer information', and note that here we have the birth dates. Okay, let's keep going to the next one, the ERP location; let's query this table. What do we have here? The CID again, and as you can see, country information — this is, again, the customer number, and we have only this one attribute, the country. Let's document it: this is the customer location, the table name goes like this, and we still have the same ID — the customer ID — and we can join it using the customer key. We give it the description 'location of customers', and here we have the country. Now let's go to the last table and explore it: the ERP PX catalog. Let's query it. What do we have? An ID, a category, a subcategory, and maintenance — here we have either yes or no. Looking at this table, we have all the categories and subcategories of the products, and a special identifier for this information. The question now is how to join it: I would like to join it with the product information, so let's check those two tables together. Okay, in the products we don't have any ID for the categories — but this information actually lives in
The first five characters of the product key are the category ID, so we can use that to join with the categories. Let's describe this relationship in the diagram: give the table a name, and note that its ID joins via the product key. That means for the product information we don't need the product ID (the primary key) at all — all we need is the product key, or product number. Next I'd like to group these tables visually: grab a box on the left side, make it bigger, soften the corners, remove the fill, and use a dotted outline. Then add a label that says CRM — increase the font size to about 35, make it bold, change the color to blue — and place it on top of the box. Now we can see at a glance that all those tables belong to the CRM source system, and we do the same for the right side. We also add the remaining description: product categories. With that we have a clear understanding of how the tables connect to each other and what each one contains, which will help us clean up the data in the silver layer. As you can see, it is very important to take the time to understand the structure of the tables and the relationships between them before writing any code. We now understand the sources, and we have created a data integration diagram in draw.io showing how to connect them. In the next two tasks we'll go back to SQL, where we start checking data quality and doing a lot of data transformations. Let's go.
Now let's have a quick look at the specifications of the silver layer. The main objective is clean, standardized data: we have to prepare the data before it moves to the gold layer. We'll be building tables inside the silver layer, and the load from bronze to silver is a full load — truncate, then insert. This layer carries a lot of data transformations: cleansing, normalization, standardization, deriving new columns, and data enrichment. So there is plenty to do in the transformation step, but we will not be building any new data model. Those are the specifications, and we have to commit ourselves to that scope. Now, building the DDL script for this layer is much easier than for the bronze, because the definition and structure of each silver table is identical to its bronze counterpart — we aren't adding anything new. All you have to do is take the bronze DDL script and search-and-replace the schema. I'm using Notepad++ for the scripts, so I replace "bronze." with "silver." and hit Replace All; with that, the whole DDL targets the silver schema, which is exactly what we need. Before executing the new DDL, though, we have to talk about metadata columns. These are additional columns that data engineers add to each table — they don't come from the source systems, but they provide extra information about each record: a create date for when the record was loaded, an update date for when it was last changed, the source system so we know the origin of the data, or sometimes the file location so we understand the lineage — which file the data came from.
These metadata columns are a great tool when you have a data issue in your warehouse — corrupt data, say — because they help you track exactly where and when the issue happened, and they're great for spotting gaps in your data, especially with incremental loads. It's like putting labels on everything; you'll thank yourself later when you're debugging the warehouse in hard times. Back to our DDL script: for the first table I add one extra column at the end. It starts with the prefix dwh, as defined in our naming convention, then an underscore: dwh_create_date, with data type DATETIME2. We give it a default value, because I want the database to generate this information automatically — we shouldn't have to specify it in any ETL script. Which value? GETDATE(), so every record inserted into the table automatically receives the current date and time. Here the naming convention proves its worth: all the other columns come from the source system, and only this one comes from the data engineers of the warehouse. Let's repeat the same thing for all the other tables, adding this column to each DDL. Then execute the whole DDL script for the silver layer. Perfect — no errors. Refreshing the Object Explorer shows six silver tables, identical to the bronze layer but with one extra metadata column. Now, in the silver layer, before writing any transformations and cleansing, we first have to detect the quality issues.
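As a sketch, the DDL change just described might look like this (table and column names are illustrative, following the course's dwh_ prefix convention):

```sql
-- Illustrative silver-layer DDL with one metadata column at the end.
-- The DEFAULT lets SQL Server stamp each row at insert time,
-- so no ETL script has to supply the value.
CREATE TABLE silver.crm_cust_info (
    cst_id             INT,
    cst_key            NVARCHAR(50),
    cst_firstname      NVARCHAR(50),
    cst_lastname       NVARCHAR(50),
    cst_marital_status NVARCHAR(50),
    cst_gndr           NVARCHAR(50),
    cst_create_date    DATE,
    dwh_create_date    DATETIME2 DEFAULT GETDATE()  -- metadata: load timestamp
);
```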
Without knowing the issues in the bronze data, we cannot find a solution, right? We explore the quality issues first; only then do we write the transformation scripts. Let's go. We're going to walk through all the tables in the bronze layer, clean up the data, and insert it into the silver layer. Start with the first bronze table from the CRM source: bronze CRM customer info. Query the data. Before writing any transformations, we have to detect and identify this table's quality issues, and I usually start with the primary key: are there nulls, and are there duplicates? To detect duplicates, we aggregate on the primary key — if any value exists more than once, the key is not unique and we have duplicates. So the query is: select the customer ID and a count, GROUP BY the primary key, and since we only want the problems, add HAVING COUNT(*) > 1. Execute it, and we do have an issue: these IDs exist more than once, which is completely wrong — a primary key must be unique. We also see three records where the primary key is empty, which is just as bad. One subtlety: if there were only a single null, it would not appear in this result, so I also add a condition: OR the primary key IS NULL.
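The primary-key check described here might be written like this (a sketch — the column name is assumed from the course's CRM naming):

```sql
-- Quality check: duplicate or NULL primary keys.
-- Expectation: no rows returned.
SELECT
    cst_id,
    COUNT(*) AS cnt
FROM bronze.crm_cust_info
GROUP BY cst_id
HAVING COUNT(*) > 1 OR cst_id IS NULL;
```

The `OR cst_id IS NULL` clause is the subtlety from the walkthrough: a single NULL would have a count of one and otherwise slip past the HAVING filter.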
Even with only one null, I'm still interested in seeing it. Running the query again gives the same results. This is a quality check you can run on any table, and this one clearly does not meet the expectation, so we have to do something about it. Let's open a new query where we write the transformation and cleansing logic, starting again by selecting the data. What I usually do is focus on one instance of the issue before writing the transformation: take one of the duplicate values and filter on it, WHERE customer ID equals that value. Now you can see the ID exists three times, but we only want one of them. How do we pick? Usually we look for a timestamp or date column to help: checking the creation date, this record is the newest and the other two are older. If I have to pick one, I want the latest, because it holds the freshest information. So we rank the rows by create date and keep the highest — a ranking job, and for that SQL has the amazing window functions. We use ROW_NUMBER() OVER, then PARTITION BY customer ID to divide the table, and ORDER BY create date descending so the newest row ranks first. Let's do that and give the result a name.
We'll call it flag_last. Executing it, the data is sorted by creation date: this record ranks number one, the older one is two, and the oldest is three — and of course we want rank number one. Remove the filter and inspect everything: the flag is one almost everywhere, because most primary keys occur only once, but wherever there are duplicates you'll see twos and threes as well. We can double-check by wrapping this in a subquery: SELECT * FROM it WHERE flag_last is not equal to one. That shows exactly the rows we don't need — the ones causing duplicate primary keys, carrying stale values. So we flip the condition to equal one, which guarantees a unique primary key with each value appearing exactly once. Querying it this way, you'll find no duplicates, and we can verify: filter on that same ID again, and it now exists only once, with the freshest data. With that, we have a transformation that removes the duplicates. Moving on: this table has many string columns, and for strings we have to check for unwanted spaces. Let's write a detection query: select the first name from bronze customer info and run it. Just eyeballing the data, it is really hard to spot unwanted spaces, especially at the end of a word, but there is a very easy way to detect them.
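The deduplication pattern built up in these steps might be sketched as follows (column names assumed from the course's conventions):

```sql
-- Deduplicate on the primary key, keeping the newest record per customer.
-- ROW_NUMBER ranks rows within each cst_id by create date, newest first;
-- the outer filter keeps only rank 1.
SELECT *
FROM (
    SELECT
        *,
        ROW_NUMBER() OVER (
            PARTITION BY cst_id
            ORDER BY cst_create_date DESC
        ) AS flag_last
    FROM bronze.crm_cust_info
) t
WHERE flag_last = 1;
```

Flipping `WHERE flag_last = 1` to `!= 1` gives the double-check from the walkthrough: it lists exactly the stale duplicate rows being discarded.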
We use a filter: the first name is not equal to the first name after trimming. The TRIM function removes all leading and trailing spaces, so if a value differs from its trimmed self, we have an issue — very simple. Execute it, and the result lists every first name with spaces at the start or end; again, the expectation is no results. We check other columns the same way — the last name, for example: running that, we find customers with spaces in their last names too, which is not good. Keep checking every string column in the table. The gender, for instance, returns no results, meaning that column is cleaner and has no unwanted spaces. Now we write the transformation: I list all the columns in the query instead of using the star, then go to the two affected columns and remove the unwanted spaces with TRIM, aliasing each with its original name — first name and last name both. Query it, and those two columns are now free of unwanted spaces. Moving on, we have two more columns: the marital status and the gender. Looking at their values, you can see low cardinality — a limited number of possible values in those two columns.
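The unwanted-spaces check might look like this (a sketch; the column name follows the course's assumed naming, and the same pattern applies to any string column):

```sql
-- Quality check: leading/trailing spaces in a string column.
-- A value that differs from its trimmed self has unwanted spaces.
-- Expectation: no rows returned.
SELECT cst_firstname
FROM bronze.crm_cust_info
WHERE cst_firstname != TRIM(cst_firstname);
```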
What we usually do here is check data consistency, very simply with SELECT DISTINCT on the values. Doing that shows only three possible values — NULL, F, or M. We could leave it like this, but we can also make a project rule: no abbreviations, only friendly full names. Instead of F we use the full word Female, instead of M we use Male, and we apply the rule across the whole project: wherever gender information appears, we spell it out. So let's map the values: on the gender column write CASE WHEN the gender equals 'F' THEN 'Female', WHEN it equals 'M' THEN 'Male'. Then we have to decide about the NULLs: leave them as NULL, or always substitute a default value? Replacing missing values with a standard default is one option; let's say our project replaces all missing values with a default, so we add ELSE 'n/a' (not available) — 'Unknown' would work too. That handles the gender, and we remove the old column. One more thing I usually do: today we receive capital F and capital M, but over time the source could change and send lowercase letters. To make sure we still map those correctly, we wrap the column in the UPPER function.
That way, if any lowercase values arrive, we still catch them. And one further safeguard: since we saw unwanted spaces in the first and last names, you might not trust this column to stay clean either, so you can TRIM it as well to cover every case. Execute it: no more M and F, just the full words Male and Female, and where there was no value we get 'n/a' instead of NULL. We do the same for the marital status — again only three possibilities: S, NULL, and M. Copy the pattern, switch the column to marital status, and map the values: S becomes Single, M becomes Married, and NULL falls through to 'n/a'. With that we are standardizing this column too. Execute: no more short codes — full friendly values for the status as well as the gender, and the NULLs in both columns are handled. That finishes those two columns, leaving the last one: the create date. For this kind of column we make sure it is a real date, not a string or varchar — and as defined in the data type it is a DATE, which is correct, so there's nothing to do here. The next step is the insert statement: go to the top and write INSERT INTO silver.crm_cust_info, then specify all the columns to be inserted.
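The standardization described for these two columns might be sketched like this (column names assumed; UPPER and TRIM harden the mapping against future case or spacing issues):

```sql
-- Standardize coded values to friendly names; default missing values to 'n/a'.
SELECT
    CASE WHEN UPPER(TRIM(cst_gndr)) = 'F' THEN 'Female'
         WHEN UPPER(TRIM(cst_gndr)) = 'M' THEN 'Male'
         ELSE 'n/a'
    END AS cst_gndr,
    CASE WHEN UPPER(TRIM(cst_marital_status)) = 'S' THEN 'Single'
         WHEN UPPER(TRIM(cst_marital_status)) = 'M' THEN 'Married'
         ELSE 'n/a'
    END AS cst_marital_status
FROM bronze.crm_cust_info;
```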
We type the column list out, followed by the query, and execute — with that we have inserted clean data into the silver table. Now let's take all the queries we used to check the quality of the bronze layer, move them into another query window, and change bronze to silver. First the primary key check: execute — perfect, no results, so no duplicates. Next the first-name check against silver: run it, no results, no issues. The last name: again nothing. Then the low-cardinality columns, like the gender: executing shows only 'n/a', Male, and Female — perfect. Finally, take a last look at the silver customer info table itself: every column looks right, and the metadata column we added to the table definition is working — it records when we inserted all those records, which is great for tracking and auditing. Looking back at the script, we performed several types of data transformations. The first, on the first and last names, is trimming: removing unwanted spaces, a form of data cleansing in which we strip unnecessary spaces or characters to ensure consistency. The next is the CASE WHEN: data normalization, sometimes called data standardization — a type of data cleansing where we map coded values to meaningful, user-friendly descriptions.
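Putting the steps together, the silver load described here might look roughly like this — a sketch under the assumed CRM column names, not the course's verbatim script:

```sql
-- Cleansed load from bronze to silver: dedup, trim, standardize, insert.
INSERT INTO silver.crm_cust_info (
    cst_id, cst_key, cst_firstname, cst_lastname,
    cst_marital_status, cst_gndr, cst_create_date
)
SELECT
    cst_id,
    cst_key,
    TRIM(cst_firstname),
    TRIM(cst_lastname),
    CASE WHEN UPPER(TRIM(cst_marital_status)) = 'S' THEN 'Single'
         WHEN UPPER(TRIM(cst_marital_status)) = 'M' THEN 'Married'
         ELSE 'n/a' END,
    CASE WHEN UPPER(TRIM(cst_gndr)) = 'F' THEN 'Female'
         WHEN UPPER(TRIM(cst_gndr)) = 'M' THEN 'Male'
         ELSE 'n/a' END,
    cst_create_date
FROM (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY cst_id
               ORDER BY cst_create_date DESC
           ) AS flag_last
    FROM bronze.crm_cust_info
    WHERE cst_id IS NOT NULL
) t
WHERE flag_last = 1;
```

Note the dwh_create_date column is deliberately omitted from the column list: its DEFAULT fills it in automatically.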
We applied the same transformation to the gender. Within the same CASE WHEN we also handled missing values: instead of NULLs we now have 'n/a'. Handling missing data is another type of cleansing, where we fill the blanks with, say, a default value — so instead of an empty string or a NULL we store 'n/a' or 'Unknown'. The script also removes duplicates: again a type of data cleansing, ensuring one record per primary key by identifying and retaining only the most relevant row. And since we drop the duplicate rows, we are also doing data filtering. Those are the transformation types in this script. Moving on to the second bronze table from the CRM: the product info. As usual, before writing transformations we hunt for quality issues, starting with the primary key — group on the key and check for duplicates or NULLs. Execute it: everything is safe, no duplicates or nulls in the primary key. Next, the product key. This column packs several pieces of information together, so we'll split the string into two parts — deriving two new columns. The first is the category ID: the first five characters are the category ID, and we can use the SUBSTRING function to extract part of a string.
SUBSTRING takes three arguments: the column to extract from, the starting position — since this part sits at the left, we start at position one — and the length, how many characters to take: five. That's the category ID. Execute it: we have a new column, category ID, containing the first part of the string. The other source system also provides a category ID, so let's double-check that we can join the data: query the ID from the bronze ERP category table. Those are the category IDs, and in the silver layer we'll join these two tables — but there's still a mismatch. That table uses an underscore between category and subcategory, while our extracted value has a minus. We must replace it with an underscore so the values match; otherwise the join won't work. So we use the REPLACE function, swapping the minus for an underscore. Execute, and we get underscores exactly like the other table. We can verify the match with a simple query: our new value NOT IN a subquery over the other table, looking for any category ID without a counterpart. Executing it shows exactly one unmatched category, which may well be correct: checking the other table (let me enlarge it a bit), we are indeed not finding this one category, which is fine.
Our check passes. With the first part done, we extract the second part the same way: SUBSTRING on the product key, but this time we don't start at position one — counting 1, 2, 3, 4, 5, 6, 7, we start at position seven. For the length, look at the data: the product keys vary in length, unlike the fixed-width category ID, so a fixed number won't do — we need something dynamic. The trick is to pass the LEN of the whole column: that guarantees we always grab enough characters and never lose information. With that dynamic length we have the product key. Execute it: we're now extracting the second part of the string. Why do we need this product key? To join with another table, the sales details. Checking the sales details — the column name is sls_prd_key — we query bronze CRM sales details, and the data looks joinable. But let's verify: WHERE our new column NOT IN the subquery, just to make sure nothing is missing. Execute — it looks like many products have no orders at all. I don't have a good feeling about that, so let's probe: WHERE sls_prd_key LIKE this value, cutting off the last few characters to search more broadly inside the table. We really don't find such keys.
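The two derivations described above might be sketched together like this (column names assumed from the course's conventions; SUBSTRING with LEN avoids hard-coding a length for the variable-width second part):

```sql
-- Split the product key into two derived columns:
--   1) the first five characters, with '-' swapped for '_'
--      so it matches the ERP category table's format;
--   2) everything from position 7 onward, using LEN so the
--      extraction works for keys of any length.
SELECT
    prd_key,
    REPLACE(SUBSTRING(prd_key, 1, 5), '-', '_') AS cat_id,
    SUBSTRING(prd_key, 7, LEN(prd_key))         AS prd_key_part
FROM bronze.crm_prd_info;
```

Passing LEN of the whole column as the length is safe because SUBSTRING simply stops at the end of the string when the requested length runs past it.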
Let me cut one more character and search again — still nothing. Anything starting with 'FK' simply has no orders. Remove the probe; can we still join the tables? Switching NOT IN to IN, all the remaining products match, so everything is actually fine — these are just products without orders — and I'm happy with this transformation. Next column: the product name. Check it for unwanted spaces with our quality check, using this table and the product name, looking for mismatches after trimming. Run it — clean, nothing to trim; this column is safe. Next, the cost. These are numbers, so we check number quality: NULLs and negative values. Negative costs or prices aren't realistic — it depends on the business, of course, but let's say ours doesn't allow negative costs. Check for values below zero or NULL: no negatives, but we do have NULLs. We can handle that by replacing NULL with zero, provided the business allows it. SQL Server has a handy function for this: ISNULL. It says: if the value is NULL, replace it with zero — very simple — and we give the result an alias. Execute: no more NULLs, just zeros, which is better for calculations if you later run aggregate functions like the average.
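The cost check and its fix might look like this (a sketch; the column name is assumed):

```sql
-- Quality check: negative or NULL costs. Expectation: no rows.
SELECT prd_cost
FROM bronze.crm_prd_info
WHERE prd_cost < 0 OR prd_cost IS NULL;

-- The fix: default NULL costs to zero so later aggregations
-- (e.g. AVG) aren't skewed by missing values.
SELECT ISNULL(prd_cost, 0) AS prd_cost
FROM bronze.crm_prd_info;
```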
Moving on, the next column is the product line — again an abbreviation, with low cardinality, so check the possible values with SELECT DISTINCT on the product line. Execute: the possible values are NULL, M, R, S, and T. These are codes, but in our data warehouse we've decided on full friendly names, so we replace each abbreviation — and to learn what they stand for, I usually ask an expert on the source system or the business process. Let's build the CASE WHEN, again wrapping the column in UPPER and TRIM to cover all cases: when the product line equals 'M', the friendly value is 'Mountain'; copy and paste for the next — 'R' is 'Road'; 'S' stands for 'Other Sales'; and 'T' stands for 'Touring'. Finish with ELSE 'n/a' so no NULLs remain, name it product line as before, remove the old column, and execute: no more shortcuts and abbreviations, just full friendly values — and I'll capitalize the 'o' in 'Other Sales' so it looks nicer. Now, looking at this CASE WHEN, each branch is a simple one-to-one mapping, and we repeat UPPER and TRIM every single time. CASE has a quick form for exactly this: write CASE followed by the expression being evaluated once, then each WHEN without the equals sign.
If it is 'M', make it 'Mountain', and so on for the rest. That way the functions appear only once instead of being repeated over and over. This form only works for plain value mappings — for complex conditions you need the searched form — but here I'll stay with the quick form; it's shorter and nicer. Execute it and we get the same results. Back to the table: the last two columns are the start and end dates, which define an interval. Check their quality: SELECT * FROM the bronze table WHERE the end date is earlier than the start date. Query it, and you can see the start is consistently after the end, which makes no sense at all — a clear data issue with these two dates. For this kind of transformation, what I usually do is grab a few examples, put them in Excel, and think through the fix. Here I took two products, three rows each, showing the problem. How do we fix it? The first candidate solution: simply swap the start date with the end date. If I move the end dates to the start, things look nicer — the start is always earlier than the end — but, my friends, the data now makes no sense: it says the price was 12 from 2007 to 2011, yet 14 over a range that overlaps it, so for the year 2010 the price would be both 12 and 14 at the same time. It is really bad to have overlapping date ranges.
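The quick (simple) form of CASE just described might be sketched like this — the expression is written once, so UPPER(TRIM(...)) isn't repeated in every branch (column name assumed):

```sql
-- Simple CASE: evaluate UPPER(TRIM(prd_line)) once,
-- then list plain value-to-value mappings.
SELECT
    CASE UPPER(TRIM(prd_line))
        WHEN 'M' THEN 'Mountain'
        WHEN 'R' THEN 'Road'
        WHEN 'S' THEN 'Other Sales'
        WHEN 'T' THEN 'Touring'
        ELSE 'n/a'
    END AS prd_line
FROM bronze.crm_prd_info;
```

As noted in the walkthrough, this form only handles equality mappings; conditions like ranges or multiple predicates still need the searched `CASE WHEN ... THEN` form.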
One record should run from 2007 to 2011, then the next should start in 2012 and end somewhere later — there must be no overlap between periods. So it's not enough for the start to be earlier than the end; the end of one history record must also come before the start of the next. That is the rule for no overlaps. This other record has no start but already has an end, which isn't right either — every record in a historization must have a start, so that one is wrong too. On the other hand, a start without an end is fine: it indicates the current cost information. So the swap solution doesn't work at all. Second solution: ignore the end date completely and keep only the start dates, then rebuild the end date entirely from them, following the rule we defined: the end date of the current record comes from the start date of the next record. We take the next record's start date and place it as the previous record's end date. As you can see, that works: each end date is later than its start date, and it doesn't overlap with the next record. To make it even cleaner, we subtract one day, so the end date falls strictly before the next start. The same applies to the following record: take the next start date, set it as the end date, subtract one to get the previous day. Comparing the two dates, the end is still later than the start.
with the next record, this one over here, it is still smaller than the next one, so there is no overlapping. And for the last record, since we don't have any further information, it will be NULL, which is totally fine. As you can see, I'm really happy with this scenario. Of course, you should validate this with an expert from the source system; let's say I've done that and they approved it, so now I can clean up the data using this new logic. This is how I usually brainstorm about fixing an issue: if I have something complex, I use Excel and then discuss it with the expert using an example. It's way better than showing database queries and so on; it just makes things easier to explain and to discuss. How I usually do it: I focus on only the columns that I need and take only one or two scenarios while I'm building the logic, and once everything is ready, I integrate it into the query. So now I'm focusing only on these columns and only on these products. Let's build our logic. In SQL, if you are at a specific record and want to access information from another record, we have two amazing window functions: LEAD and LAG. In this scenario we want to access the next record, so we have to go with LEAD. Let's build it: LEAD, and what do we need? We need the LEAD of the start date, so we want the start date of the next record. Then we say OVER, and we have to partition the data: the window is going to focus on only one product, which is the product key, not the product ID, so we are dividing the data by product key. Of course we also have to sort the data, so ORDER BY the start date ascending, from the lowest to the highest. Let's give it a name, say AS test, just to test the data. Let's execute... and I think I missed something here; it says
PARTITION BY, so let's execute again and check the results. For the first partition over here, the start is 2011 and the end is 2012, and this information came from the next record: that value moved to the previous record over here. The same for this record; the end date comes from the next record, so our logic is working. The last record over here is NULL, because we are at the end of the window and there is no next record, and that is perfect of course. It looks really awesome, but what is missing is that we have to get the previous day, and we can do that very simply with minus one: we are just subtracting one day, so there is no overlapping between those two dates, and the same for those two dates. As you can see, we have just built a perfect end date, which is way better than the original data that we got from the source system. Now let's take this and put it inside our query: we don't need the old end date, we need our new end date, so we just remove that test alias and execute. Now it looks perfect. All right, we are not done yet with those two dates. They are datetimes, but we don't have any information about the time; it is always zero, so it makes no sense to keep it inside our data. What we can do is a very simple cast, making this column a DATE instead of a DATETIME; this is for the first one, and the same for the next one, AS DATE. Let's try that out, and as you can see it is nicer: we don't have the time information. Of course we could tell the source system about all these issues, but since they don't provide the time, it makes no sense to keep date and time. Okay, it was a long run, but we now have cleaned product information, and this is way nicer than the original product data that we got from the source CRM. If you grab the DDL of the silver table, you can see that we don't have a category ID: we have product ID and product
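Putting the pieces together, the end-date rebuild just described might look like the following sketch (T-SQL; the table and column names such as `bronze.crm_prd_info` and `prd_start_dt` are illustrative guesses, not confirmed by the transcript):

```sql
-- Rebuild the end date from the NEXT record's start date, minus one day,
-- within each product's history. The last record per product gets NULL,
-- marking it as the currently valid record.
SELECT
    prd_key,
    CAST(prd_start_dt AS DATE) AS prd_start_dt,   -- drop the always-zero time part
    CAST(
        DATEADD(DAY, -1,
            LEAD(prd_start_dt) OVER (
                PARTITION BY prd_key          -- one window per product
                ORDER BY prd_start_dt ASC     -- oldest record first
            )
        ) AS DATE
    ) AS prd_end_dt
FROM bronze.crm_prd_info;
```

With this construction the end date can never be smaller than its own start date, and it can never overlap the next record's start, because it is derived from that start.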
key, and those two columns as well: we just changed the data type from DATETIME to DATE. That means we have to make a few modifications to the DDL. We go over here and add the category ID, using the same data type, and for the start and end dates it's now going to be DATE, not DATETIME. That's it for now; let's execute it in order to repair the DDL. This is what happens in the silver layer: sometimes we have to adjust the metadata, if the quality of the data types is not good, or because we are building new derived information in order to integrate the data later. So it will be very close to the bronze layer, but with a few modifications; make sure to update your DDL scripts. The next step is to insert the result of this query that cleans up the bronze table into the silver table. As we've done before: INSERT INTO the silver product info, and then we list all the columns; I've prepared those already. With that we can run our query and insert the data. As you can see, SQL did insert the data, and the very important step now is to check the quality of the silver table. We go back to our data quality checks and switch them to the silver layer. Let's check the primary key: no issues. We can check the trims: no issue there either. Now let's check the costs: they should not be negative or NULL, which is perfect. Let's check the data standardizations: as you can see, they are friendly values and we don't have any NULLs. Now, very interesting, the order of the dates: let's check that; we don't have any issues. And finally, I have a final look at the silver table, and as we can see, everything is inserted correctly into the correct columns. All those columns come from the source system, and the last one is automatically generated from the DDL, indicating when we loaded this table. Now let's sit back and look at our script: what are the different types of data transformations we have done? For the category ID and the product key, we have derived new columns. That is when we create a new column based on calculations or transformations of an existing one; sometimes we need columns only for analytics, and we cannot go to the source system each time and ask them to create them, so instead we derive the columns we need ourselves. Another transformation is the ISNULL over here: we are handling missing information, so instead of a NULL we get a zero. One more transformation, for the product line, is data normalization: instead of a code value we have a friendly value, and we have also handled the missing data, so instead of a NULL we get 'not available'. Moving on: we have done data type casting, converting one data type to another, which counts as a data transformation as well. And for the last one, we are doing data type casting again, but more importantly, data enrichment. This type of transformation is all about adding value to your data: we are adding new, relevant data to our data set. Those are the different types of data transformations we have done for this table. Okay, let's keep going: we have the sales details, the last table in the CRM. What do we have here? The order number, which is a string, so of course we can check whether we have an issue with unwanted spaces. We can say TRIM and something like this, and let's execute it, so
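The unwanted-spaces check being run here might look like this sketch (the names `bronze.crm_sales_details` and `sls_ord_num` are illustrative assumptions):

```sql
-- Quality-check sketch: return rows where trimming would change the value,
-- i.e. the column contains leading or trailing spaces.
-- An empty result set means the column needs no transformation.
SELECT sls_ord_num
FROM bronze.crm_sales_details
WHERE sls_ord_num != TRIM(sls_ord_num);
```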
we can see that we don't have any unwanted spaces; that means we don't have to transform this column, and we can leave it as it is. Now, the next two columns are keys, and the idea is to connect this table with the other tables. As we learned before, we use the product key to connect to the product info, and the customer ID to connect to the customer ID from the customer info. That means we have to check whether everything will work, so we check the integrity of those columns: WHERE the product key is NOT IN, then a subquery, and this time we can work with the silver layer, right? So we take the product key from the silver product info. Let's query this: we are not getting any issues, which means all the product keys from the sales details can be connected with the product info. The same way, we can check the integrity of the customer ID; this time we go to the customer info, and the column name was cst_id. Let's query that: again no issues, so we can connect the sales with the customers using the customer ID, and we don't have to do any transformations. Things look really nice for those three columns. Now we come to the challenging ones: the dates. These dates are not actual dates; they are integers, and we don't want to keep them like this. We would like to clean that up and change the data type from integer to date. If you want to convert an integer to a date, you have to be careful with the values inside each of those columns. So let's check the quality of, for example, the order date. Let's say WHERE the order date is less than zero, something negative: well, we don't have any negative values, which is good. Let's check whether we have any zeros: well, this is bad, we have a lot of zeros. Now
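The two referential-integrity checks just described can be sketched as follows (table and column names are illustrative assumptions about this course's schema):

```sql
-- Any sales rows whose product key has no match in the cleaned product table?
-- An empty result means the join to product info will work.
SELECT sls_prd_key
FROM bronze.crm_sales_details
WHERE sls_prd_key NOT IN (SELECT prd_key FROM silver.crm_prd_info);

-- Same idea for the customer ID against the cleaned customer table.
SELECT sls_cust_id
FROM bronze.crm_sales_details
WHERE sls_cust_id NOT IN (SELECT cst_id FROM silver.crm_cust_info);
```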
what we can do is replace those values with a NULL, and for that we can use the NULLIF function: NULLIF, and if the value is zero, make it NULL. Let's execute it, and as you can see, all those values are now NULL. Now let's check the data again. This integer has the year at the start, then the month, then the day, so the length of each number should be eight; if the length is less than or greater than eight, we have an issue. Let's check that: we say OR the length of the order date is not equal to eight, which covers both cases. Execute it and check the results: these two values don't look like dates at all, so we cannot turn them into real dates; they are just bad data. Of course you can also check the boundaries of a date: for example, it should not be higher than, say, 2050, plus a month and a day, so let's execute that. And if we remove the other conditions, just to make sure, we see that we don't have any date outside the boundaries of the business. Or you say the boundary should not be earlier than a certain date, depending on when your business started. We are of course getting those zero values because they are below the lower boundary, but if you had values near these boundary dates, this query would catch them as well. So we add the rest: all those checks validate a column that holds date information but has the data type integer. Again, what are the issues here? We have zeros, and sometimes strange numbers that cannot be converted to a date. Let's fix that in our query: CASE WHEN the order date is equal to zero, OR the length of the order date is not equal to 8, THEN NULL, right? We don't want to deal with those values; they are just wrong, and
they are not real dates. Otherwise, ELSE, it's going to be the order date. Now we convert this to a date; we don't want it as an integer. How? In SQL Server we cannot cast from integer directly to date: first you convert it to VARCHAR, and then from VARCHAR you go to a date. That's how it's done in SQL Server: cast first to VARCHAR, then cast to DATE, like this. That's it; we have the END, and we use the same column name. This is how we transform an integer into a date. Let's query it: the order date is now a real date, not a number, so we can get rid of the old column. Now we do the same for the shipping date: replace everything with the shipping date and query. Well, the shipping date is perfect; no issues with this column. But still, I don't like that we found so many issues with the order date, so just in case this happens to the shipping date in the future, I will apply the same rules to it. If you don't want to apply them now, you always have to build quality checks that run every day to detect those issues, and once you detect them, you can do the transformations; but for now I'm going to apply them right away. That's it for the shipping date. Now the due date, same test: let's execute it, and it is perfect as well, but I'm still going to apply the same rules. So let's put the due date everywhere in the query; just make sure you don't miss anything. Execute. Perfect: we now have the order date, shipping date, and due date, all of them real dates without any wrong data inside. Now, there is still one more check that we can do, and that is
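The integer-to-date conversion built above might be sketched like this (column and table names are illustrative assumptions):

```sql
-- Conversion sketch: an 8-digit integer such as 20110529 becomes a DATE.
-- Zeros and values that are not exactly 8 digits are mapped to NULL first.
-- SQL Server cannot CAST an INT directly to DATE, hence the VARCHAR hop.
SELECT
    CASE
        WHEN sls_order_dt = 0 OR LEN(sls_order_dt) != 8 THEN NULL
        ELSE CAST(CAST(sls_order_dt AS VARCHAR) AS DATE)
    END AS sls_order_dt
FROM bronze.crm_sales_details;
```

The same expression is then repeated for the shipping date and the due date columns.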
that the order date should always be smaller than the shipping date and the due date, because it makes no sense otherwise, right? You cannot deliver an item before it is ordered: first the order happens, then we ship the items, so there is an ordering among those dates, and we can check it. We are checking for invalid date order: WHERE the order date is higher than the shipping date, OR the order date is higher than the due date. Let's check: well, that's really good, we don't have such a mistake in the data, and the quality looks good. The order date is always smaller than the shipping date and the due date, so we don't have to do any transformations or cleanup here. Okay friends, moving on to the last three columns: the sales, the quantity, and the price. All three are connected; we have a business rule, a calculation: sales must equal quantity multiplied by price, and all sales, quantity, and price values must be positive numbers, so negative, zero, or NULL is not allowed. Those are the business rules, and we have to check the data consistency in our table: do all three columns follow the rules? We start with the calculation rule: sales is not equal to quantity multiplied by price, so we are searching for rows where the result does not match our expectation. We can also check the NULLs: OR sales IS NULL, OR quantity IS NULL, and the last one for the price. And we can check for negative numbers or zero: less than or equal to zero, applied to all three columns. With that, we are checking the calculation, and also whether we have NULLs, zeros, or negative numbers. Let's check our data; I'm
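The consistency check being assembled here might look like this sketch (names are illustrative assumptions):

```sql
-- Find every row that violates the business rules:
--   sales = quantity * price, and all three must be positive and non-NULL.
SELECT DISTINCT sls_sales, sls_quantity, sls_price
FROM bronze.crm_sales_details
WHERE sls_sales != sls_quantity * sls_price
   OR sls_sales    IS NULL OR sls_quantity IS NULL OR sls_price IS NULL
   OR sls_sales    <= 0    OR sls_quantity <= 0    OR sls_price <= 0
ORDER BY sls_sales, sls_quantity, sls_price;
```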
going to add a DISTINCT here, so let's query it, and of course we have bad data, but we can sort by the sales, quantity, and price. Let's do it. Looking at the data, we can see that in the sales we have NULLs, negative numbers, and zeros, so all the bad combinations, plus bad calculations: as you can see, the price here is 50 and the quantity is one, but the sales is two, which is not correct. Here we have wrong calculations as well: here we should have a 10, and here a nine, or maybe the price is wrong. Looking at the quantity, there are no NULLs, zeros, or negative numbers, so the quantity looks better than the sales. And if you look at the prices, we have NULLs and negatives, though no zeros. That means the quality of the sales and the price is bad: the calculation is not working, and we have these scenarios. Now, how do I handle this? I don't try to transform everything on my own. I usually talk to an expert, maybe someone from the business or from the source system, show them those scenarios, and discuss. Usually there are two possible answers. Either they tell me: you know what, I will fix it in my source; then I have to live with incoming bad data, and the bad data can be presented in the warehouse until the source system cleans up those issues. Or the other answer: you know what, we don't have the budget and this data is really old, we are not going to do anything. Then you have to decide: either leave it as it is, or improve the quality of the data yourself. But here you have to ask the experts to support you in solving these issues, because it really depends on their rules: different rules mean different transformations. So now let's say we have the following rules: if the sales value is NULL, negative, or zero, then use the calculation, the formula, by multiplying
the quantity with the price. Next, if the price is wrong, for example NULL or zero, then calculate it from the sales and the quantity. And if a price is negative, like minus 21, then convert it to 21, so from negative to positive, without any calculation. Those are the rules, and now we build the transformations based on them, step by step. I'll start with the new sales. What is the rule? Sales: CASE WHEN, as usual, the sales IS NULL, OR the sales is negative or equal to zero, OR, another scenario, we have a sales value but it does not follow the calculation, so the sales is not equal to the quantity multiplied by the price. Of course, we will not use the price as-is here: with the function ABS, the absolute value, we convert any negative price to a positive one. THEN we use the calculation: the quantity multiplied by the price. That means we are not using the value that comes from the source system; we are recalculating it. Now, if the sales is correct and not one of those scenarios: ELSE, we keep the sales as it comes from the source, because it is correct. Really nice; we say END and give it the same name. I will rename the old one here as an old value, and the same for the price; the quantity we will not touch, because it is correct. Like this. Now let's transform the price. As usual, CASE WHEN. What are the scenarios? The price IS NULL, OR the price is less than or equal to zero; THEN we do the calculation: the sales divided by the quantity. But here we have to make sure we are not dividing by zero: currently we don't
have any zeros in the quantity, but you never know: in the future you might get a zero, and the whole query would break. So what you do is say: if you get any zero, replace it with a NULL. NULLIF: if it is zero, make it NULL, and division by NULL simply yields NULL instead of an error. Now, if the price is not NULL and not negative or zero, everything is fine, and that's why we have the ELSE: the price as it is from the source system. That's it; END AS price. I'm totally happy with that; let's execute and check. These are the old values, and these are the new, transformed, cleaned-up values. Here we previously had a NULL, but now we have two: two multiplied by one gives two, so the sales is correct. Next one: the sales was 40, but the price is two, and two multiplied by one should give two, so the new sales is correct: two, not 40. Next: the old sales is zero, but four multiplied by the quantity gives four, so the old sales was wrong; that's why the new sales is correctly four. Let's find a minus: here we have a negative value, which is not correct, so we take the absolute price multiplied by one and get nine, and this sales is now correct. Now a scenario where the price is NULL, like this one: we don't have a price, but we calculated it from the sales and the quantity, dividing 10 by two to get five; the new price is better. The same for the minuses: here we have minus 21, and in the output we have 21, which is correct. For now, I don't see any scenario where the data is wrong; everything looks better than before, and with that we have applied the business rules from the experts and cleaned up the data in the data warehouse. This is way better than before, because we are now presenting better data for
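The full sales and price transformation built in this passage can be sketched as follows (column names are illustrative assumptions; the rules themselves come from the transcript):

```sql
-- Rebuild sales when it is missing, non-positive, or inconsistent with
-- quantity * |price|; rebuild price when it is missing or non-positive,
-- guarding the division against a future zero quantity with NULLIF.
SELECT
    CASE
        WHEN sls_sales IS NULL OR sls_sales <= 0
          OR sls_sales != sls_quantity * ABS(sls_price)
        THEN sls_quantity * ABS(sls_price)       -- recalculate from the rule
        ELSE sls_sales                           -- source value is already correct
    END AS sls_sales,
    sls_quantity,                                -- quantity is trusted as-is
    CASE
        WHEN sls_price IS NULL OR sls_price <= 0
        THEN sls_sales / NULLIF(sls_quantity, 0) -- derive price; NULL on zero qty
        ELSE sls_price
    END AS sls_price
FROM bronze.crm_sales_details;
```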
analysis and reporting. But it is challenging, and you have to understand the business exactly. Now we copy this logic and integrate it into our query: instead of the sales we use our new calculation, and instead of the price our corrected calculation. And here I'm missing the END; let's run the whole thing again. With that, we also have cleaned sales, quantity, and price, following our business rules, and we are done cleaning up the sales details. The next step is to insert into the silver sales details, but first we have to check the DDL again. All you have to do is compare these results with the DDL. The first one is the order number: it's fine. The product key, the customer ID... but here we have an issue: all the date columns are now DATE and not integer, so we have to change the data type, and with that we get a better data type than before. The sales, quantity, and price are correct. Let's drop the table and create it from scratch, and don't forget to update your DDL script. That's it for this, and now we insert the results into our silver sales details table, listing all the columns. I have already prepared the list; just make sure you have the columns in the correct order. Let's insert the data, and we can see that SQL did insert it into our sales details. Now, very important, we check the health of the silver table: we switch from bronze to silver. Let's check over here: the order date is always smaller than the shipping date and the due date, which is really nice. But now I'm very interested in the calculations, so here we switch from bronze to silver, and I'm going to get rid of all those old calculation columns, because we don't need them. And now let's see
whether we have any issues. Well, perfect: our data follows the business rules; no NULLs, no negative values, no zeros. As usual, the last step, the final check: a final look at the table. We have the order number, the product key, the customer ID, the three dates, the sales, quantity, and price, and of course our metadata column; everything is perfect. Now, looking at our code, what are the different types of data transformations we are doing? In the three date columns we are handling invalid data, which is a type of transformation as well, and at the same time we are doing data type casting, moving to a more correct data type. Looking at the sales, we are handling missing and invalid data by deriving the column from an existing one, and it is very similar for the price: we are handling invalid data by deriving it from a specific calculation. Those are the different types of data transformations in this script. All right, now let's move to the next source system: the customer table AZ12. Here we have only three columns; let's start with the ID. Again, this is customer information, and if we check our integration model, you can see that we can connect this table with the CRM customer info using the customer key. That means we have to make sure those two tables can be joined, so let's check the other table; we can of course use the silver layer. Let's query both tables. Now we can see there are extra characters here that are not included in the customer key from the CRM, so let's search, for example, for this particular customer: WHERE cid LIKE, so we are searching for a customer with a similar ID. Now, as you can
see, we are finding this customer, but the issue is that we have those three extra characters, 'NAS'. There is no specification or explanation for why the NAS is there, so what we have to do is remove it; we don't need it. Let's check the data again: it looks like the old records have a 'NAS' at the start, and the newer records come without those three characters. We have to clean up those IDs in order to be able to join them with the other tables. We do it like this: we start with a CASE WHEN, since we have two scenarios in the data. If the cid is LIKE the three characters 'NAS', that is, if the ID starts with them, then we apply a transformation function; otherwise, ELSE, it stays as it is. Now we build the transformation. We use SUBSTRING, then define the string, which is the cid, then the position where extraction starts: 1, 2, 3, and then four, so position number four. Then we define how many characters should be extracted; I will make it dynamic, so instead of counting manually I go with LEN: LEN of the cid. So: if it starts with 'NAS', extract from the cid, at position four, the rest of the characters. Let's execute it... and I'm missing a comma here. Again, the records without 'NAS' at the start, and if you scroll down, those records are not affected either. With that we now have a clean ID to join with the other table. Of course, we can test it: WHERE, then we take the whole transformation and say NOT IN; we remove the alias name, we don't need it, and then we write a very simple subquery: SELECT DISTINCT cst_key, the customer key, from the silver table, silver CRM cust info. That's it; let's go and check. So
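The prefix cleanup just built might look like this sketch (the names `bronze.erp_cust_az12` and `cid` are illustrative assumptions):

```sql
-- Strip the unexplained 'NAS' prefix so the ID matches cst_key in the CRM table.
SELECT
    CASE
        WHEN cid LIKE 'NAS%'
        THEN SUBSTRING(cid, 4, LEN(cid))   -- cut from position 4 to the end
        ELSE cid                           -- newer records have no prefix
    END AS cid
FROM bronze.erp_cust_az12;
```

Using `LEN(cid)` as the third argument keeps the extraction dynamic, so IDs of any length are handled without counting characters by hand.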
as you can see, it is working fine: we cannot find any unmatched data between the customer table from the ERP and the CRM. Of course, without the transformation, if I just remove it like this, we find a lot of unmatched data; this means our transformation is working perfectly, and we can drop the original value. That's it for the first column. Okay, moving on to the next field: the birthdate of the customers. The first thing to do is check the data type: it is a date, so it's fine, not an integer or a string, and we don't have to convert anything. But there is still something to check with the birthdates: whether we have values out of range. For example, we can check for really old birthdates: let's take the first of January 1924 and check. Well, it looks like we have customers that are older than 100 years. I don't know, maybe that is correct, but it of course sounds strange, so it is something to confirm with the business. Then we check the other boundary, where it is almost impossible to have a customer whose birthday is in the future: we say birthdate is higher than the current date, like this. Let's query... well, it will not work, because we need an OR between the conditions. Now, if we check the list over here, we have dates that are invalid as birthdates: all of them are birthdays in the future, and this is totally unacceptable, an indicator of bad data quality. Of course you can report it to the source system so they can correct it. Here it's up to you what to do with those dates: either leave them as they are, as bad data, or clean that up by replacing all those dates with a NULL, or maybe
replacing only the extreme ones, where they are 100% incorrect. Let's write the transformation for that. As usual, we start with CASE WHEN: the birthdate is larger than the current date, THEN NULL; otherwise, ELSE, we keep the birthdate as it is, and then END AS bdate. Let's execute it, and with that we should not get any customer with a birthday in the future. That's it for the birthdate; now the next one: the gender. Again, the gender column is low cardinality, so we have to check all the possible values inside it. To do that, we use SELECT DISTINCT gen from our table. Let's execute it... and the data doesn't look good: we have a NULL, an F, an empty string, Male, Female, and an M. This is not good, so we're going to clean up all those values so that we have only three: Male, Female, and not available. We do it like this: CASE WHEN, and we TRIM the values, just to make sure there are no stray spaces, and I'm also going to use the UPPER function, so that if we get any lowercase values in the future, all the different scenarios are covered. So: CASE WHEN it is an F or FEMALE, then make it 'Female', and the same for the male: if it is an M or MALE (compared in capital letters, because we are using UPPER), then it is 'Male'; otherwise, for all other scenarios, it should be 'not available', whether it is an empty string, a NULL, and so on. We need an END, of course, AS gen. Now let's test it and check whether we have covered everything: the M is now Male, the NULL is not available, the F is Female, the empty string, or maybe spaces, here is not available,
Female is going to stay as it is, and the same for Male. With that we cover all the scenarios and follow our project standards. I'm going to cut this and put it into our original query over here; let's execute the whole thing, and with that we have cleaned up all three columns. Now, the question: did we change anything in the DDL? Well, no; we didn't introduce any new column or change any data type. That means the next step is to insert into the silver layer. As usual: INSERT INTO the silver ERP customer table, then we list all the column names: cid, bdate, and gen. All right, let's execute it, and we can see it inserted all the data. Of course, the very important next step is to check the data quality, so let's go back to our query and change it from bronze to silver. Checking the silver layer: well, we still get those very old customers, but we didn't change that; we only changed the birthdays that were in the future, and we no longer see them in the results, so everything is clean. Next, let's check the distinct genders: as you can see, we have only the three values. And of course we take a final look at our table: the cid, the birthdate, the gender, and then our metadata column; everything looks amazing. So that's it. What are the different types of data transformations we have done? First, for the ID: we handled invalid values by removing the part that is not needed. The same goes for the birthdate: we handled invalid values there as well. And for the last one, the gender, we did data normalization, mapping the code to a more friendly value, and we also handled the missing values. Those are the types of transformations in this code.
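The two cleansing rules built for this table can be summarized in one sketch (table and column names are illustrative assumptions, and 'not available' follows the wording used in this course):

```sql
-- Cleansing sketch for the ERP customer table:
-- future birthdates become NULL; gender codes map to friendly values.
SELECT
    CASE
        WHEN bdate > GETDATE() THEN NULL          -- birthdays in the future are invalid
        ELSE bdate
    END AS bdate,
    CASE
        WHEN UPPER(TRIM(gen)) IN ('F', 'FEMALE') THEN 'Female'
        WHEN UPPER(TRIM(gen)) IN ('M', 'MALE')   THEN 'Male'
        ELSE 'not available'                      -- NULLs, empty strings, anything else
    END AS gen
FROM bronze.erp_cust_az12;
```

Wrapping the comparison in `UPPER(TRIM(...))` makes the mapping robust against stray spaces and future lowercase values, which is exactly the defensive style applied throughout this cleanup.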
Okay, moving on to the second table, we have the location information: erp_loc_a101. The task here is easier because we have only two columns. If you check the integration model, you can find this table and see that it connects to the customer info from the other system, the cid against the customer key, so those two values must match for the join to work. That means we have to check the data, so let's select cst_key from the silver customer info. Looking at the results, you can see an issue with the cid: there is a minus sign between the characters and the numbers, but the customer key has nothing splitting the characters from the numbers, so joining these two columns would not work. We have to get rid of that minus because it is totally unnecessary, and the fix is simple: take the cid, search for the minus, and replace it with nothing. Querying again, the values now look alike, and we can verify it as well by filtering WHERE our transformed cid is NOT IN a subquery over the customer keys. Executing that, we find no unmatched data, so the transformation is working and we can connect those two tables. If I take the transformation away, you can see we would get a lot of unmatched rows, so the transformation is fine and we'll keep it.
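As a sketch, the separator fix and the unmatched-key check might look like this (table and column names are per the course's conventions; treat them as assumptions):

```sql
-- Strip the '-' from cid, then verify every cleaned ID has a match
-- in the silver customer table.
SELECT REPLACE(cid, '-', '') AS cid, country
FROM bronze.erp_loc_a101
WHERE REPLACE(cid, '-', '') NOT IN (
    SELECT cst_key FROM silver.crm_cust_info   -- the master customer table
);
-- An empty result means the transformation works and the tables can be joined.
```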
Now let's talk about the countries. This is a low-cardinality column, so we have to check all the possible values inside it; that is, checking whether the data is consistent. We can do it like this: SELECT DISTINCT country from our table, sorted by the country. Looking at the results, we have a NULL, an empty string (which is really bad), some full country names, and some country abbreviations. That mix is not good: sometimes we have 'DE' and sometimes 'Germany', and for the United States we have three versions of the same information, so the quality of this column is poor. Let's work on the transformation. As usual, we start with CASE WHEN: if TRIM(country) equals 'DE', transform it to 'Germany'; next, if TRIM(country) is IN ('US', 'USA'), it becomes 'United States', and with that we have covered those three cases. Then we handle the NULL and the empty string: WHEN TRIM(country) = '' OR country IS NULL, it becomes 'not available'; otherwise, we keep TRIM(country), just to make sure there are no leading or trailing spaces. We alias it as country, and the transformation works. Now I'll take the whole new expression and compare it to the old column, calling the original 'old country', and query it. Checking the values: nothing unexpected changed, 'DE' is now 'Germany', the empty string and the NULL are 'not available', the United Kingdom stays as before, and we have one single value, 'United States', for all those US variants. It looks perfect, and with that the second column is cleaned as well.
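The country normalization just described, as a sketch (the 'n/a' placeholder string is an assumption; the course simply calls it 'not available'):

```sql
-- Compare the old country values against the normalized ones.
SELECT DISTINCT
    country AS old_country,
    CASE WHEN TRIM(country) = 'DE'           THEN 'Germany'
         WHEN TRIM(country) IN ('US', 'USA') THEN 'United States'
         WHEN TRIM(country) = '' OR country IS NULL THEN 'n/a'   -- missing values
         ELSE TRIM(country)                                      -- strip stray spaces
    END AS country
FROM bronze.erp_loc_a101
ORDER BY old_country;
```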
So we have clean results now, and the question again: did we change anything in the DDL? No, both columns are still VARCHAR, so we can insert straight into our table: INSERT INTO the silver customer location table, specifying the columns, the cid and the country. It's very simple; executing it, all the values are inserted. As a next step, we double-check the data: I remove the test clauses and swap bronze for silver, and all the country values look good. A final look at the table shows the IDs without the separator, the countries, and our metadata columns, so the location data is cleaned up. What transformation types did we apply here? First, we handled invalid values by replacing the minus with an empty string; for the country, we did data normalization by replacing codes with friendly values, and at the same time we handled missing values by mapping the empty string and the NULL to 'not available'; and one more thing, we removed the unwanted spaces. Those are the transformation types for this table. Okay guys, keep the energy up, keep the spirit up: we have to clean the last table in the bronze layer, and of course we cannot skip anything; we have to check the quality and detect all the errors. This table is about the categories for the products, and it has four columns. Let's start with the first one, the ID. As you can see in our integration model, this table connects to the product info from the CRM using the product key, and as you remember, in the silver
layer we created an extra column for exactly that in the product info. If you select that data, you can see a column called category ID, and it exactly matches the ID we have in this table; we already did the testing, so this ID is ready to be joined with the other table and there is nothing to do here. The next columns are strings, so of course we can check them for unwanted spaces: SELECT * FROM the same table WHERE the category is not equal to the category after trimming the unwanted spaces. Executing it, we get no results, so there are no unwanted spaces. Let's check the next column, the subcategory, with the same query: nothing either, so no unwanted spaces there. Copy and paste once more for the last column, the maintenance: again no results. Perfect, there are no unwanted spaces anywhere in this table. The next step is to check the data standardization, because all these columns have low cardinality. SELECT DISTINCT the category from our table and check all the values: we have accessories, bikes, clothing, and components; everything looks perfect and nothing needs to change in this column. Checking the subcategory, if you scroll down, all the values are friendly and nice as well, nothing to change. And the last column, the maintenance: perfect, we have only two values, yes and no, and no NULLs. So, my friends, this table has really nice data quality and we don't have to clean up anything.
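The checks above boil down to two query patterns: unwanted-space detection and distinct-value inspection. A sketch follows; note the full table name is abbreviated in the recording, so `erp_px_cat` here is a placeholder.

```sql
-- Unwanted spaces: any row where a column differs from its trimmed self.
SELECT * FROM bronze.erp_px_cat
WHERE cat         != TRIM(cat)
   OR subcat      != TRIM(subcat)
   OR maintenance != TRIM(maintenance);   -- expect zero rows

-- Data standardization: list every distinct value of a low-cardinality column.
SELECT DISTINCT maintenance FROM bronze.erp_px_cat;   -- expect only Yes / No
```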
don’t have to clean up anything but still we have to follow our process we have to go and load it from the bronze to the silver even if we didn’t transform anything so our job is really easy here we’re going to go and say insert into silver dots Erp PX and so on and we’re going to go and Define The Columns so it’s going to be the ID the category sub category maintenance so that’s it let’s go and insert the data now as usual what we’re going to do we’re going to go and check the data so silver Erp PX let’s have a look all right so we can see the IDS are here the categories the subcategories the maintenance and we have our meta column so everything is inserted correctly all right so now I have all those queries and the insert statements for all six tables and now what is important before inserting any data we have to make sure that we are trating and emptying the table because if you run this qu twice what’s going to happen you will be inserting duplicates so first truncate the data and then do a full load insert all data so we’re going to have one step before it’s like the bronze layer we’re going to say trate table and then we will be trating the silver customer info and only after that we have to go and insert the data and of course we can go and give this nice information at the start so first we are truncating the table and then inserting so if I go and run the whole thing so let’s go and do it it will be working so if I can run it again we will not have any duplicates so we have to go and add this tip before each insert so let’s go and do that all right so I’m done with all tables so now let’s go and run everything so let’s go and execute it and we can see in the messaging everything working perfectly so with that we made all tables empty and then we inserted the data so perfect with that we have a nice script that loads the silver layer but of course like the bronze layer we’re going to put everything in one stored procedure so let’s go and do that we’ll go to 
At the beginning of the script, we say CREATE OR ALTER PROCEDURE, put it in the silver schema using our naming convention, load_silver, then write BEGIN, take the whole code (it's a long one), push it in one tab, and at the end say END. Perfect, we have our stored procedure; we only forgot the GO batch separator, and with that added there are no errors. Executing it, the stored procedure is created, and if you go to Programmability you will now find two procedures: load_bronze and load_silver. To try it out, all you have to do is execute silver.load_silver, and you get the same results: this stored procedure is now responsible for loading the whole silver layer. Of course, the messaging here is not great yet; we learned in the bronze layer that we can add a lot of things, like error handling, nicer messages, and capturing the duration. So now your task is to pause the video, take this stored procedure, and transform it to be very similar to the bronze layer, with the same messaging and all the add-ons we added there. Pause the video now; I'll do it offline as well and see you soon. Okay, I hope you're done, and I can show you the results. Like the bronze layer, we define a few variables at the start in order to capture the durations: the start time, the end time, the batch start time, and the batch end time. Then we print a lot of messages for nice output: at the start, 'loading the silver layer', and then we split by source system, 'loading the CRM tables'. I'll show you just one table for now: we set the timer by assigning the current date and time to the start time, then do the usual, truncating the table and inserting the new data after cleaning
it up. Then we print a nice 'load duration' message, computing the difference between the start time and the end time with the DATEDIFF function and showing the result in seconds, so we can see how long it took to load this table, and we repeat this process for all the tables. Of course, we also wrap everything in TRY and CATCH: SQL will try to execute the TRY part, and if there are any issues it executes the CATCH, where we print a few details like the error message, the error number, and the error state, following exactly the same standard as the bronze layer. Executing the whole thing updates the definition of the stored procedure. Now let's run it: EXEC silver.load_silver. It finishes in well under a second, because we are working on a local machine; in a real project it will of course take longer. We get the nice messages: loading the silver layer, loading the CRM tables, truncating each table, inserting the data, and the load duration per table, with everything below one second, and at the end the load duration of the whole silver layer. Now, one more thing: say you change the design of this stored procedure for the silver layer, adding different types of messaging, or maybe creating logs, and so on. Every new idea and redesign you apply to the silver layer, you should always think about bringing the same changes to the other stored procedure, for the bronze layer. Always keep your code following the same standards; don't have a new idea in one stored procedure and an old idea in another. Always maintain these scripts and keep them all up to date with the same standards, otherwise it can be really hard for other developers to understand the code.
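Putting the pieces together, the procedure skeleton might look like this; only one table is shown, and the names and messages are illustrative:

```sql
CREATE OR ALTER PROCEDURE silver.load_silver AS
BEGIN
    DECLARE @start_time DATETIME, @end_time DATETIME;
    BEGIN TRY
        PRINT 'Loading the silver layer';
        PRINT 'Loading the CRM tables';

        SET @start_time = GETDATE();
        TRUNCATE TABLE silver.crm_cust_info;
        INSERT INTO silver.crm_cust_info (cst_id /* , ... */)
        SELECT cst_id /* , ... */ FROM bronze.crm_cust_info;
        SET @end_time = GETDATE();
        PRINT 'Load duration: '
            + CAST(DATEDIFF(SECOND, @start_time, @end_time) AS NVARCHAR) + ' seconds';

        -- ...repeat the same block for the remaining tables...
    END TRY
    BEGIN CATCH
        PRINT 'Error message: ' + ERROR_MESSAGE();
        PRINT 'Error number : ' + CAST(ERROR_NUMBER() AS NVARCHAR);
        PRINT 'Error state  : ' + CAST(ERROR_STATE() AS NVARCHAR);
    END CATCH
END
```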
I know that takes a lot of work and commitment, but this is your job: make everything follow the best practices, the naming conventions, and the standards you set for your project. So guys, we now have two very nice ETL scripts, one that loads the bronze layer and another for the silver layer, and running our data warehouse is very simple. All you have to do is run the bronze layer first: that takes all the data from the source CSV files and puts it into the bronze layer of our data warehouse, refreshing the whole bronze layer. Once it's done, the next step is to run the stored procedure for the silver layer: once you execute it, you take all the data from the bronze layer, transform it, clean it up, and load it into the silver layer. As you can see, the concept is very simple: we are just moving data from one layer to the next, with different tasks at each step. All right, so in the silver layer we have done a lot of data transformations and covered all the data cleansing types: removing duplicates, data filtering, handling missing data, invalid data, unwanted spaces, casting data types, and so on; we have also derived new columns, done data enrichment, and normalized a lot of data. What we have not done yet, business rules and logic, data aggregations, and data integration, is for the next layer. So, my friends, we are finally done cleaning the data and checking its quality; we can close those two steps, and the next step is to extend the data flow diagram. Let's extend our data flow for the silver layer: I'll copy the whole thing, put it side by side with the bronze layer, call it 'silver layer', and the table
names stay as before, because it's one-to-one with the bronze layer. What we will change is the coloring: I'll select everything and make it gray, like silver. And what is very important is the lineage, so I draw an arrow from each bronze table to its silver table. With that we have lineage across three layers: looking at the customer info table, you can understand that it comes from the customer info in the bronze layer, which in turn comes from the CRM source system. So you can see the lineage between the different layers, and without reading any scripts, this one picture lets you understand the whole project: how the data flows from the sources to the bronze layer, the silver layer, and of course later the gold layer. It looks really nice and clean. With the data flow updated, next we commit our work to the Git repo. We go to the scripts folder, where we have a silver-layer folder (if you don't have it, you can create it). First we put in the DDL script for the silver layer: I paste the code, and as usual it has a header comment explaining the purpose of the script; then we commit our work. We do the same for the stored procedure that loads the silver layer: I paste that file as well, and at the top there is likewise a header saying this script performs the ETL process that loads the data from bronze into silver, where the action is to truncate the tables first and then insert the transformed, cleansed data from bronze into silver;
there are no parameters at all, and the header shows how to use the stored procedure. Okay, we commit our work, and there is one more thing to commit to the project: all those queries you built to check the quality of the silver layer. This time we won't put them in scripts; we go to the tests folder and make a new file called quality_checks_silver, and inside it we paste all the queries we have built, reorganized here by table. There you can see all the checks we did during the course, and at the header a nice comment saying this script checks the quality of the silver layer: nulls, duplicates, unwanted spaces, invalid date ranges, and so on. Each time you come up with a new quality check, I recommend sharing it with the project and the rest of the team, so it becomes part of the set of checks you run after each ETL. So that's it: I put these checks in our repo, and whenever I come up with a new check, I'll update the file. Perfect, our code is now in the repository and safe, and we are done with the whole epic: we have built the silver layer. Let's minimize it, because now we come to my favorite layer, the gold layer, and we're going to build it. The first step, as usual, is to analyze, and this time we're going to explore the business objects. So now we come to the big question: how are we going to build the gold layer? As usual, we start with analyzing: we explore and try to understand the main business objects hidden inside our source systems. We have two sources and six files, and we have to identify the business objects. Once we have that understanding, we can start coding, and here the main
transformation we are doing is data integration. I usually split this into three steps. First, we build the business objects we identified. Second, once we have a business object, we look at it and decide what type of table it is: a dimension, a fact, or maybe a flat table. And the last step, of course, is to rename all the columns into something friendly and easy to understand, so that our consumers don't struggle with technical names. Once those steps are done, it's time to validate what we've created: the new data model should be joinable, and we have to check that the data integration was done correctly. And once everything is fine, we cannot skip the last step: we document and commit our work in Git, and here we will introduce new types of documentation. We'll have a diagram of the data model, we'll build a data dictionary describing the data model, and of course we can extend the data flow diagram. This is our process; those are the main steps we will follow to build the gold layer. Okay, so what exactly is data modeling? Usually the source system delivers raw data: unorganized, messy, not very useful in its current state. Data modeling is the process of taking this raw data and organizing and structuring it in a meaningful way. What we are doing is putting the data into new, friendly, easy-to-understand objects, like customers, orders, and products, each focused on specific information, and, very importantly, we describe the relationships between those objects by connecting them with lines. What we have built on the right side is called a logical data model, and if you compare it to the left side, you can see the data model makes it
really easy to understand our data and the relationships and processes behind it. In data modeling we have three different stages, or let's say three different ways to draw a data model. The first stage is the conceptual data model: here the focus is only on the entities, so we have customers, orders, and products, and we don't go into detail at all; we don't specify any columns or attributes inside those boxes, we just want to show which entities we have and the relationships between them. The conceptual data model doesn't focus on details; it just gives the big picture. The second is the logical data model, where we start specifying the columns we can find in each entity, like the customer ID, first name, last name, and so on; we still draw the relationships between the entities, and we also make clear which columns are the primary keys. As you can see, there is more detail here, but we still don't describe each column in depth, and we don't worry about exactly how the tables will be stored in the database. The third and last stage is the physical data model: this is where everything gets ready to be created in the database, so you add all the technical details, like the data type and length of each column, and many other database specifics. So again: the conceptual data model gives us the big picture, the logical data model dives into the details of what data we need, and the physical data model prepares everything for implementation in the database. To be honest, in my projects I only draw the conceptual and logical data models, because drawing and building the physical data model takes a lot of effort and time, and many tools, like Databricks, generate those models automatically. So in this project we're
going to draw the logical data model for the gold layer. Now, for analytics, and especially for data warehousing and business intelligence, we need a special data model that is optimized for reporting and analytics; it should be flexible, scalable, and easy to understand, and for that we have two special data models. The first is the star schema: it has a central fact table in the middle, surrounded by dimensions. The fact table contains transactions and events; the dimensions contain descriptive information; and the relationship between the fact table in the middle and the dimensions around it forms a star shape, which is why we call it a star schema. The second is the snowflake schema. It looks very similar, again a fact in the middle surrounded by dimensions, but the big difference is that we break the dimensions into smaller sub-dimensions, and as you extend the dimensions, the shape of the model starts to look like a snowflake. If you compare them side by side, the star schema looks easier, right? It is usually easy to understand and easy to query, really well suited to analysis, but it has one issue: the dimensions may contain redundancy, and they grow over time. The snowflake schema is more complex, so you need more knowledge and effort to query it, but its main advantage comes with normalization: by breaking out those redundancies into small tables, you can optimize the storage. To be honest, though, who cares about storage these days? For this project I have chosen the star schema, because it is very commonly used and perfect for reporting, for example with Power BI, and we don't have to worry about storage. That's why we'll adopt this model to build our gold layer.
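To make the star-schema idea concrete, here is a minimal, hypothetical sketch. Note that in this project the gold layer will be built as views, not tables, so the names and DDL below are only to illustrate the shape: one fact in the middle referencing the dimensions around it.

```sql
-- Dimensions: descriptive context ("who / what / where").
CREATE TABLE gold.dim_customers (customer_key INT PRIMARY KEY, first_name NVARCHAR(50), country NVARCHAR(50));
CREATE TABLE gold.dim_products  (product_key  INT PRIMARY KEY, product_name NVARCHAR(50), category NVARCHAR(50));

-- Fact: events with dimension keys, dates, and measures ("how much / how many").
CREATE TABLE gold.fact_sales (
    order_number  NVARCHAR(50),
    customer_key  INT REFERENCES gold.dim_customers (customer_key),
    product_key   INT REFERENCES gold.dim_products  (product_key),
    order_date    DATE,
    sales_amount  INT,
    quantity      INT
);
```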
One more thing about these data models: they contain two types of tables, facts and dimensions. So when I say 'this is a fact table' or 'this is a dimension table', what do I mean? A dimension contains descriptive information, categories that give context to your data. For example, product info: you have the product name, category, subcategory, and so on; it's a table describing the product, and this we call a dimension. On the other hand, we have facts: they are events, like transactions, and they contain three important kinds of information. First, you have multiple IDs from multiple dimensions; then you have information like when the transaction or event happened; and the third type is measures and numbers. If you see those three types of data in one table, it's a fact. In short: if a table answers 'how much' or 'how many', it's a fact; if it answers 'who', 'what', or 'where', it's a dimension. That's what dimension and fact tables are. All right, my friends, so far, in the bronze and silver layers, we haven't discussed anything about the business; those layers were very technical, focusing on data ingestion, on cleaning the data, on data quality, but the tables are still very much oriented to the source systems. Now comes the fun part, the gold layer, where we break away from the data model of the sources entirely and create something completely new for our business that is easy to consume for business reporting and analysis. Here it is very, very important to have a clear understanding of the business and its processes, and if you don't have it already, at this phase you really have to invest time, meeting process experts and domain experts, in order to understand clearly what the data is talking about. So now what we'll do is try to detect the business objects that are hidden in the source
systems, so let's go and explore that. Now, in order to build a new data model, I first have to understand the original one: what are the main business objects, and how are things related to each other? This is a very important step when building a new model. What I usually do is start labeling the tables. In the shapes panel, search for 'label', go to more icons, take this label, drag and drop it, and increase the font size, say to 20 and bold, just to make it a bit bigger. Looking at this data model, we can see we have product information in the CRM as well as in the ERP, then customer information and a transactional table. So now let's focus on the products: the product information is over here, where we have the current and the historical product information, and here we have the categories that belong to the products. So in our data model we have something called products; let's create that label, call it 'products', and give it a color, for example red. Now we move this label beneath this table, to say 'this table belongs to the object called products', and I do the same for the other product table, tagging it as well, so I can easily see which source tables hold information about the product business object. Moving on, we have a table called customer information, with a lot of details about the customer, and in the ERP we also have customer information with the birthdate and the country. Those three tables have to do with the customer object, so we'll label them like that: let's call it 'customer', and I'll
pick a different color for it, let's go with green. I tag this table, then copy the label for the second and third tables. Now it is very easy to see which tables belong to which business objects. We have one final table left, about the sales and orders; in the ERP we have no information about that, so this one is easy: let's call it 'sales', move it over here, and maybe change its color as well, for example to this one. This labeling step is very important when building any data model in the gold layer, because it gives you the big picture of the things you are going to model. The next step is to build those objects one by one, and we start with the first object, our customers: here we have three tables, and we begin with the CRM, with this table over here. All right, with that we know our business objects and this task is done, and in the next step we go back to SQL and start doing data integration, building a completely new data model. First, a quick look at the gold layer specifications: this is the final stage, where we provide data to be consumed by reporting and analytics, and this time we will not be building tables, we will be using views. That means there is no stored procedure or load process for the gold layer; all we are doing is data transformation, and the focus of the transformation will be data integration, aggregation, business logic, and so on. This time we also introduce a new data model: we will be doing a star schema. Those are the specifications for the gold layer, and this is our scope. This time we make sure we select data from the silver layer, not from the bronze, because the bronze has bad data quality, while in the silver everything
is prepared and cleaned up, so to build the gold layer we are going to target the silver layer. Let's start with a SELECT * FROM the silver CRM customer info table and hit execute. Now we select the columns we want to present in the gold layer: we have the ID, the key, the first name, and so on. I will not take the metadata columns; those belong only to the silver layer. Perfect. The next step is to give this table an alias, let's call it CI, and I'll make sure we are selecting from this alias, because later we're going to join this table with other tables. So we're going with those columns. Now let's move to the second table and get the birthdate information. For that we jump to the other system, and we have to join the data, matching our customer key against the CID over there. Here I try to avoid using an INNER JOIN, because if the other table doesn't have information for all the customers, I might lose customers. Always start from the master table, and when you join it with any other table to pick up extra information, try to avoid the INNER JOIN: the other source might not have all the customers, and an INNER JOIN would silently drop them. So I tend to start from the master table, and everything else is a LEFT JOIN. I'm going to say LEFT JOIN, the silver ERP customer table (the az12 one), and give it the alias CA; then we join the tables on the customer key from the first table being equal to the CID in CA. Of course we're going to get matching data, because we checked this in the silver layer; if we hadn't prepared the data there, we would need a preparation step here in order to join the tables. But we don't, because that prep was done in the silver layer, and here you can see the systematics of this bronze, silver, gold setup. After joining the tables, we pick the information we need from the second table, which is the birthdate, and there is one more nice piece of information in it: the gender. That's all we need from the second table. Now let's check the third table, which holds the location information, the countries; again we connect the tables by the CID and the key. So we add another LEFT JOIN, the silver ERP location table, with the alias LA, and join on the customer key from CI being equal to the CID in LA. Again, we prepared those IDs and keys in the silver layer, so the join should just work. What do we have in this table? The ID, the country, and the metadata, so let's take only the country. Perfect: with that we have joined all three tables and picked all the columns we want in this object. Looking at the model, we have joined this table with this one and this one, so we have collected all the customer information from the two source systems. Now let's query it to make sure everything is correct, and to understand whether your joins are correct, keep your eye on those three joined columns: if you are seeing data, you are doing the joins correctly, but if you are seeing a lot of NULLs, or no data at all, your joins are incorrect. To me it looks like it is working. There is another check I always do: even if your first table has no duplicates, after doing multiple joins you might start getting duplicates, because the relationship between those tables might not be one-to-one.
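As a rough sketch, the three-table join could look like this. I'm approximating the table and column names from how they are read out in the video; your silver-layer schema may use different identifiers:

```sql
-- Sketch of the customer join (names approximated from the transcript).
SELECT
    ci.cst_id,
    ci.cst_key,
    ci.cst_firstname,
    ci.cst_lastname,
    ci.cst_marital_status,
    ci.cst_gndr,
    ci.cst_create_date,
    ca.bdate,        -- birthdate from the ERP system
    ca.gen,          -- gender from the ERP system
    la.cntry         -- country from the ERP location table
FROM silver.crm_cust_info AS ci          -- master table: start here
LEFT JOIN silver.erp_cust_az12 AS ca     -- LEFT JOIN so no customer is lost
    ON ci.cst_key = ca.cid
LEFT JOIN silver.erp_loc_a101 AS la
    ON ci.cst_key = la.cid;
```

The point of the LEFT JOINs is exactly what the transcript says: the master table drives the result, and a missing match in a secondary source produces NULLs instead of dropping the customer.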
You might get a one-to-many or even a many-to-many relationship instead. So the check I usually do at this stage is to make sure the result contains no duplicates, that we don't have multiple rows for the same customer. To do that we do a quick GROUP BY: we wrap the whole query as a subquery, group the data by the customer ID, take the count, and then say HAVING COUNT(*) higher than one. This query tries to find out whether we have any duplicates in the primary key. Execute it: we don't have any duplicates, which means joining all those tables onto the customer info didn't cause any issues and didn't duplicate my data. This is a very important check to make sure you are on the right track. All right, so duplicates are fine; we don't have to worry about them. But we do have an integration issue here. Execute again and look at the data: we have two sources for the gender information, one coming from the CRM and one coming from the ERP. So the question is: what are we going to do with this? We have to do data integration, and let me show you how I do it. First I open a new query, remove all the other columns, and keep only those two gender columns, with a DISTINCT, just to focus on the integration. Execute it, maybe add an ORDER BY 1, 2 as well, and execute again. Now we can see all the scenarios: sometimes there is a match, so the first table says female and the other table also says female, but sometimes the two tables are giving different information, and the same thing over here; that's an issue too.
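Both of those checks can be sketched in SQL. Again, the identifiers here are my approximation of the schema described in the transcript:

```sql
-- 1) Duplicate check: after the joins, each customer must still appear once.
SELECT cst_id, COUNT(*) AS row_count
FROM (
    SELECT ci.cst_id, ca.bdate, ca.gen, la.cntry
    FROM silver.crm_cust_info AS ci
    LEFT JOIN silver.erp_cust_az12 AS ca ON ci.cst_key = ca.cid
    LEFT JOIN silver.erp_loc_a101  AS la ON ci.cst_key = la.cid
) AS t
GROUP BY cst_id
HAVING COUNT(*) > 1;   -- an empty result means no duplicates

-- 2) Integration check: list every combination of the two gender columns.
SELECT DISTINCT ci.cst_gndr, ca.gen
FROM silver.crm_cust_info AS ci
LEFT JOIN silver.erp_cust_az12 AS ca ON ci.cst_key = ca.cid
ORDER BY 1, 2;
```

The first query should return nothing if the joins kept the one-row-per-customer grain; the second surfaces every matching and conflicting gender combination at once.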
In another scenario the first table says female but the other table says 'not available'. That is not a problem: we can take the value from the first table. We also have the exact opposite scenario, where the data is not available in the first table but it is available in the second. And here you might wonder why I'm getting a NULL: didn't we handle all the missing data in the silver layer and replace everything with 'not available', so why are we still getting a NULL? This NULL doesn't come directly from the tables; it comes from joining them. There are customers in the CRM table that are not available in the ERP table, and when there is no match, SQL gives us a NULL. So this NULL means 'there was no match'; it is not coming from the content of the tables. That is of course an issue, but the bigger issue is the two scenarios where both sides have data and the values differ. Here, again, we have to ask the experts: what is the master system, the CRM or the ERP? Let's say their answer is that the master for customer information is the CRM, meaning the CRM values are more accurate than the ERP values; and this is only about the customers, of course. So for the scenario where we have female and male, the correct value is female, from the first source system; the same goes over here; and where we have male and female, the correct one is male, because that source system is the master. Okay, now let's build this business rule. We start as usual with a CASE WHEN. The first, very important rule: if we have a value in the gender column from the CRM system, from the master, then use it. So we check that the CRM customer gender is not equal to 'not available', which means we have a value, male or female (let me just add a comma here), and in that case we use the value from the CRM, because the CRM is the master for the gender info. Otherwise, meaning it is not available in the CRM table, we grab the value from the second table, so we say CA's gender. But we have to be careful: the NULL from the join has to be converted to 'not available' as well, so we use a COALESCE; if the value is NULL, use 'not available' instead. Then an END, let me line it up, and let's call the new column new_gen for now. Execute it and check the different scenarios. For all these values we have data from the CRM system, and that is what shows up in the new column. For the second part we don't have data from the first system, so we try to get it from the second. For the first one, the second source system gives us a NULL (no match), so the ELSE is activated, the COALESCE kicks in, and we replace the NULL with 'not available'. In the second scenario the first system doesn't have the gender information, so we grab it from the second and get female; the third one is the same and we get male; and in the last one it is not available in both source systems, which is why we get 'not available'. As you can see, we now have a perfect new column that integrates two different source systems into one, and this is exactly what we call data integration. This piece of information is better than the source CRM and the source ERP alone; it is richer. And that is exactly why we try to get data from different source systems: to end up with richer information in the data warehouse.
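The integration rule just described might look like this as code. The 'n/a' literal stands in for the transcript's "not available" placeholder, and the names are approximated; adjust both to whatever your silver layer actually stores:

```sql
-- Integrated gender: the CRM is the master; the ERP only fills gaps.
SELECT
    ci.cst_id,
    CASE
        WHEN ci.cst_gndr != 'n/a' THEN ci.cst_gndr   -- CRM is the master for gender info
        ELSE COALESCE(ca.gen, 'n/a')                 -- NULL here means "no join match"
    END AS new_gen
FROM silver.crm_cust_info AS ci
LEFT JOIN silver.erp_cust_az12 AS ca
    ON ci.cst_key = ca.cid;
```

The COALESCE is what turns the join-produced NULLs back into the friendly placeholder, so the new column never leaks a raw NULL to consumers.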
So, we have a nice piece of logic now, and as you can see it is much easier to build the logic in a separate query first and then take it back to the original query. I'm going to copy everything from here, go back to our query, delete the two gender columns, put our new logic in their place, add a comma, and execute. With that we have our nice new column. So now we have a very good object: no duplicates, and the data integrated together; we took three tables and put them into one object. The next step is to give the columns nice, friendly names. The rule in the gold layer is to use friendly names rather than following the names from the source systems, and to stick to our naming conventions, which here means snake_case. Let's do it step by step. The first one we'll call customer_id; the next one, getting rid of 'key' and so on, we'll call customer_number, because those are customer numbers; the next one first_name, without any prefixes, and then last_name; the marital status keeps its name, just without the prefix; this one becomes simply gender; this one create_date; this one birthdate; and the last one will be country. Execute it, and as you can see the names are really friendly: customer_id, customer_number, first_name, last_name, marital_status, gender. Really easy to understand. Next, let's think about the order of the columns. The first two make sense together, then the first name and last name; the country is very important information, so I'm going to pull it up and put it right after the last name; it's just nicer. Execute again: first name, last name, country. It is always nice to group related columns together. Then we have the marital status and the gender, and then the create date and the birth date; I'm going to swap the birth date in front of the create date, since it's the more important of the two. Like this, and don't forget a comma; execute again. It looks wonderful. Now comes a very important decision about this object: is it a fact table or a dimension? As we learned, dimensions hold descriptive information about an object, and here we clearly have descriptions of the customers: all those columns describe customer information. We don't have transactions or events, and we don't have measures, so we cannot say this object is a fact; it is clearly a dimension. That's why we're going to call this object the customer dimension. One more thing: when you create a new dimension, you always need a primary key for it. Of course we could rely on the primary key we get from the source system, but sometimes you have dimensions where there is no primary key you can count on. In that case you generate a new primary key inside the data warehouse, and those keys are called surrogate keys. A surrogate key is a system-generated unique identifier assigned to each record to make the record unique. It is not a business key; it has no meaning, and no one in the business knows about it. We use it only to connect our data model, which gives us more control over how the model is wired together, without always depending on the source system. There are different ways to generate surrogate keys, for example defining them in the DDL, or using the window function ROW_NUMBER. In this data warehouse I'm going to go with the simple solution, where we're
going to use the window function. To generate a surrogate key for this dimension it is very simple: we say ROW_NUMBER() OVER, and then we have to order by something. You could order by the create date, the customer ID, or the customer number, whatever you want; in this example I'm going to order by the customer ID. We also follow the naming convention that all surrogate keys end with _key as a suffix. Now query it: at the start we have a customer key, a simple sequence, with of course no duplicates. This surrogate key is generated in the data warehouse, and we're going to use it to connect the data model. With that our query is ready, and the last step is to create the object. As we decided, all objects in the gold layer are going to be virtual ones, which means we create a view: CREATE VIEW gold.dim_customers, where dim follows the naming convention and stands for dimension, then customers, and after that an AS. Everything is ready; execute it, and it was successful. Go to the views and you can see our first object: the customer dimension in the gold layer. Now, as you know me, the next step is to check the quality of this new object. Open a new query: SELECT * FROM our view, dim customers. Everything is in the right position. We could run different checks like uniqueness and so on, but I'm worried about the gender information, so let's select the DISTINCT values. As you can see it works perfectly: we have only female, male, and not available. That's it: we have our first new dimension. Okay friends, now let's build the second object: the products. As you can see, product information is available in both source systems. As usual we start with the CRM information and then join it with the other table to get the category information; those are the columns we want from the first table. Now we come to a big decision about this object: it contains historical information as well as the current information. Whether you need the history depends on the requirements: if you have to analyze historical data you keep it, but if there is no such requirement, we can stay with only the current information about the products; we don't have to include all the history in the object. And anyway, as we learned from the model, we are not using the primary key; we are using the product key. So what we have to do is filter out the historical data and keep only the current data. We'll add a WHERE condition, and in order to select the current data we're going to target the
end dates: if the end date is NULL, that means it is current data. Take this example: here we have three records for the same product key, and the first two records have a value in the end date, because they are historical, but the last record has NULL, because it is the current information, still open and not closed yet. So to select only the current information it is very simple: we say WHERE the product end date IS NULL. Execute it and you get only the current products, with no history. We can add a comment to it, 'filter out all historical data', and of course this also means we don't need the end date in our selection anymore, because it is always NULL. With that we have only the current data. The next step is to join it with the product categories from the ERP, using the ID. As usual the master information is the CRM and everything else is secondary, which is why I use a LEFT JOIN, just to make sure I'm not losing or filtering any data; if there were no match, an inner join would lose data. So let's LEFT JOIN the silver ERP category table, call it PC, and join using the key: the category ID from the CRM table equal to the ID in PC. Then we pick columns from the second table: from PC the category, very important, the subcategory, and we can also take the maintenance column. Query it: all these columns come from the first table and those three come from the second. With that we have collected all the product information from the two source systems. The next step is to check the quality of this result, and what is most important is to check the uniqueness of the product key.
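The current-products query described above might be sketched like this; the table name for the ERP category source is my best approximation of what the transcript reads out:

```sql
-- Current products only, enriched with ERP category data (names approximated).
SELECT
    pn.prd_id,
    pn.prd_key,
    pn.prd_nm,
    pn.cat_id,
    pc.cat,          -- category from the ERP
    pc.subcat,       -- subcategory from the ERP
    pc.maintenance,
    pn.prd_cost,
    pn.prd_line,
    pn.prd_start_dt
FROM silver.crm_prd_info AS pn
LEFT JOIN silver.erp_px_cat_g1v2 AS pc
    ON pn.cat_id = pc.id
WHERE pn.prd_end_dt IS NULL;   -- filter out all historical data
```

The WHERE clause is what reduces the slowly-changing history to one current row per product, which is why the end date itself no longer needs to appear in the select list.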
To do that, we write the following query: I want to make sure the product key is unique, because we're going to use it later to join this table with the sales. So FROM the subquery, GROUP BY the product key, and HAVING COUNT(*) higher than one. Let's check: perfect, we don't have any duplicates. The second table didn't cause any duplicates in our join, and this also confirms we have no historical data; each product is exactly one record. I'm really happy with that, so let's query again. Next question: do we have anything to integrate, the same information twice? Here we don't. So the next step is to group the related information together. The product ID, the product key, and the product name belong together, all three; after that we can put all the category information together: the category ID, the category itself, and the subcategory. Let me query and check the result: product ID, key, name, then the category ID, the category, and the subcategory; maybe we also put the maintenance right after the subcategory, and I think the product cost and the line can stay at the end. Let me check: those four pieces of information about the category, then the cost, the line, and the start date. I'm really happy with that. The next step is to give friendly names to these columns. The first one is the product_id; the next becomes product_number, because we need 'key' for the surrogate key later; then the product_name; after that the category_id, the category, and the subcategory; the next one stays as it is, no rename needed; the next ones become the cost and the line; and the last one will be the start_date. Execute it, and now we can see all those friendly column names in the output; it looks much nicer than before, and I don't even have to describe the columns, the names describe themselves. Perfect. Now the next big decision: what do we have here, a fact or a dimension? What do you think? Again we have a lot of descriptions about the products: all these columns describe the business object 'product'. We don't have transactions or events here, or a lot of different keys and IDs, so we don't really have a fact; we have a dimension, where each row describes exactly one product. Since this is a dimension, we have to create a primary key for it, or rather a surrogate key, and as we did for the customers, we use the window function ROW_NUMBER to generate it: OVER, then we have to sort the data; I'll go with the start date plus the product key, and we give it the name product_key, following our convention. Execute it: with that we have generated a primary key for each product, and we'll use it to connect our data model. All right, the next step is to build the view: CREATE VIEW gold.dim_products, then the AS. Create the object, refresh the views, and you'll see our second object, the second dimension: the products dimension in the gold layer. As usual we have a look at this view just to make sure everything is fine: SELECT from dim products, execute, and by looking at the data everything looks nice. With that we now have two dimensions. All right friends, we have covered a lot: the customers and the products are done.
We are left with one table, where we have the transactions, the sales; and for the sales information we only have data from the CRM, nothing from the ERP. So let's go and build it. Now I have all those columns, and since there is only one table, we don't have to do any integration. But we do have to answer the big question: do we have a dimension or a fact here? Looking at the details we can see transactions, we can see events, we have a lot of date columns, a lot of measures and metrics, and a lot of IDs connecting multiple dimensions: this is exactly the perfect setup for a fact. So we're going to use this information as a fact. And as we learned, a fact connects multiple dimensions, so in this fact we have to present the surrogate keys that come from the dimensions. These two columns, the product key and the customer ID, come from the source system, but we want to connect our data model using the surrogate keys, so we're going to replace those two columns with the surrogate keys we generated. To do that we join the two dimensions in order to fetch the surrogate keys, and this process is called a data lookup: we are joining tables only to pick up one piece of information. We'll use LEFT JOINs, of course, so as not to lose any transactions. First we join on the product key. In the silver layer we don't have any surrogate keys; those live in the gold layer, so for the fact table we're joining the silver layer together with the gold layer: gold dot, then the dimension products, which I'll call PR, and we join SD using its product key against the product number from the dimension. The only information we need from the dimension is the surrogate key, so we take the product key from PR, and I remove the original column from the query, because we don't need the product key from the source system; we need the surrogate key we generated in this data warehouse. The same thing happens for the customer: gold dimension customers, again a lookup, joining this ID on SD against the dimension's customer ID, and again we take the surrogate key, the customer key, and delete the source ID, because we don't need it anymore. Execute it, and with that our fact table carries the two surrogate keys from the dimensions, which is what lets us connect the fact with the dimensions in the data model. This is an essential step: when building a fact table, you have to put the surrogate keys from the dimensions into the fact. That was actually the hardest part of building the fact; the next step is just friendly names. This one becomes the order_number; the surrogate keys are already friendly; this is the order_date, the next one the shipping_date, then the due_date; the sales column I'll call sales_amount, then the quantity, and the final one is the price. Execute it and look at the results: the columns look very friendly. As for the order of the columns, we use the following scheme in a fact table: first all the surrogate keys from the dimensions, then all the dates, and at the end all the measures and metrics. That's it for the fact query; now we can go and build it.
We create a view in the gold layer, and this time we use the fact_ prefix: we call it fact_sales, and don't forget the AS. That's it, create it. Perfect, now we can see the fact: with that we have three objects in the gold layer, two dimensions and one fact. The next step, of course, is to check the quality of the view, so a simple SELECT from fact sales. Execute it, and checking the result, it is exactly like the result from the query; everything looks nice. One more trick I usually do after building a fact is to try to connect the whole data model in order to find any issues. We do a simple LEFT JOIN with the dimensions: gold dimension customers, alias C, using the customer keys, and then we say WHERE the customer key IS NULL, meaning there is no match. Execute it, and as you can see we get nothing back, which means everything matches perfectly. We can do the same thing with the products: LEFT JOIN gold dim products, alias P, on the fact's product key equal to the dimension's product key, and then we check whether the product key from the dimension is NULL; we are checking whether we can connect the fact with the product dimension. Check it, and again we get nothing back, which is exactly right. With that we have SQL code that is tested and that creates the gold layer. In the next step, as you know from our requirements, we have to make clear documentation for the end users so they can use our data model, so let's go and draw the data model of the star schema. Let's search for a table shape, and I'll take the one where I can mark what is the primary key and what is the foreign key.
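Before we draw anything, the two orphan checks just described can be combined into one sketch; an empty result means the model is fully connected:

```sql
-- Orphan check: every fact row should find its dimension rows.
SELECT f.*
FROM gold.fact_sales AS f
LEFT JOIN gold.dim_customers AS c ON f.customer_key = c.customer_key
LEFT JOIN gold.dim_products  AS p ON f.product_key  = p.product_key
WHERE c.customer_key IS NULL
   OR p.product_key  IS NULL;   -- any row back is a broken relationship
```

This is worth keeping as a standing quality check: if a later change to a dimension ever strands a fact row, this query surfaces it immediately.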
Next, I'm going to change the design a little: make it rounded, switch to this color, go to the font size and make it 16, then select all the column rows and make them 16 as well, just to increase the size, and under Arrange increase the spacing. Now let's zoom in a little. The first table we call gold.dim_customers; make it a bit bigger, like this. Then we define the primary key here: it is the customer key. And then we list all the columns in the dimension; it is a little annoying, but the result is going to be awesome. So we have the customer ID, the customer number, then the first name; in case you want a new row, you can hold Ctrl and press Enter, and then add the other columns. Now pause the video and go create the two dimensions, the customers and the products, with all the columns you built in the views. Welcome back. So now I have those two dimensions, and the third one is going to be the fact table. For the fact table I'll go with a different color, for example blue, and put it in the middle, something like this. We call it gold.fact_sales, and since we don't have a primary key here, we delete that marker, and then I add all the columns of the fact: order number, product key, customer key, and so on. All right, perfect. Now we can add the foreign key information: the product key is a foreign key to the products, so we mark it FK1, and the customer key is the foreign key to the customers, so FK2; and of course you can increase the spacing for that. Okay, now that we have the tables, the next step in data modeling is to describe the relationships between them. This is of course very important for reporting and analytics, so that users understand how to use the data model. There are different types of relationships: one-to-one, one-to-many, and so on, and in a star schema the relationship between a dimension and the fact is one-to-many. That's because for a specific customer the customers table has only one record describing that customer, but in the fact table the same customer might exist in multiple records, because customers can order multiple times. That's why it is 'many' on the fact side and 'one' on the dimension side. To see all those relationship types, go to the menu on the left side, where you find the entity relation shapes: different kinds of arrows, for example zero-to-many, one-to-many, one-to-one, and many more. Which one are we going to take? We pick this one, which says one, mandatory, to many, optional. That means the customer must exist in the dimension table, while on the fact side it is optional, because there are three scenarios: the customer didn't order anything, the customer ordered only once, or the customer ordered many things. So we take this arrow and place it here, connecting the 'one' part to the customer dimension and the 'many' part to the fact; well, actually we have to attach it on the customers' side. With that we are describing the relationship between the dimension and the fact as one-to-many: one is mandatory on the customer dimension, and many is optional on the fact. The same story for the products: the 'many' part goes to the fact, and the 'one' goes to the products, so it's going to look like this. Each time you connect a new dimension to the fact table, it is usually a one-to-many relationship. You can also add anything you want to this model, for example a text note explaining something, like a complicated calculation.
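Once the relationships are in place, using the model is just a matter of walking those one-to-many links. A typical consumption query against the views built earlier might look like this (a sketch):

```sql
-- Example analytical query: total sales per country and product category.
SELECT
    c.country,
    p.category,
    SUM(f.sales_amount) AS total_sales
FROM gold.fact_sales AS f
LEFT JOIN gold.dim_customers AS c ON f.customer_key = c.customer_key
LEFT JOIN gold.dim_products  AS p ON f.product_key  = p.product_key
GROUP BY c.country, p.category;
```

This is the payoff of the star schema: every report is the fact table plus one join per dimension on a surrogate key, with the measures aggregated at the end.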
For a calculation like that, you can write the information right on the diagram. For example, add a note saying "sales calculation", make the font a little smaller (let's go with 18), and write the formula: sales = quantity multiplied by price. Then make the note a bit bigger; it's really nice information to add to the data model. You can even link it to a column: take an arrow, connect the note to the column, and with that you have a nice explanation of the business rule or calculation right on the model. You can add any descriptions you want, just to make the data model clear for anyone using it. With that you don't have just three tables in a database; you also have a kind of documentation, and at a glance anyone can see how the data model is built and how the tables connect. It is really amazing for all users of your data model. All right, so with that we have a really nice data model, and in the next step we're going to quickly create a data catalog.

Great, so we have a data model, and we can say we have what's called a data product, which we will be sharing with different types of users. There's something every data product absolutely needs, and that is the data catalog: a document that describes everything about your data model, the columns, the tables, maybe the relationships between the tables as well. It makes your data product clear for everyone, so it's much easier for them to derive insights and reports from it. Most importantly, it is time-saving, because if you skip it, every consumer of your data product will keep asking you the same questions: what do you mean by this column? What is this table? How do I connect table A with table B? And you will keep repeating yourself and explaining things. Instead, you prepare a data catalog along with the data model and deliver everything together to the users, saving a lot of time and stress. I know creating a data catalog is annoying, but it is an investment and a best practice. So let's go and create one.

Okay, to do that I've created a new file called data catalog in the documents folder, and what we do here is very straightforward: we make a section for each table in the gold layer. For example, for the dimension customers table, first describe the table; we say it stores details about the customers, with demographic and geographic data. So you give a short description for the table, then you list all the columns in the table, maybe with the data type as well, but what matters most is a description for each column. You give a very short description, for example "the gender of the customer". One of the best practices when describing a column is to give examples, because you can understand the purpose of a column quickly just by seeing one. Here we can see the column contains Male, Female, and "n/a", so the consumer of the table immediately understands it will not be an "M" or an "F" but a full, friendly value, without having to go and query the content of the table. With that we have a full description for all the columns of our dimension. We do the same thing for the products, a description for the table and for each column, and the same for the fact table. And that's it: you have a data catalog for your data product at the gold layer, and the business user or the data analyst has a better and clearer understanding of the content of your gold layer.
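As a sketch, one entry in such a data catalog could look like the fragment below; apart from the gender values mentioned in the transcript, the column names, data types, and wording are illustrative assumptions:

```markdown
### dim_customers
Stores details about customers, enriched with demographic and geographic data.

| Column       | Data Type   | Description                                       |
|--------------|-------------|---------------------------------------------------|
| customer_key | INT         | Surrogate key uniquely identifying each customer. |
| first_name   | VARCHAR(50) | The customer's first name.                        |
| gender       | VARCHAR(10) | Gender of the customer ('Male', 'Female', 'n/a'). |
```

Repeating this pattern for every gold table gives consumers the full picture without having to query the data themselves.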
All right my friends, that's all for the data catalog. In the next step we go back to draw.io, where we're going to finalize the data flow diagram.

Okay, so now we're going to extend our data flow diagram, this time for the gold layer. Let's copy the whole thing from the silver layer, put it side by side, change the coloring to gold, and rename things: this is the gold layer. Of course we can't leave those tables as they were, because we have a completely new data model. What do we have here? We have the fact sales, the dimension customers, and the dimension products. So I'm going to remove all the old boxes (we have only three tables now) and place those three tables in the center. Then we start connecting things with direct arrows: the sales details go to the fact table, and let's put the fact table over here. The dimension customers comes from the CRM customer info, plus two tables from the ERP: this one and the location table. The same goes for the products: it comes from the product info and from the ERP categories. As you can see, we now have crossing arrows, so select everything and enable "line jumps" with a gap; that makes the arrows a little easier to follow visually. Now, if someone asks where the data for the dimension products comes from, you can open this diagram and tell them: it comes from the silver layer, from two tables, the product info from the CRM and the categories from the ERP. And those silver tables come from the bronze layer.
In the bronze layer, you can see the product info comes from the CRM and the categories come from the ERP. So it is very simple: we have just created a full data lineage for our data warehouse, from the sources through the different layers. Data lineage is really amazing documentation that helps not only your users but the developers as well. So with that we have a very nice data flow diagram and a data lineage.

All right, we have completed the data flow. It really feels like progress, like achievement, as we click through all those tasks. Now we come to the last task in building the data warehouse: committing our work to the git repo. Okay, so let's put our scripts in the project. In the scripts folder we have bronze and silver, but we don't have gold, so let's create a new file: gold/ddl_gold.sql. Now paste in our three views, and as usual describe their purpose at the top: "Create Gold Views: this script creates views for the gold layer. The gold layer represents the final dimension and fact tables (the star schema). Each view performs transformations and combines data from the silver layer to produce business-ready datasets, and these views can be used for analytics and reporting." With that, let's commit it. As you can see, we now have the bronze, the silver, and the gold, so all our ETLs and scripts are in the repository. Next we're going to add all the quality checks we used to validate the dimensions and facts: go to the tests folder and create a new file, quality_checks_gold, with the file type SQL. Paste in our quality checks: the check for the fact table, the two dimensions, and an explanation of the script.
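The two scripts described here, the gold view DDL and the gold quality checks, can be sketched together in one small runnable example. All table and column names below are illustrative assumptions, not the project's actual schema; the derived sales column follows the quantity × price rule noted on the data model:

```python
import sqlite3

# Miniature of both scripts: a gold-layer view over silver tables,
# plus the quality checks that validate it. Names are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE silver_crm_cust_info (customer_id INTEGER, first_name TEXT);
CREATE TABLE silver_crm_sales_details (order_id INTEGER, customer_id INTEGER,
                                       quantity INTEGER, price REAL);
INSERT INTO silver_crm_cust_info VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO silver_crm_sales_details VALUES (100, 1, 3, 2.0), (101, 2, 1, 5.0);

-- ddl_gold.sql sketch: a view combining silver data into a
-- business-ready dataset, deriving sales = quantity * price.
CREATE VIEW gold_fact_sales AS
SELECT s.order_id,
       c.customer_id AS customer_key,
       s.quantity,
       s.price,
       s.quantity * s.price AS sales
FROM silver_crm_sales_details s
JOIN silver_crm_cust_info c ON c.customer_id = s.customer_id;
""")

# quality_checks_gold.sql sketch: both queries should return no rows.
# Check 1: keys on the dimension side must be unique.
dup_keys = con.execute("""
    SELECT customer_id FROM silver_crm_cust_info
    GROUP BY customer_id HAVING COUNT(*) > 1""").fetchall()

# Check 2: every fact row must still connect to the data model
# (no orphaned keys after a LEFT JOIN back to the dimension).
orphans = con.execute("""
    SELECT v.order_id FROM gold_fact_sales v
    LEFT JOIN silver_crm_cust_info c ON c.customer_id = v.customer_key
    WHERE c.customer_id IS NULL""").fetchall()

print(con.execute("SELECT order_id, sales FROM gold_fact_sales").fetchall())
print(dup_keys, orphans)  # two empty lists when the quality gate passes
```

If either check ever returns rows, the quality gate fails and the ETL output should be investigated before the gold layer is shared.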
The script validates the integrity and accuracy of the gold layer: it checks the uniqueness of the surrogate keys and whether we are able to connect the data model. Let's put that in git as well and commit the changes; in case we come up with new quality checks later, we'll add them to this script. These checks are really important: if you are modifying the ETLs, or you want to make sure everything is fine after each ETL run, these scripts should run. They act as a quality gate for the gold layer. Perfect, so now we have our code in the repository.

Okay friends, what's left is to finalize the git repo. For example, all the documentation we created during the project can be uploaded into the docs folder; you can see here the data architecture, the data flow, the data integration, the data model, and so on. Each time you edit those pages you can commit your work, so you have a version history of them. Another thing you can do is work on the readme: here, for example, I have added the project overview, some important links, and the data architecture with a short description. And of course don't forget to add a few words about yourself and links to your profiles on the different social media.

All right my friends, with that we have completed our work and closed the last epic, building the gold layer, and with that we have completed all the phases of building a data warehouse. Everything is 100%, and this feels really nice. If you're still here and you have built the data warehouse with me, then I can say I'm really proud of you. You have built something really complex and amazing, because building a data warehouse is usually a very complex data project, and along the way you have not only learned SQL but also how complex data projects are done in the real world.
With that, you have real knowledge and an amazing portfolio that you can share with others, whether you are applying for a job or simply showcasing that you have learned something new. You have also experienced the different roles in the project: what the data architects and the data engineers do in complex data projects. That was really an amazing journey, even for me as I was creating this project. You have now done the first type of data analytics project using SQL, the data warehousing. In the next step we're going to do another type of project, exploratory data analysis (EDA), where we're going to understand and explore our data sets. If you like this video and you want me to create more content like this, I would really appreciate it if you support the channel by subscribing, liking, sharing and commenting; all of that helps the channel with the YouTube algorithm and helps my content reach others. Thank you so much for watching, and I will see you in the next tutorial. Bye!

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Under Your Skin: The O’Malley Family, Book 1 by SHANNYN SCHROEDER

    Under Your Skin: The O’Malley Family, Book 1 by SHANNYN SCHROEDER

This source presents excerpts from “Under Your Skin (The O’Malley Family Book 1).” It centers on the lives and relationships of the O’Malley family, focusing on themes of pregnancy, family dynamics, and personal struggles. The narrative appears to follow multiple characters, such as Norah and Kai, as they navigate complex situations involving family, work, and unexpected pregnancies. There also seems to be an overarching narrative involving criminal behavior, though it is not specifically stated in the book description. The characters’ interactions are portrayed with a focus on their emotions and internal conflicts as they negotiate their individual challenges. The story seems to take place in the Boston and Chicago areas.

    Under Your Skin: The O’Malley Family Book 1 – Study Guide

    Key Themes

    • Family Dynamics: The complex relationships between the O’Malley siblings, their parents, and extended family members, marked by both love and conflict.
    • Responsibility and Burden: The weight of responsibility each character carries, particularly in relation to caring for family members.
    • Second Chances: Opportunities for redemption and self-improvement are themes woven into the story, especially for characters like Kai and Tommy.
    • Personal Growth: Characters evolve as they confront their pasts and make choices about their futures.
    • Love and Relationships: The various forms love takes, including familial love, romantic love, and friendship, and how these relationships affect the characters.

    Chapter Summaries

    • Chapter One: Introduces the O’Malley family, specifically focusing on Tommy and his return to Chicago after time in rehab. Kai is shown to be running the tattoo parlor.
    • Chapter Two: Introduces Norah’s pregnancy and Jimmy’s reaction. We see Norah’s strained relationship with her father and interactions with Moira.
    • Chapter Five: Focuses on Kai taking care of his mother and his internal conflict. It also introduces Jaleesa’s physical therapy.
    • Chapter Six: Explores Kai and Norah’s interactions and their respective burdens. Norah’s conversation with Kevin reveals family tensions.
    • Chapter Seven: Touches upon Norah’s cravings and discomfort during pregnancy. Kai is shown taking care of his mother.
    • Chapter Eight: Norah navigates her pregnancy and book club responsibilities.
    • Chapter Nine: Kai takes care of Norah when she goes to the hospital, demonstrating their growing bond.
    • Chapter Ten: Centers on Norah’s reaction to Kai’s poker game and their evolving relationship.
    • Chapter Twelve: Features tension between Kai and Norah.
    • Chapter Thirteen: Features a sexual encounter between Kai and Norah.
    • Chapter Fifteen: Norah and Kai’s intimacy is revisited.
    • Chapter Sixteen: Kai continues his tattoo work while struggling with his feelings for Norah.
    • Chapter Seventeen: Explores Kai’s complicated situation and tension with Rooster.
    • Chapter Eighteen: Focuses on Sean’s birthday party and the family gathering, which reveals underlying tensions.

    Character Relationships

    • Norah & Kai: An evolving relationship marked by attraction, shared burdens, and emotional vulnerability. They seem to support one another.
    • Tommy & Kai: Brotherly relationship, shaped by shared history and a need for support. Kai keeps Tommy on a relatively straight path after rehab.
    • Norah & Jimmy: Siblings who clearly care for one another, even if Jimmy struggles with the circumstances of Norah’s pregnancy.
    • Kai & His Mother: Kai is very dedicated to his mother.

    Quiz

    1. Describe Kai O’Malley’s profession and a key aspect of his personality.
    2. What major life change is Norah experiencing in the novel, and how is she handling it?
    3. What is Tommy’s recent history and how does his brother Kai play a role in Tommy’s life?
    4. What are some of the main issues or concerns that Kai’s mother deals with?
    5. Describe the dynamic between Norah and her brother Jimmy.
    6. What significant decision does Norah make about the baby, and what are her motivations?
    7. What activity does Kai participate in during his leisure time, and what do we learn about his past from it?
    8. How does the novel portray the themes of family loyalty and obligation within the O’Malley family?
    9. What services do Kai and Norah separately provide for family members?
    10. Describe the nature of Kai and Norah’s eventual relationship.

    Quiz – Answer Key

    1. Kai is a tattoo artist who runs his own shop, Ink Envy. He is portrayed as someone trying to do what’s best for his family, and has had struggles that he is trying to overcome.
    2. Norah is pregnant and facing the challenges of unplanned pregnancy as a single woman. She demonstrates bravery in the face of her unplanned situation.
    3. Tommy has recently been through rehab, is still struggling with past mistakes and trying to find his place. Kai provides guidance and support, to help him stay on the right track.
    4. Kai’s mother is a single woman who appears to have limited mobility. He takes care of her in the mornings and makes sure she is safe while she is in the house all day.
    5. Norah and Jimmy seem to have a strong sibling bond and have one another’s best interests at heart. Jimmy seems to want to do what is best for his sister.
    6. Norah makes the decision to put her baby up for adoption in the hopes of a better life for her. Her decision is hard for her, but she stands by it.
    7. Kai plays poker, often in the basement of his house. It’s revealed that the game provides an escape, in the presence of old friends, but that the presence of an ex-gangbanger is disruptive.
    8. The O’Malley family shows strong loyalty and obligation, evident in their willingness to support one another through thick and thin. The family always seems to pull together, although their methods of support vary.
    9. Kai is dedicated to being a tattoo artist in his shop and providing for his mother. Norah provides care to her, but also acts as a resource of advice and assistance to her brothers, in times of need.
    10. Kai and Norah develop an intimate relationship. The novel explores the complicated nature of their romance.

    Essay Questions

    1. Discuss the role of responsibility and burden in shaping the lives and choices of the O’Malley siblings.
    2. Analyze how the setting of Chicago contributes to the overall mood and themes of Under Your Skin.
    3. Explore the significance of art, particularly tattooing, in the novel and its connection to character development.
    4. Compare and contrast the different types of love depicted in the novel and their impact on the characters’ lives.
    5. Examine the theme of second chances and how characters like Tommy and Kai seek redemption and personal growth.

    Glossary of Key Terms

    • Ink Envy: The name of Kai’s tattoo parlor, representing his profession and artistic expression.
    • Rehab: A facility or program designed to help individuals recover from addiction, as experienced by Tommy.
    • Adoption: The legal process by which a child is permanently placed with a family other than their biological parents, which is part of Norah’s storyline.
    • Poker: A card game that serves as a leisure activity and social outlet for Kai and his friends, also tied to aspects of his past.
    • Home Healthcare: The provision of medical and personal care services in a patient’s home, relevant to Kai’s mother’s needs.
    • Single Motherhood: The experience of raising a child without a partner, a central aspect of Norah’s initial situation.
    • Family Dynamics: The patterns of interaction and relationships between family members, a key focus of the novel.
    • Responsibility: The obligation or duty to care for or be accountable for someone or something, a recurring theme for the characters.
    • Redemption: The act of making amends for past mistakes or wrongdoings, sought by characters like Tommy and Kai.
    • Personal Growth: The process of improving oneself through learning, experience, and self-reflection, evident in the characters’ journeys.

    Under Your Skin: The O’Malley Family, Book 1


    Briefing Document: “Under Your Skin (The O’Malley Family Book 1)”

    Overview:

    The provided excerpts introduce the O’Malley family, focusing on the complex relationships between siblings, particularly Norah, Kai, Tommy, and Jimmy, as well as their mother, Lani. The story revolves around themes of family loyalty, responsibility, unexpected pregnancy, and the challenges of navigating adulthood while carrying the weight of past experiences. The characters grapple with difficult decisions, family secrets, personal growth, and attempts to forge their own paths.

    Main Themes and Ideas:

    • Family Dynamics and Loyalty: The O’Malley siblings exhibit a strong, albeit often turbulent, bond. They support each other but also clash frequently, revealing a deep-seated history of shared experiences and expectations.
    • Example: Several interactions highlight the siblings’ willingness to intervene in each other’s lives, even when unwanted. Norah often finds herself helping her brothers.
    • Unexpected Pregnancy and Adoption: Norah’s unexpected pregnancy and subsequent exploration of adoption are central to the excerpts. The excerpts show how everyone around Norah is affected by her decision. The family members start talking to adoption agencies and are trying to find a suitable family for the baby.
    • Personal Growth and Responsibility: Characters, especially Kai and Norah, are shown grappling with their individual responsibilities. Kai is dealing with the financial strains of running his own business. Norah confronts the need to make significant life choices related to her pregnancy.
    • Example: Norah makes decisions about the adoption based on the best family for the baby.
    • Past Trauma and its Lingering Effects: There is a sense of a shared family history that continues to impact the characters’ present lives. The family seem to have gone through hardship and tragedy.
    • Complex Relationships: There are several complex relationships discussed in this excerpt. Norah is navigating her relationships with multiple men who are trying to help her, including Kai, Tommy, and Jimmy.

    Key Characters and Plot Points:

    • Norah: Pregnant, independent, and grappling with the decision of whether to keep the baby or pursue adoption. She is a central figure around whom much of the plot revolves. “Maybe she was a chicken because she not only asked him to do it, but she would actually let him.”
    • Kai: A tattoo artist, seems to be the most responsible sibling and is helping Norah with her choices.
    • Tommy: Involved in hockey and seems to be an emotional support for the family. He seems to be closest to Kai.
    • Jimmy: Seems supportive, but he also faces his own personal issues.
• Lani: The mother. “It was hard to believe that barely two months ago she’d gotten out of the hospital with her knee replacement.”

    Quotes Illustrating Key Themes:

    • On Family: “Besides, the fact that maybe kill me if I didn’t?”
    • On unexpected situations: “You’re talking like a crazy woman. She’s pregnant with another man’s child.” “Is that man around? I think not, or she wouldn’t have spent her day with me.”
    • On adoption decisions: “I want someone who wants my little girl. I want them to be from this area so I can see her. I want two parents. Definitely.”

    Possible Conflicts and Questions Raised:

    • How will Norah’s adoption decision impact her relationships with her family and potential adoptive parents?
    • Can Kai manage to overcome his past and present challenges and find a stable path forward?

    Overall Tone:

    The excerpts convey a tone that blends humor, tenderness, and underlying tension. The characters’ interactions are often laced with wit and sarcasm, but there’s also a sense of vulnerability and genuine care beneath the surface.

    This briefing document summarizes the core themes, characters, and potential conflicts presented in the provided excerpts from “Under Your Skin (The O’Malley Family Book 1)”.

    Under Your Skin: O’Malley Family FAQs

    FAQ: Under Your Skin (The O’Malley Family Book 1)

    • What is the central conflict or challenge that Norah faces?
    • Norah is dealing with an unplanned pregnancy and is struggling to figure out her next steps. She is also figuring out if adoption may be the best path forward for herself and the child. She also has to deal with a variety of strong opinions from her family.
    • How is Kai’s artistic ability presented in the story?
    • Kai is depicted as a talented tattoo artist. His work at Ink Envy is sought after, and the narrative highlights his skill in both designing and executing tattoos.
    • What role does family play in the characters’ lives?
    • Family is a dominant theme, with close-knit sibling relationships and strong familial expectations influencing the characters’ decisions and behaviors. The O’Malley family is very involved in each others’ lives, even when it may not be wanted.
    • How does the story portray the challenges of adulthood?
    • The characters grapple with issues like unplanned pregnancy, career aspirations, financial struggles, and complicated romantic relationships, reflecting the complexities and uncertainties of early adulthood.
    • What is Ink Envy, and why is it significant?
    • Ink Envy is the tattoo shop where Kai works. It serves as both his workplace and a space where the characters interact and their stories unfold. The tattoo shop is where Kai is able to be artistically productive, as well as support himself financially.
    • What are the key traits of the O’Malley brothers, and how do they differ?
    • The O’Malley brothers—Tommy, Jimmy, and Kai—each possess distinct personalities. Tommy seems to be the responsible caretaker, Jimmy provides support and commentary, and Kai is focused on his art and working.
    • How are themes of independence and dependence explored in the story?
    • The characters navigate a balance between asserting their independence and relying on family for support, demonstrating the tension between self-reliance and interconnectedness. Norah has moments of being highly independent, and then other moments when she seeks the love and support of her family.
    • What is the significance of the book’s title, “Under Your Skin”?
    • The title, “Under Your Skin,” could have multiple meanings. It refers literally to the art of tattooing, but it also symbolizes the way that family history, relationships, and emotions permeate and shape the characters’ identities and experiences. It is a reference to how close the O’Malley family is to each other.

    Pregnancy Anxieties in “Under Your Skin”

    Some characters in “Under Your Skin (The O’Malley Family Book 1)” experience anxieties related to pregnancy.

    Examples of pregnancy anxieties:

    • Avery is worried about how her body will change.
    • Avery is concerned that her hormones are affecting her negatively.
    • Norah is scared about the possibility of having twins.
    • Norah expresses concern about the changes in her life as a result of the pregnancy.
    • Norah worries about how her family will adjust and whether she will have the support she needs.
    • Norah reflects on whether she is ready to be a mother.
    • Moira reflects on the beginning of her pregnancy.

    O’Malley Family Dynamics: Pregnancy, Relationships, and Conflict

    The source text reveals a complex web of family dynamics, including those influenced by pregnancy and its related anxieties. Various relationships and interactions within the O’Malley family are depicted:

    • Sibling Relationships: The source text illustrates sibling relationships. For example, Norah has brothers, and their interactions range from supportive to overprotective. There are tensions and caring moments between siblings.
    • Parent-Child Dynamics: The text refers to parent-child dynamics, showing the complexities and potential for conflict. Characters reflect on their relationships with their parents, and the impact their parents had on their lives.
    • Extended Family: Interactions with extended family members, like aunts and cousins, also shape the family dynamic. The O’Malley family appears very involved in each others’ lives.
    • Impact of Pregnancy on Family Dynamics: Pregnancy is a central theme that influences family relationships. The characters discuss and debate the impact of unplanned pregnancies. The family members respond differently to the pregnancies. Some family members are supportive, while others are judgmental or concerned. The impending arrival of a new baby also stirs up anxieties and prompts reflections on family history and future.
    • Family Support and Conflict: The source text also highlights instances of family members supporting each other through difficult times. However, there are conflicts and disagreements within the family. These conflicts sometimes stem from differing opinions about how to handle pregnancies or other life challenges.
    • Loyalty and Protection: Despite the conflicts, there is a strong sense of family loyalty and a willingness to protect one another. Siblings rally to support each other, and parents want the best for their children.
    • Changing Roles: The source shows the changing roles within the family as members navigate new relationships, pregnancies, and personal growth. Characters grapple with their identities as parents, siblings, and individuals within the family.

    Personal Growth and Relationships: Navigating Life’s Challenges

    The characters in the source text experience personal growth as they navigate complex relationships, pregnancies, and various life challenges.

    Aspects of personal growth depicted in the source text:

    • Overcoming Past Trauma: Characters grapple with past traumas and work towards healing and moving forward.
    • Identity and Self-Discovery: Characters reflect on their identities and strive toward self-discovery. They consider their roles within the family and as individuals.
    • Changing Relationships: Characters navigate changing relationships and the evolving roles of family members.
    • Taking Responsibility: Some characters make an effort to take responsibility for their actions and decisions.
    • Emotional Maturity: The source text shows characters developing emotional maturity through introspection and self-reflection. They learn to understand their feelings and motivations.
    • Letting Go: Characters learn to let go of past grievances, forgive others, and move forward.
    • Confronting difficult situations: Characters confront difficult situations and make tough choices. They show resilience in the face of adversity.

    Under Your Skin: Relationship Struggles in the O’Malley Family

    In “Under Your Skin (The O’Malley Family Book 1),” characters experience relationship struggles stemming from various sources such as family dynamics, personal growth challenges, and pregnancy anxieties.

    Relationship struggles include:

    • Impact of Family Dynamics: Characters experience relationship struggles stemming from family dynamics. For example, siblings’ involvement in each other’s lives can lead to tension. Differing opinions on handling pregnancies can also cause conflict among family members.
    • Personal Growth Challenges: Characters’ individual journeys of self-discovery and healing from past traumas can create friction in relationships. Differing levels of emotional maturity or commitment to taking responsibility can also lead to misunderstandings and disagreements.
    • Pregnancy-Related Anxieties: The anxieties surrounding pregnancy, such as concerns about body image and the future, can strain relationships. The source text shows characters grappling with unplanned pregnancies and the adjustments required.

    Under Your Skin: Purpose, Relationships, and Growth

    The characters in “Under Your Skin (The O’Malley Family Book 1)” grapple with finding purpose while navigating relationship struggles, family dynamics, personal growth, and pregnancy anxieties.

    Examples of characters finding purpose:

    • Taking Responsibility: Characters strive to take responsibility for their actions and decisions, suggesting a search for purpose through accountability and maturity.
    • Confronting Difficult Situations: Characters confront difficult situations and make tough choices, indicating they find purpose by facing adversity and demonstrating resilience.
    • Personal Growth and Self-Discovery: Characters reflect on their identities and consider their roles within their families and as individuals, indicating a journey toward finding purpose through understanding themselves.
    • Supporting Family: Despite conflicts, a strong sense of family loyalty and a willingness to protect one another are present, suggesting that characters find purpose in supporting and caring for their families.
    • Defining Relationships: Characters navigate changing relationships and evolving roles within their families, showing they seek purpose by adapting to new dynamics and defining their place within them.
    • Healing from Trauma: Characters grapple with past traumas and work toward healing and moving forward, implying they find purpose in overcoming adversity and seeking a better future.

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Power BI Data Transformation, Visualization, and Drill Down

    Power BI Data Transformation, Visualization, and Drill Down

    The text is a series of excerpts describing the process of creating data visualizations and dashboards using Power BI. It guides users on how to import, transform, and model data from sources like Excel using Power Query. The text covers topics such as cleaning data, creating relationships between tables, and using DAX functions to perform calculations. Various chart types are explored, along with techniques like drill-downs, conditional formatting, and grouping data using bins and lists. The final portion focuses on building a comprehensive dashboard from survey data, including considerations for layout and theme customization.

    Power BI Mastery: A Comprehensive Study Guide

    Quiz: Short Answer Questions

    1. What is Power BI and what is its primary function? Power BI is a data visualization tool within the Microsoft ecosystem used for creating interactive dashboards and reports. Its primary function is to transform raw data into insightful visuals for better decision-making.
    2. Where can you download Power BI Desktop and is it free? Power BI Desktop can be downloaded from the Microsoft Store, or from a direct download link. It is available for free.
    3. Name at least three different data sources that Power BI can connect to. Power BI can connect to various data sources including Excel workbooks, SQL databases, and online services like Google Analytics.
    4. What is Power Query and why is it important in Power BI? Power Query is a data transformation tool within Power BI. It is important because it allows users to clean, reshape, and transform data before creating visualizations.
    5. How do you rename a column in Power Query and where can you view the steps you have taken to edit data? In Power Query, a column can be renamed by double-clicking on its header and typing in the new name. The applied steps appear in the “Applied Steps” pane on the right side of the Power Query Editor window.
    6. How do you remove a filter that you have applied to data in Power Query? To remove a filter, locate the “Filtered Rows” step in the “Applied Steps” pane and click the “X” icon next to it.
    7. What are the three main tabs in Power BI Desktop and what is the primary function of each? The three main tabs are: Report (for creating visualizations), Data (for viewing and managing data), and Model (for defining relationships between tables).
    8. Give a brief overview of the importance of the “Model” tab in Power BI? The Model tab is important for defining relationships between different tables or data sources. These relationships allow you to create more complex and accurate visualizations by combining data from multiple sources.
    9. Describe what drill down is. What are the three drill down effects that Power BI offers? Drill down lets you explore data at increasingly granular levels within a single visualization. The three drill down features are: “turn on drill down,” “go to next level in the hierarchy,” and “expand all down one level in the hierarchy.”
    10. What is conditional formatting and what are some options for displaying conditional formatting? Conditional formatting highlights data points based on specific criteria, making it easier to identify patterns and outliers. Some options for displaying conditional formatting include background color, font color, data bars, and icons.

    Essay Format Questions

    1. Discuss the importance of data transformation using Power Query in the Power BI workflow. Provide examples of common data transformation tasks and explain how they contribute to creating accurate and meaningful visualizations.
    2. Explain the significance of the “Model” tab in Power BI, focusing on how relationships between tables are created and managed. Discuss the different types of cardinalities and cross-filter directions, and how they impact data analysis.
    3. Compare and contrast aggregator functions and iterator functions (like SUM vs. SUMX) in DAX. Provide specific examples of when each type of function would be most appropriate and explain how they differ in their evaluation context.
    4. Describe the various types of visualizations available in Power BI and provide examples of scenarios where each would be most effective. Consider the strengths and weaknesses of each visualization type and how they can be used to convey different types of information.
    5. Explain the purpose and application of conditional formatting in Power BI reports. Discuss the different conditional formatting options available and provide examples of how they can be used to highlight key trends and outliers in the data.

    Glossary of Key Terms

    • Power BI: A Microsoft business analytics service that provides interactive visualizations and business intelligence capabilities.
    • Data Visualization: The graphical representation of information and data.
    • Dashboard: A visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen.
    • KPI (Key Performance Indicator): A measurable value that demonstrates how effectively a company is achieving key business objectives.
    • Power BI Desktop: A free Windows application for creating interactive dashboards and reports.
    • Data Source: The location where the data being used originates (e.g., Excel file, SQL database).
    • Power Query: A data transformation and preparation engine used within Power BI to clean, reshape, and enrich data.
    • Data Transformation: The process of converting data from one format or structure into another.
    • Applied Steps: A record of each data transformation step performed in Power Query.
    • Report Tab: The area in Power BI Desktop where visualizations are created and arranged.
    • Data Tab: The area in Power BI Desktop where the underlying data can be viewed and managed.
    • Model Tab: The area in Power BI Desktop where relationships between tables are defined.
    • Cardinality: Defines the relationship between two tables (e.g., one-to-many, one-to-one).
    • Cross-filter Direction: Determines how filters applied to one table affect related tables.
    • DAX (Data Analysis Expressions): A formula language used in Power BI for calculations and data analysis.
    • Measure: A calculation performed on data, typically aggregated (e.g., sum, average).
    • Column: A field in a table containing a specific attribute of the data.
    • Aggregator Function: A DAX function that calculates a single value from a column of data (e.g., SUM, AVERAGE, MIN, MAX).
    • Iterator Function: A DAX function that evaluates an expression for each row in a table (e.g., SUMX, AVERAGEX).
    • Drill Down: A feature that allows users to explore data at increasingly granular levels within a visualization.
    • Bin: A grouping of continuous data into discrete intervals or categories.
    • List: An ordered collection of values or items.
    • Conditional Formatting: Highlighting data points based on specific criteria, making it easier to identify patterns and outliers.

    Power BI Tutorial: Data Visualization and Analysis Guide

    The following briefing document summarizes the key themes and ideas from the provided transcript of the Power BI tutorial.

    Briefing Document: Power BI Tutorial Series

    Overview:

    This document summarizes a comprehensive Power BI tutorial series focused on data visualization and analysis. The series aims to take viewers from complete beginners to proficient Power BI users, covering essential skills from data acquisition and transformation to creating interactive dashboards. The content emphasizes practical application and encourages viewers to follow along using provided sample datasets.

    Main Themes & Key Ideas:

    • Power BI as a Leading Data Visualization Tool: The tutorial positions Power BI as a prominent tool within the Microsoft ecosystem.
    • “powerbi is one of the most popular data visualization tools in the world of course it’s within the Microsoft ecosystem”
    • Hands-on Learning Approach: The tutorial emphasizes a practical, hands-on approach, encouraging viewers to download datasets and actively participate in the exercises.
    • “I’m going to leave the Excel that I’m going to be using in the description you can go and download it and walk through this with me”
    • Data Acquisition and Connectivity: A significant portion focuses on connecting to various data sources, highlighting the flexibility of Power BI.
    • “it’s going to give us a lot of different options for where we can get data from… you have a ton of options there’s databases… SQL databases… Google analytics”
    • Power Query for Data Transformation: The tutorial introduces Power Query as a crucial tool for cleaning, shaping, and transforming data before visualization.
    • “it’s going to take us to powerbi power query which is going to allow us to transform our data”
    • “this is the window to basically transform your data and get it ready for your visualizations”
    • Applied Steps in Power Query: Emphasis on the importance of the “Applied Steps” feature in Power Query for tracking and modifying data transformations.
    • “everything that you do every single step that you apply to transform this data is going to be right over here and if I want to … I can just click X and it is going to get rid of that”
    • Data Modeling and Relationships: Connecting multiple data tables and defining relationships between them is covered.
    • “this is especially useful when you have multiple tables or multiple excels and you need to join them to kind of connect them together”
    • DAX (Data Analysis Expressions): Introduction to DAX functions for creating calculated columns and measures.
    • “What we’re going to be using are these new measures and new columns to create create our Dax functions”
    • Aggregator vs. Iterator functions (SUM vs. SUMX): Explains the difference between aggregate functions (operate on an entire column) and iterator functions (operate row by row).
    • Conditional Formatting: Applying visual cues (colors, icons, data bars) to highlight trends and patterns in data.
    • Drill-Down Functionality: Creating hierarchical visualizations that allow users to explore data at different levels of detail.
    • Lists and Bins: Grouping data using Lists and Bins to aid in visualization and create cohorts.
    • Visualization Techniques: Stacked Bar Chart, 100% Stacked Column Chart, Line Chart, Clustered Column Chart, Scatter Chart, Donut Chart, Cards and tables are covered and used throughout the tutorial.
    • “the very first one that we’re going to start with probably the easiest one and the one that you’ll recognize the most is a stacked bar chart”
    • Project-Based Learning: The series culminates in a final project using real-world survey data from data professionals.
    • Customizing Dashboards: Demonstrates how to improve the look and feel of dashboards using themes and color schemes.

    Important Ideas and Facts:

    • Power BI Desktop can be downloaded for free.
    • “we’re going to click this download free button”
    • Power Query Editor is used to transform data.
    • “this is the window to basically transform your data and get it ready for your visualizations”
    • Data relationships are crucial for combining data from multiple sources.
    • DAX functions are essential for creating calculations and performing advanced analysis.
    • Drill-down functionality allows for interactive exploration of data.
    • “what happens is is someone some stakeholder in our company is saying hey Alex we want this and we want to know we want to drill down on on this IP address”
    • Bins can be used to create groups of data.
    • Conditional formatting enhances the readability and impact of visualizations.
    • “it’s just better to have these simple visualizations on this table rather than just having the numbers themselves makes it a little bit more easy to read and understand stand”
    • The choice of visualization depends on the data and the insights you want to convey.

    Quotes:

    • “By the end of this video you’re going to be an expert in powerbi you’re going to be creating all sorts of dashboards and kpis and reports and you’re going to be sending all of them to me so I can be really impressed.” (Illustrates the goal of the tutorial.)
    • “Everything that you do every single step that you apply to transform this data is going to be right over here and if I want to if I go back and I say you know I really didn’t want to rename that column I can just click X and it is going to get rid of that and take it back to its original state” (Highlights the flexibility and ease of data transformation in Power Query.)

    Conclusion:

    The Power BI tutorial provides a comprehensive guide for users of all levels. By focusing on practical skills, real-world examples, and hands-on exercises, the series equips viewers with the knowledge and confidence to effectively use Power BI for data analysis and visualization.

    Power BI: Quick Answers and Usage Guide

    Power BI FAQ

    Here’s an 8-question FAQ based on the provided source material:

    1. What is Power BI and why is it useful?

    Power BI is a data visualization tool from Microsoft. It allows users to create interactive dashboards, KPIs (Key Performance Indicators), and reports from various data sources. Its usefulness stems from its ability to transform raw data into understandable visual formats, providing actionable insights for decision-making. If your organization uses Microsoft products, there is a good chance you already have access to it.

    2. How do I download Power BI Desktop?

    You can download Power BI Desktop for free from the Microsoft Store. The source material recommends a download link in its description for direct access. Once on the store page, click the “Download” button to begin the installation process.

    3. What types of data sources can Power BI connect to?

    Power BI offers a wide array of data connection options, including Excel workbooks, SQL databases, cloud services (like Azure Blob Storage), and online platforms (like Google Analytics). Some connectors are free, while others may require an upgrade.

    4. What is Power Query Editor and why is it important?

    Power Query Editor is a data transformation interface within Power BI. It allows you to clean, reshape, and transform data before creating visualizations. You can rename columns, filter rows, change data types, and perform other data manipulation tasks. Every transformation step is recorded, allowing you to easily modify or undo changes.

    5. How do I create a basic visualization in Power BI?

    To create a visualization:

    1. Import your data into Power BI Desktop.
    2. Navigate to the Report tab.
    3. Select the type of chart you want to create from the Visualizations pane.
    4. Drag and drop the desired fields (columns) from your data into the appropriate areas of the chart (e.g., axis, values, legend).

    6. What are relationships in Power BI and how do I create them?

    Relationships define how tables in your data model are connected. They’re crucial when working with multiple tables, as they enable Power BI to combine data from different sources.

    To create a relationship:

    1. Go to the Model tab.
    2. Drag a field from one table onto the corresponding field in another table. Power BI will automatically attempt to create the relationship; double-click it to edit the relationship’s properties.

    Cardinality (one-to-one, one-to-many, many-to-many) and cross-filter direction (single, both) are important properties to configure. With the single setting, filters flow in only one direction, typically from the “one” side to the “many” side; with the both setting, filters propagate in both directions, so filtering either table affects the other.
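Power BI manages relationships internally, but the effect of a one-to-many relationship with single-direction filtering can be sketched in plain Python; the tables, columns, and values below are invented for illustration.

```python
# Toy one-to-many relationship: one product row relates to many sales rows.
products = [
    {"product_id": 1, "category": "Food"},
    {"product_id": 2, "category": "Drink"},
]
sales = [
    {"sale_id": 10, "product_id": 1, "amount": 5.0},
    {"sale_id": 11, "product_id": 1, "amount": 7.5},
    {"sale_id": 12, "product_id": 2, "amount": 3.0},
]

# With a single cross-filter direction, a filter applied on the "one" side
# (products) propagates to the "many" side (sales): keep only Food sales.
food_ids = {p["product_id"] for p in products if p["category"] == "Food"}
food_sales = [s for s in sales if s["product_id"] in food_ids]
total = sum(s["amount"] for s in food_sales)
print(total)  # 12.5
```

Filtering the sales table would not, under the single setting, filter products in return; that is what switching to the both setting changes.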

    7. What are DAX measures and columns?

    DAX (Data Analysis Expressions) is a formula language used in Power BI.

    • Measures: Calculations performed on the fly, typically used to aggregate data (e.g., sum, average, count). They are not stored in the data model.
    • Columns: New columns created in your data model that contain calculated values for each row.

    Examples of DAX functions include SUM, AVERAGE, COUNT, and IF. Aggregator functions (like SUM) and iterator functions (like SUMX) behave differently, with iterator functions performing calculations on each row of a table. Date functions such as DAY can also be used in DAX expressions.
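DAX itself cannot run outside Power BI, but the aggregator-versus-iterator distinction can be mimicked in plain Python; the toy columns `quantity` and `unit_price` below are invented for illustration.

```python
# Rows of a toy sales table.
rows = [
    {"quantity": 2, "unit_price": 10.0},
    {"quantity": 3, "unit_price": 4.0},
]

# Aggregator (like DAX SUM): operates on a single existing column.
total_quantity = sum(r["quantity"] for r in rows)

# Iterator (like DAX SUMX): evaluates an expression per row, then sums.
revenue = sum(r["quantity"] * r["unit_price"] for r in rows)

print(total_quantity)  # 5
print(revenue)         # 32.0
```

The iterator is needed here because no single column holds `quantity * unit_price`; the expression must be computed row by row before aggregating.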

    8. What is Drill Down and how can I use it in my visuals?

    Drill Down allows users to explore data at different levels of detail within a visualization. To use it, add multiple fields to a hierarchy in a chart’s axis. When Drill Down is activated, clicking on a data point will “drill down” to the next level in the hierarchy, showing more granular data. Useful for presenting data with layered levels of information.
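Outside Power BI, drill down amounts to aggregating the same records at successively deeper hierarchy levels. A minimal Python sketch follows; the year/quarter fields and the `rollup` helper are invented for illustration.

```python
from collections import defaultdict

# Each record carries a Year -> Quarter hierarchy, like a drill-down axis.
records = [
    {"year": 2023, "quarter": "Q1", "sales": 100},
    {"year": 2023, "quarter": "Q2", "sales": 150},
    {"year": 2024, "quarter": "Q1", "sales": 120},
]

def rollup(records, *levels):
    """Total sales grouped at the requested hierarchy depth."""
    totals = defaultdict(int)
    for r in records:
        key = tuple(r[level] for level in levels)
        totals[key] += r["sales"]
    return dict(totals)

print(rollup(records, "year"))             # top level of the hierarchy
print(rollup(records, "year", "quarter"))  # after drilling down one level
```

Clicking a data point with drill down enabled is, conceptually, re-running the aggregation with one more level included and the clicked value fixed.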

    Power BI Data Transformation with Power Query

    Data transformation within Power BI involves using Power Query to prepare data for visualization. Power Query Editor allows for a variety of transformations.

    Key aspects of data transformation include:

    • Accessing the Power Query Editor: Select “Transform Data” to open the Power Query Editor and begin preparing data.
    • Applied Steps: Every transformation step is documented in the “Applied Steps” pane, enabling review or removal of changes.
    • Common transformations:
    • Changing data type: Data types can be changed by clicking the icon in the column header.
    • Filtering rows: Filter rows to remove null values or specific values.
    • Removing columns or rows: Columns or top rows can be removed.
    • Renaming columns: Columns can be renamed by double-clicking the header.
    • Using first row as headers: The first row can be promoted to column headers.
    • Unpivoting columns: Convert columns into rows by selecting them and using the “Unpivot Columns” option on the Transform tab.
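Several of these steps have rough plain-Python equivalents. The sketch below, with invented column names and values, mirrors promoting the first row to headers, filtering out nulls, changing a data type, and renaming a column:

```python
# A toy table as Power Query would see it: the first row holds the headers.
raw = [
    ["Store", "Product", "Price"],
    ["A", "Rice", "3.50"],
    ["B", "Milk", None],
    ["B", "Beans", "2.00"],
]

# Use first row as headers.
headers, *body = raw
rows = [dict(zip(headers, r)) for r in body]

# Filter rows: drop records with a null Price.
rows = [r for r in rows if r["Price"] is not None]

# Change data type: Price from text to number.
for r in rows:
    r["Price"] = float(r["Price"])

# Rename a column: Price -> Unit Price.
rows = [{("Unit Price" if k == "Price" else k): v for k, v in r.items()}
        for r in rows]

print(rows)
```

In Power Query each of these would appear as its own entry in the “Applied Steps” pane, removable independently; here they are simply sequential statements.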

    Salary Data Analysis and Visualization in Power BI

    The sources provide information on how salary data can be analyzed and visualized in Power BI, including transforming salary data and analyzing average salaries based on various factors.

    Key aspects of salary analysis discussed in the sources:

    • Data Transformation for Salary Analysis: Salary data often needs transformation to be usable, especially when provided as a range.
    • Splitting Columns: Salary ranges can be split into separate columns representing the lower and upper bounds of the range.
    • Data Type Conversion: Convert text data to numeric data types to enable calculations.
    • Calculating Average Salary: Create a new column to calculate the average salary from the range by summing the lower and upper bounds and dividing by two.
    • Salary Analysis and Visualization:
    • Average Salary by Job Title: Calculate and visualize the average salary for different job titles using a clustered bar chart.
    • Average Salary by Sex: Visualize the average salary for males and females using a donut chart.
    • Impact of Country on Salary: A tree map can be used to filter salary data by country, acknowledging that the average salary varies significantly depending on the country.
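The range-splitting and averaging steps described above can be sketched in Python; the text format “45,000-60,000” is assumed here for illustration.

```python
def average_of_range(salary_range):
    """Split a text range like '45,000-60,000' into numeric lower and
    upper bounds and return their midpoint, mirroring the Power Query
    steps: split column, convert text to number, average the bounds."""
    low_text, high_text = salary_range.split("-")
    low = float(low_text.replace(",", ""))
    high = float(high_text.replace(",", ""))
    return (low + high) / 2

print(average_of_range("45,000-60,000"))  # 52500.0
```

In Power BI each step (split column by delimiter, change type, add a custom column) is a separate applied step; the function above just collapses them into one pass.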

    Programming Language Popularity Among Data Professionals

    The sources discuss how to analyze the popularity of programming languages among data professionals using Power BI.

    Key aspects include:

    • Identifying Favorite Programming Languages Survey data can be used to determine the preferred programming languages of data professionals.
    • Data Transformation The survey may include an “Other” option where respondents can enter their preferred language. This necessitates splitting the column to separate the pre-selected languages from the write-in languages.
    • Visualization: A clustered column chart can effectively display the count of votes for each programming language.
    • The visualization can be enhanced by including job titles, allowing for a breakdown of language preferences by profession. For example, it can show which languages are favored by data analysts versus data scientists.

    Power BI Analysis of Survey Demographics

    The sources contain information regarding the collection, transformation, and visualization of survey demographics using Power BI.

    Key aspects of survey demographics discussed in the sources:

    • Data Collection The data was collected via a survey of data professionals. The survey collected information such as job titles, salary, industry, programming language preferences, and demographic information including age, sex, and country of residence.
    • Data Transformation Several transformations were performed on the raw survey data within Power BI’s Power Query Editor to prepare it for analysis. These transformations included:
    • Splitting columns The ‘Job Title’ and ‘Favorite Programming Language’ columns were split to separate pre-defined options from free-text entries, simplifying analysis.
    • Calculating average salary Salary ranges were split into lower and upper bounds, and a new column was created to calculate the average salary.
    • Demographic Visualizations The transformed data was used to create visualizations to analyze survey demographics:
    • Average Age A card visualization was used to display the average age of survey respondents.
    • Country of Residence A tree map was used to visualize the distribution of survey respondents by country. This allows users to filter the data and examine other variables by country.
    • Sex A donut chart was considered to visualize the distribution of male and female respondents and their average salaries.
    • Difficulty to Break into the Field A pie chart was used to visualize the distribution of how easy or difficult it was to break into the data field.
    • Interactivity Visualizations such as the tree map showing the “Country of Survey Takers” allows users to click on a country and see how the other visualizations change based on that selection.
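A card visual boils a column down to a single aggregated number; the “average age” card described above is conceptually just this (the ages below are invented):

```python
from statistics import mean

# Toy list of respondent ages; a card visual shows one aggregated value.
ages = [25, 31, 28, 40, 36]
average_age = round(mean(ages), 1)
print(average_age)  # 32.0
```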

    Power BI: Data Visualization Techniques and Best Practices

    The sources cover various aspects of data visualization using Power BI, from basic chart creation to more advanced techniques and considerations.

    Data Visualization Options and Usage

    • Basic Chart Creation:
    • Stacked Bar/Column Charts: Useful for comparing different categories and their composition. These can represent customer purchase breakdowns, showing what percentage of purchases come from specific products.
    • Clustered Bar/Column Charts: Useful for comparing values across different categories.
    • Line Charts: Effective for visualizing trends over time, especially with date-related data.
    • Pie/Donut Charts: While sometimes discouraged due to difficulty in comparing slice sizes, they can be used to show proportions.
    • Cards: Display single values, like total survey takers or average age, for quick insights.
    • Tables: Display data in a tabular format.
    • Scatter Charts: Useful for identifying outliers and trends in data.
    • Advanced Visualization Techniques:
    • Combination Charts: Combine different chart types (e.g., line and clustered column) to display multiple aspects of the data in one visualization.
    • Conditional Formatting: Use rules, color gradients, and icons to highlight data within tables or charts.
    • Data bars: Data bars can visually represent values within a table, making it easier to compare magnitudes.
    • Drill Down: Allows users to explore data at different levels of granularity within a visualization.
    • Gauges: Visualize survey data, showing average scores and satisfaction levels.
    • Tree Maps: Visualize hierarchical data, allowing users to click through different levels for more details.

    Key Considerations for Effective Data Visualization:

    • Choosing the Right Visual: Different chart types are suited for different data types and analytical goals.
    • Customization: Visual elements like titles, labels, colors, and data presentation should be customized to enhance clarity and readability.
    • Data Transformation: Data often needs to be transformed and cleaned before visualization to ensure accurate and meaningful representations.
    • Interactivity: Incorporate interactive elements like drill-down to allow users to explore the data.
    • Color Coordination: Choosing appropriate color schemes and themes can significantly improve the visual appeal and effectiveness of a dashboard.
    • Clear Titles and Labels: Use clear and descriptive titles and labels to ensure the audience understands the visualization.
    • Summarization: Instead of “Don’t Summarize,” choose Sum, Average, Minimum, or Maximum to derive insights.
    • Conditional Formatting: Add background colors based on gradient or rules, data bars, and icons.
    • Drill Down: Can be enabled to present data at different levels.
    • Bins and Lists: Numeric and date data can be grouped using bins. Lists can group customer names.
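Binning a numeric column means mapping each value to a fixed-width interval. A minimal Python sketch, with an invented bin size and toy ages:

```python
from collections import Counter

def bin_of(value, bin_size):
    """Return the lower edge of the fixed-width bin containing value,
    similar in spirit to a Power BI bin of the given size."""
    return (value // bin_size) * bin_size

ages = [21, 24, 37, 42, 44, 58]
counts = Counter(bin_of(a, 10) for a in ages)
print(sorted(counts.items()))  # [(20, 2), (30, 1), (40, 2), (50, 1)]
```

A list, by contrast, is a hand-picked grouping (e.g. named customers combined into one segment) rather than a computed interval.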

    Specific Examples and Applications

    • Survey Data: Visualizing survey responses, such as satisfaction levels, is facilitated through gauge charts.
    • Sales Data: Analyzing sales data and identifying top-performing products and customer segments.
    • Geographic Data: Visualizing data by country using tree maps, enabling comparisons and filtering based on location.
    • Salary Data: Presenting salary distributions and averages, broken down by job title, gender, and country.
    • Programming Language Preferences: A clustered column chart is used to display the count of votes for each programming language.

    Learn Power BI in Under 3 Hours | Formatting, Visualizations, Dashboards + Full Project

    The Original Text

    what’s going on everybody welcome back to another video today we are going to learn powerbi in under 3 [Music] hours now powerbi is one of the most popular data visualization tools in the world of course it’s within the Microsoft ecosystem and so if your company uses any Microsoft products you most likely have access to the Microsoft Suite which includes powerbi I use powerbi for many years as a data analyst and then when I became a manager of analytics I actually switched our entire team over to powerbi and so I know how amazing powerbi can actually be so we’re going to be taking a look at several things in this long long video we’ll start with some of the basics of just creating some visualizations but we’ll dive into a ton of other things as well by the end of this video you’re going to be an expert in powerbi you’re going to be creating all sorts of dashboards and kpis and reports and you’re going to be sending all of them to me so I can be really impressed so without further Ado let’s jump onto my screen and get started all right so the first thing I’m going to do is download powerbi desktop I will leave this link in the description so you can just click on it go to it and download it we’re going to click this download free button and once we click it you can go to the Microsoft store and I already have it downloaded so when you see it uh it’ll already say downloaded but um for you you can go in here you can click download and it will download it for you I’m on Microsoft uh but it may look a little bit different for you if you’re on a different system but once that is done we are going to open up powerbi so let’s go right down here to our search let’s go to powerbi and it is going to open up for us all right so right away this is what it’s going to look like when you open it and we’re going to go right over here to get data and let’s click on that it’s going to open up this window and it’s going to give us a lot of different options for where we can get 
data from now some of these are free and some you need to upgrade from but you just taking a quick glance through here you have a ton of options there’s databases there’s um you know blob storages there’s postr SQL or different SQL databases um there’s Google analytics there’s a lot of places and you can go through the process to connect to that data and you can pull that data in from those data sources now for what we are doing we’re just going to be using an Excel I’m going to leave the Excel that I’m going to be using in the description you can go and download it and walk through this with me so what we’re going to do is click on Excel workbook and we’re going to click connect so we’re going to go right here in our powerbi tutorials folder and we’re going to click on apocalypse food prep so let’s click on that and it is going to connect and pull that data in now right here we have our Navigator and so if you had a lot of different sheets you can click on that and choose which ones to pull in I just clicked on it right over here and we’re able to preview the data but I can’t load or transform it yet I need to select which sheets I’m bringing in so we only have on that’s the only one we’re going to bring in so you can go ahead and load the data or you can click on transform data it’s going to take us to powerbi power query which is going to allow us to transform our data so I’m going to have an entire video on how to transform the data but I’m going to give you a really quick glance at it to kind of show you what it is so right up here it says our power query editor and this is a the window to basically transform your data and get it ready for your visualizations now you can do this in Excel if you want to and do that beforehand or you can do it here and there are lots of things that we can do in here as you can see at the top again I’ll have an entire video dedicated to just power query but let’s take a quick look at the data and see if there’s anything we want 
So over here we have the store where we purchased it, the product that we purchased, the price that we paid, and the date that we bought it. Now, the first thing that jumps out to me is that this column just says "Date" on it; we might want to call it "Date_Purchased", and we're going to hit Enter. If you noticed, right over here in these Applied Steps it says Renamed Columns. Everything that you do, every single step that you apply to transform this data, is going to be listed right over here, and if I go back and say, you know, I really didn't want to rename that column, I can just click the X and it is going to get rid of that step and take it back to its original state. So again, I'm just going to call it "Date_Purchased" and enter that. Now, this is our apocalypse food prep, so this is food that we are buying for the apocalypse for this example, and if we look at our products we have bottled water, canned vegetables, dried beans, milk, and rice. All of that makes sense except for the milk: milk will not last long in the apocalypse. So I think what we're going to do is filter that out really quickly and click OK, and right over here again it says Filtered Rows; now if we scroll down, there's no milk. So what we are going to do is go over here to Close & Apply, and it is going to actually load the data into Power BI Desktop. On the left-hand side it immediately takes us to the Report tab, and what we want to do is go right here to the Data tab and take a look at our data. So again, there's our Date_Purchased, and as you can see, the milk is not in there. Again, the Report tab is where we actually build our visualizations; the Data tab is where we can see the data and change some small things about it, like sorting the columns or even creating a new column.
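Those two Applied Steps are recorded behind the scenes as Power Query M code, which you can see in the Advanced Editor. A minimal sketch of roughly what they look like for this workbook (the step name `Sheet1` and the column names here are my assumptions, not copied from the actual file):

```m
let
    // "Renamed Columns" step: Date -> Date_Purchased
    RenamedColumns = Table.RenameColumns(Sheet1, {{"Date", "Date_Purchased"}}),
    // "Filtered Rows" step: drop the milk, since it won't survive the apocalypse
    FilteredRows = Table.SelectRows(RenamedColumns, each [Product] <> "Milk")
in
    FilteredRows
```

Deleting a step in the Applied Steps pane simply removes the corresponding line from this script.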
Over here we have this other tab called Model, and this is especially useful when you have multiple tables or multiple Excel files and you need to join them, to connect them together. We don't have that here, but in a future video I'm going to walk through how to use this entire tab. So now let's go back to the Data tab; I want to look at the data really quickly before we go over to the Report tab and start building our first visualization. As you can see, I've been buying these different products in different months. This rice I've been purchasing in January, February, March, and April, and I've been buying it from three different locations, because I wanted to see if I was spending less money at one location on all of the products, so I would just shop there in the future and save a lot of money, or whether there were specific products that were really cheap at one location but cheaper at a different location for others, in which case I should just buy, say, the dried beans at Costco but everything else at Walmart. That's what we're going to look at in just a little bit. So let's go over to the Report tab. Right up here at the top there's this Data section, where you can add more data now that we're here; we can also write queries or transform the data like we were looking at in the Power Query Editor window. Over here in Insert we can add a new visualization or a text box, and then in the Calculations section we can create a new measure or a quick measure. Then over here we have Share, where you can actually publish your report or your dashboard online. Now, the Visualizations section on the far right is a very important area; this is where a lot of the actual creating of the dashboards happens. Let's take a quick look, and we'll get into a lot of these things as we're actually building our dashboard, so we're not just sitting here looking and talking; we're going to be actually building and doing.
All right, so we're going to click right here on this dropdown on Sheet1, and it's going to show us all of our columns. Now, two of the things that we wanted to look at were, first, where are we spending the least amount of money buying the exact same products (that'll help us determine where we want to shop), and second, should I be buying all my products at the same place, or are there certain products that are cheaper at a specific store where I should buy them instead? So let's start with the first one: we're just going to select the store and the price to see where we're spending the least amount of money, and just at a quick glance we can see we're spending the least at Costco at $210, versus Target at $219 and Walmart at $225. That really answers our question, but we want to visualize it better, to be able to see it in an easier way. So we're going to go right over here; we could click on a lot of these, but the one that probably makes the most sense is the stacked column chart, and it's going to show Walmart, Target, and Costco. Now, they're all the same color, so let's add a legend: we're just going to drag Store over here down to this Legend field, and let's make the chart larger while we're working on it. Now we can see we're spending the most amount of money at Walmart, right in between at Target, and the lowest at Costco, so right there we know that Costco is the place to go for our apocalypse food prep. But is it going to be that way for every product? I don't know; let's take a look. Let's put this chart up in this corner and start a new one. We're going to need to select the product for sure, and the price, and probably additionally the store as well. Let's click on, not this one, a clustered column chart; that's what we need. Let's bring this over here and expand it quite a bit. And really, at a glance this is giving us everything that we need: we can see each product right here, and we can see how much we're paying per store.
And so for rice, it looks like we're paying a lot more at Walmart, while Target is actually where we are paying the least. Now, if we look at all of these, it looks like for Costco the only one where we're really paying a lot more is the rice, but for our dried beans and our bottled water we're paying quite a bit less. It's really pretty negligible for these canned vegetables: we're paying maybe 50 or 60 cents more per can, so that's pretty negligible, but for the big-ticket items we're really spending a lot less at Costco. If we wanted to save just a little bit more money, we could go to Target for our rice. Now, if I want to make this more like a dashboard, and we're only keeping these two visuals, I'm going to size them kind of like this (whoops; I'll show you that in a little bit); I'm going to size them a little bit like this. So now that we have that looking good, we want to change the title of both of these. What we're going to do is go over here in our Visualizations to Format your visual, go to General, then Title, and now we can name it anything we really want; for this we're going to say "Best Store for Product". While we're in here, one other thing that I want to do is go to Visual and right down here to Data labels. We haven't added any data labels, so I'm going to turn them on, and you'll see exactly what it does: it just puts the labels and the numbers above the columns so you don't have to actually hover over them to see the values. Now, it is actually rounding these numbers, so what we're going to do is go down to Values, then down to Display units; it's on Auto, so it's auto-rounding those numbers, and we're just going to set it to None so we can see the actual values. We can do the exact same thing over here; it probably is a good thing to do, and it's just going to visualize it a little bit differently in here.
But you can always change that if you want to. Let's go over here to Title and say "Total by Store", and now let's take a look. So in a matter of minutes we were able to take our data from Excel, put it into Power BI, transform it a little bit, and then create these visualizations that gave us concrete answers to some very important questions. We now know that Costco is the place to go for basically every single product, except that if we're buying rice and want to save just a few dollars, we're going to head over to Target. And that's genuinely going to change my shopping habits for the next several years, until the apocalypse happens.

All right, so before we jump over to Power BI and start using Power Query, I wanted to take a look at the data. This is the Excel file from our last video, called Apocalypse Food Prep, and in that video we went through and bought some rice, some beans, water, vegetables, and milk, all for the apocalypse, getting prepared for that. Now we've decided to buy some additional things: rope, some flashlights, duct tape, and a water filter, several water filters. After we purchased those, our boss, or whoever we're working with, decided to go and make a pivot table. In this pivot table they broke it out by Costco, Target, and Walmart, had all the items, some subtotals, as well as some grand totals right here, and then they decided to copy and paste that into this sheet. You'll see this a lot when you're working with people who use Excel: they like to make things like this, maybe turning them into a table or formatting them a little differently. So this is what we're going to actually pull into Power Query and work with. Now, we're going to imagine that this is all we have, the only thing we're working with; I'll reference this pivot table a little bit, but we're going to pretend this is all we have, and we want to transform it to make it a lot more usable, to where we can make visualizations with it.
So let's hop over to Power BI and pull this Excel file in. What we're going to do is click Import data from Excel, click Apocalypse Food Prep, and click Open, and then it's going to bring up this window right here. Now, this is where we can choose what data to bring in, so we can take a preview; just click on it real quick, and this is the pivot table that we were looking at, so we are able to pull in just a pivot table. And then we have the Purchase Overview, which is that formatted sheet we were just looking at with all the colors. We're going to pull both of those in: the pivot table and the Purchase Overview. Now, we could just load it, or we could transform it, and we're going to click Transform Data, which is going to bring us to Power Query. So now, really quick, before we actually jump into working through this and transforming it, I want to show you what the Power Query Editor looks like. If we go right over here, we have our Queries: these are the tables that we actually pulled in, and we can click on them and go back and forth between them. Up top we have our ribbon, and the ribbon offers a lot of functionality. We have things like Remove Columns, Keep Rows, Remove Rows, and Split Columns; these are all things that we're likely to use when using this Power Query Editor. There's also another tab called Transform, where there's a lot of functionality as well, things like unpivoting a column, transposing columns and rows, and using the first row as headers, some of the things we'll be looking at today. There's also another tab called Add Column, and this one's pretty self-explanatory: you can add additional columns, like an index column, a custom column, or a conditional column; those are the three main ones. There are also View, Tools, and Help, but we're not really going to be looking at those today.
And then on the far right side we have our Query Settings. You can do things like change the name: we'll call it "pivot table 2022", and it'll update right over here on the query side. And we have our Applied Steps. Now, our Applied Steps are extremely important and very, very useful: any time we make any change to transform this data, it's going to be documented right here, and then we can go back and look at it, or we could even delete that change in the future if we want to and go back to a previous version of what we just did. So when we loaded the data into Power BI, it did a few things for us: it chose the Source and the Navigation, it promoted the headers, and then it also changed the data types. If we want to check, we can actually see those things or change them. For example, for this Source step right here, we can click on this little icon, and it's going to bring up the actual path where we got this file, so if we wanted to change that, or if it changes in the future, we can come here and change the file path; but we're not going to do that right now, so let's click Cancel, and let's go back down to Changed Type. So it promoted these headers, and obviously these headers are not correct; we're looking at this pivot table and not the Purchase Overview, but it changed these column headers, and in the future, if we wanted to, we could easily change those. It changed the types as well: if you look right here, the icons range from ABC 123 all the way over to just ABC. ABC means the column is only going to be text, where ABC 123 means it could be basically anything, text or numeric. So now let's go over to Purchase Overview; this is the one that we're actually going to be working on the most, but we might look at the pivot table just a little bit to reference it and see some of the differences. Before we do anything, let's just take a look at how Power BI decided to bring this data in. So it shows this "apocalypse food prep overview" as kind of the first column.
That was kind of our header, or the title of what we were looking at before, and then all these other columns are basically Column1, Column2, Column3, Column4, Column5, so that's something that we're going to want to change in just a little bit. There are also all these blank columns right at the top, and these null values as we go along; we'll take a look at those, and we're going to want to get rid of some of this and clean it up to make it more usable for our Power BI visualizations. This may be perfectly fine and acceptable in Excel, but when you're pulling it into Power BI, the real reason you're pulling it in is to create visualizations, not for it to look good in a spreadsheet, so we're going to need to clean this up quite a bit. So let's go right up top. The first thing that I want to do is get rid of these top rows, so we're going to go to this top ribbon, click Remove Rows, and select Remove Top Rows, and we're going to enter 2, because we have two rows of all nulls, and those are completely useless; we just want to get rid of them right away. Let's click OK, and it removed those. The next thing that we want to do: this Location, Product, and all these dates are actually the column headers that we wanted, so what we need to do now is go over to Transform and choose Use First Row as Headers, and just like that, we have Location, Products, and these dates as our headers, exactly how we wanted them. Now, let's say for whatever reason we made a mistake and needed to go back; we would just delete that step, and that would be perfectly fine. Now, you can see over here it promoted the headers, but it also changed the data type. Before we promoted the headers, these were all ABC 123, because each column had a lot of different data types in it, so Power Query just used a generic data type; but when we promoted these headers, the first thing that it decided to do was also change the data type for us, giving us its best guess as to what each data type is.
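Under the hood, those steps are again a few lines of Power Query M. A rough sketch of what they look like for this sheet (the step name `PurchaseOverview` and the exact column names are assumptions for illustration):

```m
let
    // "Removed Top Rows": skip the two all-null rows at the top
    RemovedTopRows = Table.Skip(PurchaseOverview, 2),
    // "Promoted Headers": use the Location/Products/date row as column names
    PromotedHeaders = Table.PromoteHeaders(RemovedTopRows, [PromoteAllScalars = true]),
    // "Changed Type": Power Query's best guess, which we can override ourselves
    ChangedType = Table.TransformColumnTypes(PromotedHeaders,
        {{"Location", type text}, {"Products", type text}})
in
    ChangedType
```

Undoing a mistake in the UI is just deleting the corresponding step here.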
And it decided on decimal, so this value is a decimal, but we're actually going to change that. All you have to do is click on the 1.2 icon (or whatever data type it currently shows) right here, and we're going to click Fixed Decimal Number and choose Replace Current, and now it's just a little bit better: it shows 2.70, 2.50, and that's normally how we would read values like this, because this is money, so we would normally read it to the second decimal. And if we have it on the second decimal for one column, we should probably have it on the second decimal for all of them, so really quickly I'm going to go through and change that for each; it should be pretty quick, so hang with me for just a second. All right, that is perfect. Now, for the purposes of what we're about to do, we don't actually need these subtotals, this Costco Total, Target Total, and Walmart Total, or the Grand Total; we really want to get rid of those. So what we're going to do is go right over here, click on this dropdown, and try to filter this data before we actually load it into Power BI: we're going to filter and choose Remove Empty, and it's going to take out all of those nulls. If we wanted to filter out rows by value instead, say "Costco Total" or "Target Total", we could do that by clicking this dropdown on Products, going to Text Filters, and choosing Does Not Contain; we type "Total" and click OK, and again, it filtered out all of those rows. So there are a few different options if you want to filter out rows that contain either null values or specific values. Now, the next thing that we're going to do is actually get rid of a column, this Grand Total column.
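Both of those filtering options generate a `Table.SelectRows` step in M. A sketch, continuing from the assumed step names above (note that `Text.Contains` is case-sensitive unless you pass an optional comparer):

```m
let
    // "Removed Empty": drop the rows whose Products value is null
    RemovedEmpty = Table.SelectRows(ChangedType, each [Products] <> null),
    // Text Filters > Does Not Contain: drop any row whose product mentions "Total"
    NoTotals = Table.SelectRows(RemovedEmpty, each not Text.Contains([Products], "Total"))
in
    NoTotals
```

Either approach alone would remove the subtotal and grand-total rows here; the text filter is the safer choice if blank rows might later carry real data.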
So what we're going to do is click on the very top part where it says Grand Total, go back over here to Home, and click Remove Columns. It says "Insert"; that's because we're on this Filtered Rows step right here, but we're just going to insert the step, and it'll insert it right there. That's totally fine; we can just move it to the bottom. Now we've gotten rid of this column entirely. Now, this looks really good visually; I like how this looks, I like how everything is set up. The biggest thing about it is that when you actually want to use this for visualizations, having these dates as columns doesn't really work too well, so what we're going to want to do is unpivot this, so that these dates become rows. What we're going to do is select the first date column, January 1st, then hold Shift and click on April 1st to select all of them at the same time; then we go over to the Transform tab and click Unpivot Columns, and let's see what this does. And so now what we've done is basically recreate our original Excel layout, so let's go back and take a quick look at that. This looks almost identical to what we have in Power BI right now, and this format is extremely usable and very good for visualizations, much, much better than the other layout. But again, we were pretending that this is what we were given at the beginning, so you have to imagine somebody just handing you this and you having to make it much more usable for visualizations, which happens a lot; here we actually wanted to create this, we just weren't given it. Now, a few last things that we might want to do to clean this up a little: we're going to select the Date column's data type and change it to Date, and then we're going to select the Value column. I double-clicked on the Value header because I actually want to rename it.
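In M, the column removal and the unpivot are two calls. When you select columns and click Unpivot Columns, the UI actually records `Table.UnpivotOtherColumns`, listing the columns you did NOT select; that way, if new date columns show up in the source later, they get unpivoted automatically. A sketch under the same assumed names:

```m
let
    // "Removed Columns": drop the Grand Total column
    RemovedColumns = Table.RemoveColumns(NoTotals, {"Grand Total"}),
    // "Unpivoted Columns": every column except Location and Products
    // becomes an attribute/value pair of rows
    Unpivoted = Table.UnpivotOtherColumns(RemovedColumns,
        {"Location", "Products"}, "Date", "Value")
in
    Unpivoted
```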
I want to call this cost, or rather "Product_Cost", and then I want the Location column to be called "Store". So now this looks really good, but I want to show you one thing really quickly on this pivot table 2022, so let's go back here. This looks very similar to how we had it when it first started. One thing I wanted to show you: I want to click on this first row, make it our column header, and then try to unpivot this January, February, March, April. So really quickly, let's do that: we go to Transform, Use First Row as Headers, and now we have January, February, March, April. Now, if you notice, these are not dates; they're actually text. It says January, February, March, and April, so if we go to unpivot, here are the rows that are created: January, February, March, and April. These are not dates, so we cannot go and change this column to a Date type, because that would error out, since it's actually text. It's something that you want to look out for, something that you need to be aware of, and you can change it in the pivot table; you want to be aware of how the data actually sits and looks in Excel, or whatever data source you're pulling from, before you actually pull it into Power Query to transform. And now the very last thing that we need to do to finalize all of this is go over here to Close & Apply; once we click that, everything that we've worked on is going to be applied to the actual data, and it's going to load into Power BI to create our visualizations. So let's go ahead and click on that. Now the data has been pulled into Power BI: let's go right down here to Data, and we can see the data right here. If we need to transform this data again, we can bring it back into the Power Query Editor window by just clicking the Transform Data button, and it's going to bring us right back.

All right, so before we jump over to Power BI and start creating our relationships and our model, I want to take a look at the data in Excel.
We realized we were buying so many products for the apocalypse that we decided to start our own store, and we have several customers and some client information down here. So I wanted to take a look at the columns in these tables that we're going to be looking at. First we have the Apocalypse Store; these are the things that we are selling. I know it's a very limited inventory, but these are the really high sellers, the ones that I wanted to sell, so we have this Product ID, Product Name, Price, and Production Cost. Then we have this Apocalypse Sales table; this is how many sales we've actually made to our customers, so we have this Customer ID, Customer Name, Product ID, Order ID, Units Sold, and the date it was purchased. And then we have our Customer Information right here, with all of our clients: Customer ID, Customer Address, City, State, and Zip Code. So now that we've taken a look at our data, let's go and load it into Power BI. We're going to choose Import data from Excel, choose this model file right here, and click Open, and we are going to want all three of these tables, so I'm going to click on all of them and just load them; we're not going to transform the data at all. Now that the data has been loaded, let's go right over here on the left-hand side to our Model tab, scoot this over just a little bit, and move these tables up to where they're a little bit easier to see. Right off the bat you can already see that there are these lines between these tables: there are already relationships that Power BI has automatically detected and created. From my experience, Power BI actually does a really good job at creating these relationships automatically, but we're going to go in and take a look at these, see what everything means, and then go back and create these relationships from scratch, just to make sure that we know how to do every single part.
So to get us started, let's double-click on this line connecting the Customer Information table to the Apocalypse Sales table, and it's going to bring up this Edit Relationship page right here. Now, this line connecting these two tables actually gives us quite a bit of information without having to click into the Edit Relationship page at all: what it's showing is that we have a one-to-many relationship and a single cross-filter direction, and you can find both of those things right down here; I'm going to walk through what they mean in just a little bit. On this page you can also see the columns that Power BI decided to choose in order to tie these two tables together. For our example, it decided to use the Customer column from the Customer Information table and the Customer column from Apocalypse Sales, but I don't really want to use those specifically, because on this Apocalypse Sales table I might remove the customer name and just keep the Customer ID. It may have chosen these customer columns because they have the exact same name and really the same information, but I want to use this Customer ID anyway. So what I'm going to do is click on that column in one table and this column in the other, then click OK, and if we go back in by double-clicking again, we're going to see that it saved. And if we hover over the line like we did before, it's going to show us what those two tables are joined on. Opening this back up, let's go down here to Cardinality and Cross Filter Direction. Cardinality has several different options that you can choose from: many-to-one, one-to-one, one-to-many, and many-to-many. Now, for this example, we're looking at Apocalypse Sales on top going down to Customer Information. There are a lot of rows in Apocalypse Sales, but there are very few in this Customer Information table.
In Customer Information there's only one customer per row, whereas in Apocalypse Sales up here, a customer can have several rows for several different orders; that's why the cardinality is many-to-one. Now, if we flip this and put Customer Information here and Apocalypse Sales down here and tie that together, it's going to flip and say one-to-many. Now let's look at the cross-filter direction, and there are only two options here: Single or Both. If we choose Both and click OK, the line goes from a single arrow pointing in one direction to two arrows pointing in both directions. But what does this really mean? In order to demonstrate, I'm going to put this back to a single direction, and what we're going to try to do is connect the columns over here in Customer Information to the columns in this Apocalypse Store. So let's go over here to build a visualization: we're going to take this Customer Information table, and let's just say we want to look at State, so I'm going to click on State right here and just make this into a table. Now, the Customer Information table is only tied right now to this Sales table, but we're actually going to go over to the Apocalypse Store, because we want to see how many product IDs are being bought in these different states. So really quickly, we're going to come up here and create a new measure, and all we're going to say is that this measure is the count of the Apocalypse Store's Product ID. We're going to create that and then select it so it's added to that table. So now what this is showing is that there are 10 product IDs for each of these states (there are 10 products), but that's not actually technically correct, because not every state purchased all 10 different items. If we go back to our model and change this relationship's cross-filter direction to Both, then we're going to go back and see what changed in our numbers.
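Written out in DAX, the measure described above is a one-liner (the table and column names are my assumptions based on this model):

```dax
-- Counts the rows of the store table's Product ID column. With a single
-- cross-filter direction, the State selection can't reach this table
-- through the sales table, which is why every state initially shows 10.
Count of Product IDs = COUNT ( 'Apocalypse Store'[Product ID] )
```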
Back in our visualization, now we can see that Minnesota actually only ordered seven different product IDs, Missouri eight, New York nine, and Texas ten; this is actually much more accurate than before. When you use the Both option, it takes these tables and treats them as if they were a single table, but the Single option is not going to do that, which mattered for our example because we were trying to connect this table through to that one. And one of the last things that I want to show you is this option right down here that says "Make this relationship active". Now, if we don't check this, and there are other relationships in here that connect these tables, like the customer to the customer, then one of those may be the active relationship; but if I check "Make this relationship active", this is going to become the default relationship between these two tables. So now let's come out of here; we're going to click Cancel, zoom in just a little bit, and bring these tables a little bit closer so we can zoom in a little more. Now we are going to go ahead and delete these relationships, so we're going to click Delete, Yes, and Delete, Yes; just for demonstration purposes, we're going to build these relationships from scratch. So we're going to come over to the Customer ID in the Customer Information table and drag it all the way over here, onto the Customer ID in Apocalypse Sales, and it's going to automatically create that relationship. We can open it up, and as you can see, it created the relationship between the Customer ID in Apocalypse Sales and the Customer ID in Customer Information; it also defaulted the cardinality to many-to-one and the cross-filter direction to Single, so we're going to go ahead and change that to Both and click OK. Then we're going to come over here to the Product ID in Apocalypse Store and drag it over the Product ID in Apocalypse Sales, and again, if we open it up, it created that relationship for us and set the cardinality automatically.
And we're going to change this cross-filter direction to Both as well and click OK. So on a really small scale, that is how it works. Of course it becomes a little bit more complex the more tables you add and the more relationships are created, but this is how you're going to actually create relationships in the Model tab within Power BI.

All right, so let's take a look at our tables and data before we get started. We have two tables, Apocalypse Sales and Apocalypse Store. The Apocalypse Sales table has the Customer, Product ID, Order ID, Units Sold, and the date it was purchased, and the Apocalypse Store table has Product ID, Product Name, Price, and Production Cost. Now, these are joined together, or they do have a relationship, via the Product ID. What we're going to be using are new measures and new columns to create our DAX functions. So really quickly, let's go over to this Report tab and drop down our fields over here so we can see everything. To get us started, we're going to go right up here to Apocalypse Sales, right-click, and click New Measure, and it's going to open up this bar where we can create our functions. Right here it's automatically given us the name "Measure", but we can change that, and we're going to call it "Count of Sales". Now we can start writing our DAX function; that's just going to be the name of it, what shows up right over here once we hit Enter. So let's go over here and type "count", and as we're typing, it's automatically giving us options. It has something called IntelliSense; if you've ever used other Microsoft products, IntelliSense is their kind of autocompletion that helps you look at options very quickly. So we're just going to click on COUNT, and it's prompting us to put in a column name, so we can come down here and select one, or we can type it out and it'll try to predict and help us choose which column to select.
So for us, we're going to use this Order ID. Let's just start typing it out: we'll type "order id", click on it, close the parenthesis, and hit Enter (or you can go over here and click this check mark, but we're just going to hit Enter). And so over on the right side it finalized and saved that measure, and we can actually look at it by clicking the box next to it; we want to look at it in a table. So now we can see that there are 74 sales. Next, we want to see who's buying our products, what our client names are, so we're going to go over here, choose Customer, and put Customer on top of the count of sales, and just take a look at it like this. Now we can see that our number one customer is Uncle Joe's Prep Shop with 22 orders. They have the most orders with us, but that doesn't necessarily mean they're spending the most money with us; we can take a look at that later. The next thing that I want to take a look at is how many products we're actually selling, what our big products are. We have 10 different items, but I don't know exactly which one is selling the best; if one is doing really poorly and getting no orders, that's something I want to look into. So all we're going to do is go right back up here to Apocalypse Sales, right-click again, and select New Measure, and this one we're going to call "Sum of Products Sold". All we're going to start out with is SUM, and if this seems familiar from something like Excel, you're 100% correct: it is very similar, and remember, these are both Microsoft products, so there's going to be similar functionality in both of them; this DAX is going to have a lot of similarities to exactly how it works in Excel. So we're going to type an open parenthesis, and what we're going to choose is this Units Sold column: we want to sum up all of these units sold and see how many we were actually selling.
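For reference, here are the two measures we've built so far written out as DAX (each one goes in its own formula bar; the table and column names are assumptions based on this model):

```dax
-- One row per order, so counting Order ID gives the total number of sales
Count of Sales = COUNT ( 'Apocalypse Sales'[Order ID] )

-- Straight aggregation over the Units Sold column, much like Excel's SUM
Sum of Products Sold = SUM ( 'Apocalypse Sales'[Units Sold] )
```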
actually selling. So we're going to say Units Sold, hit Tab so it autocompletes, close the parenthesis, and come over here and click this checkbox. Now it's created that measure, and since we're already selected in this table, all we have to do is click the check mark and it shows us that we have 3,000 total products sold. We can go through here and see what the big sellers are, and probably the biggest one that I see right off the bat is this multi-tool survival knife. These DAX functions you write can be very simple and still lead to really good insights that you can use for the visualizations later on. Now I want to take a look at the difference between something like SUM, which is an aggregator function, and something like SUMX, which is an iterator function, because if you add X to some of these aggregator functions you turn them into iterator functions. So you can have SUM and SUMX, or AVERAGE and AVERAGEX; adding X onto the end makes them iterator functions. So let's take a look and see how that actually works; I'm going to show you the difference, and then talk through it at the end. Really quickly, let's go back to our data and go to the Apocalypse Store. What we have right here is the price and the production cost, and we want to see how much profit we're getting from each of these; we can also take a look at the units sold and see how much money we are actually making. So we're going to come back over here, go to Apocalypse Store, right-click, and create a measure, and in just a little bit we're going to be creating a new column, which will show the difference really well. So we're going to create this new measure and name it Profit, and we're going to come over here, and what we're going
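The second measure is a straight Excel-style sum. As before, the table and column names are taken from the transcript and may not match the file exactly:

```dax
-- Adds up every value in the Units Sold column across the whole table
-- (3,000 total products sold in the demo data)
Sum of Products Sold = SUM ( 'Apocalypse Sales'[Units Sold] )
```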
to do is start with our sums. We're going to take the SUM of the price, close that parenthesis, and subtract the SUM of the production cost. All that does is say: if we sold something for $20 and it only cost us $10, that's $10 in profit for that item. Then we want to encapsulate that in parentheses, because we're about to multiply, and then we take the SUM of the units sold: how many units were actually sold at that profit we just calculated. So let's see if that works and click the check right here. So we have the profit; let's click on the profit. Oops, that's not what I wanted to do. Let's create a new table: we're going to click Profit, make it a table, and I'm going to pull this right over here. Now we have our profit, but what I really want to know is which customer is spending the most money at my store. So we're going to come right over here, click on Customer, put Customer at the top, and just at a glance we can see that Uncle Joe's Prep Shop is spending the most money at the store. Now what I want to show you is the difference between SUM and SUMX. So I'm going to go back to this Profit and copy this entire thing, and we're going to go back here to this table. We just created a measure and were able to break it down by each customer, so let's go back over here, go up to Home, and create a new column. We're going to call this profit_column, and we're going to literally paste the exact same thing into here and hit Enter, and each row is the exact same thing. What it's doing is going through the price and adding all of it up, calculating it at the bottom; it's adding up the production cost, going all the way down and calculating it at the bottom; and then it's going over and
looking at how many units were sold, performing this calculation up top, and giving us the total, and it's doing that for every single row. But that's not really what we wanted to show. What we wanted to show is the profit for each row: here's the price for the rope, the production cost for the rope, how many units we actually sold, and then it calculates that and gives us the actual profit for just that row. We cannot do that by just using SUM; what we need is something called SUMX. So let's add another column. Let's go back to Home, say New Column, and now we're going to say profit_column_sumx, and we're going to use SUMX and hit Tab. We need to choose the table that we want to iterate, so we're going to say Apocalypse Sales, because that's the table that we're looking at right here. We're going to type a comma, and now we need to input an expression; it says it returns the sum of an expression evaluated for each row in a table. Before, when you were just using SUM, it was looking at all of these combined; now it's taking it row by row. So what we're going to do is basically input the same thing as we did before. I'm going to copy and paste that; it's not going to be correct, I need to get rid of these SUMs, but it's basically the exact same equation. Give me just a second, let's get rid of this SUM, and let's see if this works. So let's click the check button, and now this looks a lot better. What this is now showing us, at a row level, is that this nylon rope made us almost $52,000, the waterproof matches made us $155,000, and we can go down and look at each item and see how much it actually made us versus this profit column. So that is the biggest difference between SUM and SUMX. Hopefully that made sense. I know that SUM and SUMX, and the difference between an aggregator function and an iterator function, can be a little bit confusing, especially if you've
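Side by side, the two formulas look like this. For simplicity the sketch below pretends Price, Production Cost, and Units Sold all sit in one `Sales` table; in the demo they are split across two related tables, so the real column references also depend on that relationship.

```dax
-- Aggregator: each column is totaled over the whole table FIRST,
-- then the totals are combined, so every row gets the same number
Profit =
    ( SUM ( Sales[Price] ) - SUM ( Sales[Production Cost] ) )
        * SUM ( Sales[Units Sold] )

-- Iterator: (Price - Production Cost) * Units Sold is evaluated
-- row by row, and only then are the per-row results summed
Profit SUMX =
    SUMX (
        Sales,
        ( Sales[Price] - Sales[Production Cost] ) * Sales[Units Sold]
    )
```

The general pattern is the one the video describes: the aggregator mixes totals of totals, while the `X` version keeps the arithmetic at the row level, which is what you want for per-item profit.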
never done it before, but hopefully that was a good example for understanding the concept. Now let's go back over here to Apocalypse Sales. Right here we have Date Purchased, and DAX has some ways we can interact with dates, so I want to look at those really quickly. We're going to go right up here and click on New Column, and we're just going to leave the name as Column for now. What we're going to type is DAY; there are a few different date functions. We have DAY, DATESYTD, NEXTDAY, PREVIOUSDAY, and WEEKDAY, and they're all pretty self-explanatory if you click on them. Let's click on WEEKDAY; it says it's going to return a number from 1 to 7 identifying the day of the week of a date. So let's use this really quickly. We're going to say Date Purchased, hit Tab, hit comma, and it's going to give us three different options: basically a one, a two, and a three. Right here, if you hit this Read More button, you can read more on it. The first option says Sunday is equal to 1 and Saturday is equal to 7. I personally like the second one, where Monday equals 1; in my brain it just makes more sense. So I'm going to click on 2, close that parenthesis, and name the column Day of Week. Let's click that check box, and now Saturdays are equal to 6 and Mondays are equal to 1. This lets us see which day of the week people are buying the most products on, or which day of the week they're submitting their orders. So let's go over to our report. Let's get rid of this; I'm just going to move this. Oh jeez, I hate moving stuff sometimes. All right, really quickly I want to show you the difference between what we just did and what we already have. We have this Date Purchased; let's make that into a bar graph, and what we're going to be looking at is actually the units sold. Right here we have this, and obviously we don't want 2022, so we're going to get rid of the year. We only have
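The calculated column from the clip above, written out (table and column names again follow the transcript):

```dax
-- WEEKDAY returns 1-7 for each date; the second argument, 2,
-- is the return type that makes Monday = 1 and Sunday = 7
Day of Week = WEEKDAY ( 'Apocalypse Sales'[Date Purchased], 2 )
```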
one quarter right here; we can see January, February, March, so we can tell that January has the most sales, or the most units sold. If we drill down to the day level instead, we do have some information, but we don't know what day of the week each date is. It could change from month to month, and it's really hard to tell whether there's any pattern there at all. That's where what we just created comes in handy. So let's recreate this exact same thing, but using Day of Week instead. We're going to select Day of Week and Units Sold, drag that down, move this over right here, and this day of the week should be on the x-axis. It's really easy now to see if there's a pattern, and there really isn't, at least not for this fake data, but I want these data labels on really quickly. Again, Monday has the most; it goes down a little bit and then picks back up, so maybe the middle of the week is our lowest sales period. Our Wednesdays and Thursdays are a little bit lower than the rest, and the beginning and the end of the week tend to be the highest. Not a huge pattern, but it's much easier to see if there is a pattern from day of the week to day of the week now that we've used this WEEKDAY function, so this can be really, really useful. Let's go back to our data, and now we're going to look at our last DAX function for this video. Let's go up here and create a new column; we're going to be looking at something called the IF statement. If you've ever used Excel, I'm sure you've heard of this, and you can do the exact same thing here in Power BI. We're going to name this one order_size, and all we're going to say is IF. We're going to click on this one right here; we need to perform our logical test, and then we say: if it's true, what's our value, and if it's false, what's our value? So what we're going to be looking at
is Units Sold, because we're looking at order size. We're going to say: if Units Sold is greater than 25, what happens if it's true? If the order is larger than 25 units, we want to call it a big order, and if it's not, we want to call it a small order. Super simple. We'll close that parenthesis, click OK, and now really quickly we're able to see whether each order is a big order or a small order. And that is all I have for you today. There are a lot of other DAX functions, but the ones we looked at today are very common ones you'll see the most. There can be a lot of really complex and intricate DAX functions you can create, and in our project at the end of this series I will be sure to include some more complex ones, but hopefully this gave you a good introduction to DAX so you know how to use it a little bit better. All right, so before we get started, I wanted to remind you that you can find the data we're going to be working with in this tutorial in the description; you can go and download it from my GitHub. The two tables we're going to be looking at are Apocalypse Sales and Purchase Tracker. If you've ever created any visualization, you've probably seen something like this, where you'll have the store and the price, and these are the things that we actually bought: this is the total amount of apocalypse-prepping equipment that we bought, and we'll put the store in this legend right here. You've probably seen something like this, and if you're anything like me, you're going to be in a meeting presenting this, and some higher-up is going to say, "Hey Alex, great, but I want to see what things we actually bought at Target and how much they cost. Can you create a visualization for that?" And you're going to say, "Well, I could," or you could use drill down, which you could have done in the first place, and which you should have. So what we're going to do is we're going
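The order-size column as DAX (names follow the transcript; the threshold of 25 is the one chosen in the video):

```dax
-- Flags each sale as a big or small order based on the units sold
Order_Size =
    IF ( 'Apocalypse Sales'[Units Sold] > 25, "Big Order", "Small Order" )
```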
to say the product right here; these are going to be the actual items, and we're going to put it right under Store. Now you can't see these items yet, but there is a hierarchy here. Once we added this, these options became available; take it out and they all disappear, add it back and they come back. You can click right here to turn on drill down, you can go to the next level in the hierarchy, or you can even expand all down one level in the hierarchy. So let's look at each of those really quickly. Let's click on this first one: it just turns on drill down mode. Now if I click on Target, it's going to drill down into these items, and if we want, I can then put Product under this legend and we can see all of them. Of course, if we go back up, it's going to be broken up into this clustered column chart, which is more like this other one, which isn't exactly what we were going for, but it works. Now let me get rid of this; I actually want Store in the legend. If we turn drill down off and click, it doesn't drill anymore; what it does now is just highlight Walmart, highlight Costco, highlight Target. So we're going to keep drill down on. But we can also do something called going down to the next level of the hierarchy. Let's click on that, and now this goes down to the next level, the product level, because that is the next level, and it's going to show us each of those items, but broken out by store. It's a completely different visualization, but all within the same realm of the data that we're looking at and what we actually care about. So let's go back up in the hierarchy, and then let's use this one right here, which is expand all down one level in the hierarchy. This one is again extremely similar, except it visualizes it differently: now what it's doing is Walmart rice, Target dried beans, Costco rice. So
instead of having one like the previous chart, where everything is stacked on top of each other, it breaks the data down individually, so one column becomes three separate columns. Now I'm going to minimize this right here; I'm actually going to go back up in the hierarchy just for visual purposes. Now I'm going to show you one more example. We're going to use this Apocalypse Sales up here, and this is one that I actually use all the time. You'll get charts like the earlier one especially if you're working with sales, but I work in operations, so I have a lot of order IDs, product IDs, and the like, and this one I genuinely use quite often. I'll have a customer, and we'll just go like this: we have a customer and we have Units Sold, and let's use the customer as the legend. Let's make this one quite a bit larger. I'll have something like this, and they'll say, okay, we want to see the order IDs that go with it, because we want to know what orders are actually happening for each of these people. Obviously I'm not using this exact data, but it's very, very similar, and all you have to do is take these order IDs and slide them right under Customer. This visualization right here is something I've done a thousand times, because what happens is some stakeholder in our company says, "Hey Alex, we want to drill down on this IP address, on this certain database, on something, and we want to see the order IDs within them." So then all you do is turn on drill down mode, click on it, and you can see every single order ID that's in there, and then they can go look those up in their system and resolve them, or whatever they're trying to do. It helps a ton and it's very, very useful; this one is extremely applicable. And that's really all drill down is. Again, you have these different hierarchies as well,
but for some things it's not as useful, as you can see; we also have this hierarchy, which again is not as useful. It just depends on the data you're using and how you want to use this drill down effect, but I promise you that drill down is used all the time, especially when you're giving presentations where people want to know more than just the visualization you're presenting. All right, so before we get started, I wanted to let you know you can download the data we're going to be using in this tutorial; the description below links to my GitHub. We are going to be looking at bins and lists today, and for this we're going to be going over here to Apocalypse Sales. Let's open up our data right over here; we want to look at Apocalypse Sales. Really quickly: I feel like more people would know what a bin is, so we'll start with a list and go a little backwards from how we normally would. We're going to use this Customer column right here for a list, and you can do that in two ways: you can come up here, right-click on Customer, and go to New Group, or you can come over to the Fields section on the far right, go to Customer, right-click, and click New Group. So let's click on that now. Right now it's only giving us the list type; it's not giving us bins, because bins have to be numeric, so we really can't do that at the moment. We're going to call this Customer List, just so it's easier to recognize when we create it. All we're going to do is basically group these customers, but it's going to be called a list. So we're going to select a couple of them, click this Group button, and it creates this "Alex the Analyst Apocalypse Preppers & Prep for Anything Prepping Store" group. So it kind of
named it for us, but if we double-click on it we can rename it, and we'll call this the Best Prepping Stores. Then we have these last two; we can click on one, hold Ctrl, and click on the other so we get both, then click Group, double-click, and call this the Worst Prepping Stores. And that's it; that's all we have to do. If you want to undo this and switch it around, you can ungroup, but we're not going to do that; we're going to click OK, and here is the column it created. It basically tells us which list we put each customer in: Uncle Joe's Prep Shop is in the Worst Prepping Stores list, and Alex the Analyst Apocalypse Preppers is in the Best Prepping Stores. It's kind of like an IF statement; you could even create a calculated column on this Customer field with an IF statement. This is just a lot faster and easier, but it would do basically the same thing. Now, you can use lists on numeric columns as well. Say we have Order ID and we go to New Group; it's going to default to bin, because typically that's what you'll use with numbers, but you can choose list as well. Let's say we want to group the lowest order IDs and call them the first orders, since we're looking at order IDs. Then we'll go back to the top on the left side, hold Shift, group the rest, and call them the latest orders. You absolutely can do this; again, it's kind of like an IF statement, where you're saying if an ID falls between this range it's called the first orders, and if it's between this other range it's the latest orders. Again, it's just a much
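As the video notes, a list group is equivalent to a calculated column with conditional logic. A sketch of what that column might look like, using `SWITCH` instead of nested `IF`s; the store names here are approximations of the ones spoken in the transcript:

```dax
-- Calculated-column equivalent of the "best/worst prepping stores" list
Customer List =
    SWITCH (
        TRUE (),
        'Apocalypse Sales'[Customer]
            IN { "Alex the Analyst Apocalypse Preppers",
                 "Prep for Anything Prepping Store" }, "Best Prepping Stores",
        "Worst Prepping Stores"
    )
```

The New Group dialog just builds this mapping for you through the UI, which is usually faster than writing it out.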
simpler version of an IF statement, so you don't have to write it all out; you can just have the user interface do it for you, and it's really, really useful. So now let's talk about bins, and by far the easiest way to demonstrate this (I'll show you one other way too) is by using age. For absolutely no reason whatsoever, the stores in this customer information table decided to give us some information about the buyers actually purchasing their products on their websites or in their stores, along with some simple demographic information. I don't know why, but what we're going to use bins for is grouping these ages into brackets. You might want to know whether the core population buying your products falls within a certain range, and you don't want to look at every single age, because then your visualizations won't look right; you want to group the ages and make them easier to visualize. So we're going to go by tens: 10, 20, 30, 40, 50, 60, and see what bracket these people fall into. We're going to go to Age, right-click, say New Group, go to Bin, and leave the default name, Age (bins). You can do two things: you can set the size of the bins, which splits the data by this number right here, or you can go by the number of bins. If you only want five bins, it'll calculate that for you and say each bin has to be 12.2 wide; if you want ten bins, each can be 6.1. It's completely up to you. You can set the size, and we'll just say every 10, which is what we're going to do, or you can go through and choose how many bins you
actually want. So let's click OK, and it creates those bins for us. If somebody is 78, they go in the 70 bin; if somebody's 41, the 40 bin; if somebody is 29, the 20 bin; and so on. So when we visualize this, we won't have 71, 72, 73, 74 and a lot more marks on our visualization; it'll just be the 70, or just the 20. Now, we can also use bins on dates. Let's go back to Apocalypse Sales; we have this Date Purchased, so we can create a bin for it as well. Go to Date Purchased, then New Group. You can also create a list here, and that's totally fine if you'd like: it would look kind of like before, where you select all these dates, group them, and say this group is January. That's okay, but for this one we're going to do bins. I think bins are a little easier here, because we can specify what we want: seconds, minutes, hours, days, months, or years. The data we have runs January, February, and March, so we're going to pick months and set the bin size to one month. Each month gets its own bin, so there will be three bins total. Select OK, and as you can see on the right side we have January of 2022, which correlates to the January over here; then it goes down to February and then to March. When we visualize this, we don't have to do the hierarchy drilling to filter down to months; we can just use this right here, and that will be our months column. So now let's go over to our visualizations and see how this looks. We're not going to look at all of them, but we'll take a look at a few. The first one is age, so let's look at the buyer ID
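The New Group bin dialog does the math for you, but the 78→70, 41→40, 29→20 mapping just described is the same thing as rounding down to the bin size. A calculated-column sketch of the equivalent logic, with an assumed `'Customer Information'` table name:

```dax
-- Calculated-column equivalent of a bin of size 10:
-- FLOOR rounds each age down to the nearest multiple of 10,
-- so 78 -> 70, 41 -> 40, 29 -> 20
Age Bin = FLOOR ( 'Customer Information'[Age], 10 )
```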
and then we'll add Age as well. Let's spread this out, and we can see the distribution of our buyers. It looks like we have very few in the 10 range, thank goodness, and we can even put Age right under the age bins, and now we kind of have a drill down. If we drill down right there, it gives us the breakdown; this is what our visualization would have looked like if we had just kept the raw ages, because now we're drilling down into the individual ages. It looks like we have one 18-year-old, and maybe a 20-year-old as well. Let's go back up. Yes, there's only one 18-year-old, so just of legal age to start buying all this prepping equipment, probably buying online and so on, which makes sense. This gives you a quick breakdown with the bins rather than doing it the long way. Now let's take a look at the customer list along with units sold, and it looks like the "best" prepping store is, surprisingly, performing much worse than the "worst" prepping store. All right, so before we get started, if you want to use the data in this video, you can find it in the description on my GitHub. Now, conditional formatting is super simple; you've most likely used it in Excel before, but you can also use it in Power BI, and let me show you how. The first thing we're going to do is come over to our Apocalypse Store and pull up our product name and price. What we can do is come over here and go to Price, and it has to be under Columns, so you can't do this from over here. We're going to come right over to Price, right-click, and go to Conditional Formatting, and we have Background Color, Font Color, Icons, and Web URL. Let's take a look at Background Color first; this is most
likely the one you'll use the most. We get this pop-up, and I'm going to slide it over. There are a lot of things we can customize in here, and the first one I want to look at is Format Style. We have the gradient, which says the lowest value will be this color and the highest value will be that color; it gives us a gradient color scale, and we'll use that in just a little bit. But we can also create rules, kind of like an IF statement: if a value is between this range we give it one color, and if it's in a different range we give it a different color. We'll try that one too. Then we have Field Value, and honestly this is one I don't use much; I've used it maybe once. You can select a text field like Customer and apply summarizations like First and Last, and that is it. So we're going to look at Gradient specifically, not on the Customer, but back on the Apocalypse Store, on the Price. Now, I'm going to keep the summarization as Count, because that's the default, and we'll go back and fix it later. We want our lowest value to be this bright green, showing it's a cheap product, easy to purchase; the high-value ones get this shade of red, more expensive, and we'll apply it on the count. Remember, the count applies to each row; we're not counting how many were sold, we're counting each product, so it's just one per row, which means every row should end up the same color. Let's take a look. It is indeed all the same color, but what we really want to show is the actual price, not the count of the price. Let's go back to Conditional Formatting, click Background Color again, and this time change the summarization. You can do sum, average, minimum, or maximum; it really doesn't
matter for this example, because the number is the same regardless of which one we choose, since there's one price per row. So we'll select Minimum, click OK, and it corrects accordingly: the bright green is the lowest price and it goes all the way up to the highest, which is red. Now let's go over to Apocalypse Sales. We'll add in the units sold and widen the column a little, and I'm doing that on purpose, because we're about to look at something within the conditional formatting. Let's go to Units Sold and look at its conditional formatting options. If you noticed, we now have a new option here called Data Bars. We can use Data Bars on Units Sold but not on Price, because Units Sold is aggregated, something like a sum or an average. Let's take a look at Data Bars, because I want to show you how to use them, and then we'll come back to the background color. For Data Bars, we again go from the lowest to the highest value; we'll go from bright green all the way to this exact red, left to right, and what it shows, for a positive number (which all of these are), is a green bar representing the number you see in the cell along this line. So let's click OK, and we can see the highest numbers. Let's scooch this over quite a bit so you can get a better look, and we'll sort from highest to lowest. We sold the most multi-tool survival knives at 477, so that entire row's bar is almost completely filled, while lower down, where we sold only 182 solar battery flashlights, the bar shrinks to represent that. Now I'm about to completely mess up this visualization on purpose, because it's about to get very messy, to show you that
you can do a little bit too much; it is possible. We're going to go right over here to Background Color on Units Sold, and instead of Gradient, let's look at Rules. With the price we did a gradient scale, but here we can make groups: if a number is greater than or equal to this value, it gets a certain color, and if it's in a different range, it gets a different color. So we'll say: if it's greater than or equal to zero (set as Number, not Percent) and less than 266, because we have 265 right here, let's make it a nice gold, a beautiful, lovely mustard gold. Just great. Then we'll say: if it's greater than or equal to 266 (since the first rule covered everything below 266), as a Number, and less than, say, 500, we'll give it, let's do a peach, and click OK. Now we have another layer of conditional formatting on top that can give us more information. Again, you should not do this; it's just too much. Now let's go one step further and make it even more ridiculous, and I'll show you one more thing before showing you how you may actually want to use this. Let's go back to Units Sold, right-click, go to Conditional Formatting, and you can do something called Icons. (Font Color is the exact same thing as Background Color except it changes the font, so I'm not going to cover it.) Icons are very simple, extremely similar to how you've seen them in Excel, and the rules you can apply are basically the same as the gradient's IF-statement-style rules we saw before. It auto-generates this right here, which says 0 to 33%, 33 to 67, and 67 to 100: if a value is in the bottom 33% it gets a red icon, the middle is yellow, and the top is green. We can go through and change all of this, but honestly this looks pretty good, so
let's click on it. The worst sellers are these red ones right here, and the top sellers are up here. Now, this is just based on units sold, and altogether this looks absolutely terrible, so let's take the same information but make it a little better. We're going to create a new visualization, or at least a new table. Let's click on Product Name, and we'll take Price, Units Sold, and Revenue. What I think makes the most sense for revenue is these data bars, but there's one problem: I can't use them, because Revenue isn't summarized like Units Sold was. What I can do to get those data bars is come right down here, and instead of Don't Summarize, choose Sum. It's now summarized, and it's the exact same number, but if I right-click on Sum of Revenue and go to Conditional Formatting, I can now use the data bars. So we'll use them, and for the lowest and highest values, let's make it a nicer, maybe darker green. Well, that's hideous; let's make it this color right here, a nice dark green. There are no negatives, so that setting doesn't matter; we'll go left to right, and you can show the bar only, but we're going to keep the numbers because I want to see them. We'll sort it, and this is pretty telling. Honestly, I did not think the weatherproof jackets were performing so well, but they are by far our number one seller. So our weatherproof jackets, multi-tool survival knives, and nylon rope are outperforming all of our other products; those might be the ones I focus on the most, while the duct tape, the N95 masks, and the waterproof matches are garbage, so I might look to replace those in the near future with items that might sell a little better. So that's how you use conditional formatting, and it's
It's actually pretty useful: there are a lot of times I've done something like this in an actual visualization for work, and it looks something like this. It just depends on what you're visualizing, but it's a very simple way to add a little more information and actual visuals to the chart or table you're creating. Sometimes it's just better to have these simple visuals on a table rather than only the numbers themselves; it makes it a little easier to read and understand. All right, before we jump into it, there is a link in the description where you can get the data we're going to be using for these visualizations if you want to practice them yourself. Before we actually get into it, we do need to combine the tables; if you download that Excel file and see this, you'll have to do the same thing. All we have to say is that this Product ID is the same as this Product ID Purchased, and now we're good to go: make it one-to-many, and it's okay if it's single-direction. Now, over here under the Visualizations pane there are lots of different options, and it can be a little overwhelming; you don't really know which one to choose. There are some in here I've almost never used for my job, ever, so I'll point those out as we go through, but the main focus is going to be the ones I do use and have used, showing you how to actually create each visualization and maybe spice it up a little. We have a lot of them to go through, so let's jump right in. The very first one we'll start with, probably the easiest and the one you'll recognize the most, is the stacked bar chart. Go right over here to Product Name and click it, and it goes straight into the Y-axis for us; then click Units Sold, and it will go into the X-axis automatically.
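As an aside, the one-to-many relationship set up above behaves like a key lookup from the many-side purchases table into the one-side products table. A Python sketch of that join (all column names and rows here are hypothetical):

```python
# One side: product dimension keyed by product ID (hypothetical sample rows)
products = {1: "Duct Tape", 2: "Nylon Rope"}

# Many side: purchase rows referencing the product via product_id_purchased
purchases = [
    {"product_id_purchased": 1, "units_sold": 10},
    {"product_id_purchased": 2, "units_sold": 5},
    {"product_id_purchased": 1, "units_sold": 3},
]

# Single-direction one-to-many: each purchase row looks up exactly one product
joined = [
    {**row, "product_name": products[row["product_id_purchased"]]}
    for row in purchases
]
print([r["product_name"] for r in joined])  # ['Duct Tape', 'Nylon Rope', 'Duct Tape']
```

The "one" side has unique keys and the "many" side can repeat them, which is why the relationship direction matters for filtering.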
Power BI just kind of intuitively knows where fields go, but sometimes it makes a mistake, and then you can just fix or flip it. Let me make this much larger. We do want this to be a little more color-coded, and that's what this Legend down here is for, so drag Product Name down to the Legend, and now each product has its own color. In previous videos we've gone through some of the visual and general options you have when creating these visualizations, but we're going to do some of them while we're in here as well. Go down, turn on Data Labels, and shrink them; the higher you set the size, the fewer you see, so if you want all of them visible, all the way down to the green, set it right about there and make it smaller. Now we can click anywhere outside the visualization and create a new one. If we had kept interacting with this visualization and clicked a different chart type, it would have changed our visualization completely, which we don't want; so hit Ctrl+Z, click out of it, and now we can create a new one. Let's go right over here to the 100% stacked column chart, click on it, and make it much larger. Come right over here to the customer information and click Customer, then go up and click Units Sold. Basically, this breaks things out by each of these shops, and we can see the total of what they're buying, the units sold; but we want to see exactly which products make up that percentage, that 100%, so go right over here to Product Name and drag it down to the Legend. As you can see, we now have each of these products, and each product appears up here, so
with this backpack, we can see the backpack right here, right here, and right here, and we can see which customer is buying it and what percentage of their purchases it makes up. For this Prep For Anything prepping store, a very large percentage, 40%, is duct tape; they're buying a lot of duct tape. So really quickly we're able to see which clients are purchasing which products the most. Likewise, the Alex The Analyst Apocalypse Preppers are buying a lot of water purifiers; we like drinking clean water, that's just what my audience likes. So we can easily get a quick glance at that. Again, I tend to like putting these data labels on; that's just my preference. Something like this looks nice and clean. We can always go back and change the names, which we'll do for this one: go over to Title, go down to Text, and type 'Customer Purchase Breakdown' (pretend I'm really good at spelling). Now we have Customer Purchase Breakdown, and that looks really nice; it's a good visualization. We'll bring it right over here; we're going to have a lot on the screen, so I may have to make things smaller or larger to fit everything. All right, on to our next one. Another really common visualization is this one right here, the line chart, and the line chart is great especially when you're using things like dates; I have found it to be the best for that, and a lot of people use it that way as well. Click Date Purchased, then Units Sold, and on the X-axis you can see it's broken up by year, quarter, month, and day. We don't want that high a level, since we only have three months of data in here, so we'll remove Year and Quarter.
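The 100% stacked column chart from a moment ago is doing a simple normalization behind the scenes: each product's units divided by its customer's total. A Python sketch with made-up rows (the function name is hypothetical):

```python
from collections import defaultdict

def percent_breakdown(rows):
    """rows: (customer, product, units). Returns each product's share of
    its customer's total units: the math behind a 100% stacked column."""
    totals = defaultdict(int)
    for customer, _, units in rows:
        totals[customer] += units
    return {(c, p): units / totals[c] * 100 for c, p, units in rows}

rows = [("Prep For Anything", "Duct Tape", 40),
        ("Prep For Anything", "Backpack", 60)]
print(percent_breakdown(rows)[("Prep For Anything", "Duct Tape")])  # 40.0
```

Every customer's column sums to 100%, which is why the chart is good for comparing mix rather than absolute volume.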
Then let's break it out, because right now we're looking at all of the units sold combined. Drag Product Name right down to the Legend, and now it breaks out by the actual product: for each month, January, February, or March, you can follow these products and see how they did. If we wanted, we could come right over to the filter on Product Name and filter to maybe the top three: let's do the multi-tool survival knife, the nylon rope, and the duct tape. You can do this for any products you want, but we'll just use those three as an example. That really doesn't give us a ton of information, so we could even go down to the day level, which might give us a little more, and we'll keep it like that. We'll change the name on this one as well (we won't do it for all of them; we're just surveying the types of visualizations I think are really good to know) to 'Products Purchased by Date'. Nothing fancy; we're just trying to look at a bunch of different stuff. Let's put this one over here, down here, and click out. There are other chart types in here that are definitely useful and that you absolutely can use. This one is a stacked bar chart and this one is a stacked column chart; they're basically the same thing, just different orientations. Same with the clustered bar chart and clustered column chart: the same chart, either horizontal or vertical. Then we have things like the area chart and the stacked area chart, which aren't ones I've used much in previous positions. One that I have used, though, is the line and clustered column chart, which combines a few of these, bar charts as well as line charts, into one visualization, so let's
look at this one, because it's one I have used several times in my actual job. For our X-axis we'll use Product Name, then we'll look at something like Price (let me make this a lot larger so you can actually see it), and then we'll add Production Cost as the line Y-axis. So now we're looking at the price, how much someone is actually paying, against how much it's costing us to actually produce that product. Really quickly, at a glance, you can see that production cost sits around the halfway to two-thirds point on most of these, always lower than the actual price, because of course we're out here to make a profit on these products. Let's minimize this one, put it right down here, make it even smaller, and click out. The next one we're going to take a look at is the scatter chart, so click it and make it much larger. Let's use Price and Production Cost again: our X-axis is Price, our Y-axis is Production Cost, but now we need to fill in the Values field, so click Product Name and drag it into Values. So now we have our values; we just don't know which is which, so drag Product Name down to the Legend as well, and it breaks them out into a scatter plot. For this fake data we're using it doesn't really show a lot, but if you're using real data you can definitely find outliers, trends, and patterns with this type of visualization. Let's make that one small as well and tuck it right into the corner. Next, we have the dreaded pie chart and donut chart. Look, pie charts and donut charts are kind of a joke in the data analyst community, but at the same time people use them and request them.
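The price-versus-production-cost comparison that the combo and scatter charts visualize boils down to a ratio per product. A quick Python sketch with made-up product numbers:

```python
def cost_ratio(price: float, production_cost: float) -> float:
    """Production cost as a fraction of price: the gap the
    line-and-column chart makes visible per product."""
    return production_cost / price

# Hypothetical products where cost sits roughly half to two-thirds of price
for name, price, cost in [("Jacket", 60.0, 33.0), ("Rope", 12.0, 7.0)]:
    r = cost_ratio(price, cost)
    assert r < 1.0  # cost below price, so each sale is profitable
    print(f"{name}: {r:.2f}")  # Jacket: 0.55 / Rope: 0.58
```

A ratio near 1.0 would flag a product with almost no margin, the kind of outlier the scatter chart helps you spot with real data.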
So sometimes you're going to use them whether you like it or not. Click the donut chart, make it a lot larger, click State, and then Total Purchased; that's really all you have to do, these ones are pretty straightforward. You can change a few things, like where the labels go; putting them inside would look totally fine too. Again, I'm just not a super huge fan, but you will get this one requested; people like it and want to see it. The reason a lot of analysts don't like using it is that when you start glancing at one, it's really hard to tell the difference between slice sizes. With a bar chart you can easily see which is larger: the multi-tool survival knife is obviously the longest, and they get progressively shorter. But in a donut chart, I would not be able to tell the difference between 5.63, 5.78, and 7.72 percent very easily, and that's why a lot of people don't want to use them in general. So again, I wanted to show you this one because I think it's worth noting and worth knowing how to use, but I don't really push people toward it, because I don't think it's the best visualization available most of the time. All right, the next two are super easy but are used all the time, maybe more than some of these others even, and they're just so easy to use that I kind of saved them for last. This one is the card, and all a card does is display one number (or multiple numbers if you want to use a multi-row card, but we'll just look at the card for now). All we're going to look at is Total Purchased, and it just displays it like this; you can make it as large or as small as you'd like, and normally it goes along the top,
and you'll put a card here and a card here. Just as an example, I'll show you how this might look: something like this, right, and at the top it'll usually have high-level, overarching information. This is super common to see, and I'm sure if you've looked at other people's visualizations you've seen something like it; it's usually totals or averages or something like that, where it's super easy to read. So right here this is Total Purchased, we can go in and make this one the minimum, and this one over here can be a count, and it gives us a lot of information at a really quick glance, alongside all of our more in-depth, colorful visualizations that carry more information than a single number the way a card does. And then the very last one I'm going to show you is this one right here, the table, and it's obviously extremely popular; it's like a little Excel table. We can go in here and get Customer, wherever that is, and then Units Sold, and this is what it looks like; it's super easy, and oftentimes you'll have it on the side as well, with all the other visualizations over here. So if we take all these visualizations and pretend they're a real dashboard (there's a lot in here, but we'll arrange it really quickly), we might have something like this: make this larger, make this wider, and we have a lot of information in here. This is not a project, so don't go put it on your portfolio; I just threw a ton of random visualizations onto this dashboard. But you can already see that a lot of these you've most likely seen in other people's work and visualizations on LinkedIn or YouTube; they're very common and very popular. And again, we did not go through all of the options over here: there are maps you can use, but I haven't used maps ever in my
job. There are things like gauges and decomposition trees and waterfall charts and treemaps and all these different things, but I really have never used those in my actual job, and I don't see them a lot in other people's work either; otherwise I would be telling you to learn and use them. What's going on, everybody? Welcome back to the Power BI tutorial series. Today we're going to be working on our final project. Now, this is the final project of the Power BI tutorial series, so if you have not watched all of the videos leading up to this, I recommend going and watching those first so you can make sure you know all the things we're going to be looking at in today's project. I am really excited to work on this project with you because I think it is a really good one, and it uses real data that we collected about a month ago, when I took a survey of data professionals; this is the raw data we're going to be looking at. I think it's just really interesting that we collected our own data and are now using it for a project. We're going to transform the data using Power Query, then actually create the visualizations and finalize the dashboard, as well as create a theme and a different color scheme to make it a little more unique. Without further ado, let's jump on my screen and get started with the project. All right, before we jump into it, I wanted to let you know that you can get the data below; it is on my GitHub, and you can go and download the exact file we're going to be looking at. Now, in the past several projects we have been using this fake apocalypse dataset; it was fun, it was, you know, whatever, but this dataset is real. It was a survey that I took of data professionals, posted on LinkedIn and Twitter and all these other places, and we had about 600 to 700 people respond to the questions. So before we actually get into it and start cleaning the data and doing all of this in
Power BI, I just wanted to show you the data. All right, this is the CSV I downloaded from the survey website I used, and it's completely raw data; I haven't done anything to it at all. Let's go through it really quickly and see what we have. We are not going to make any changes at all in Excel; we're going to do all of our transformations, or at least a few of them, in Power BI, because again, this is a Power BI tutorial and project, and I want you to learn how to use that rather than Excel (you can go through my Excel tutorial if you want that). So let's just look at it in Excel and then move it over to Power BI and actually start transforming the data. We have this Unique ID for all the people who took it (oops, don't want to do that); we have an Email column, though this was completely anonymous and I didn't collect any user data; then we have the Date Taken. Now let's get into the actual good information: all of these questions. Question 1 is 'Which title fits you best?', and let's add a filter really quickly so we can look at it. You had the pre-selected options, like Data Analyst, Architect, and Engineer, but then there was an option to say Other and specify what that was, so if you look in here, we have all these different 'Other (Please Specify)' entries with different titles, and there were a lot of them. Now, typically what you'd want to do is really clean this up, and while we're not going to do a ton of data cleaning (some in Power BI, none in here), with this amount of data and the way it's formatted we would normally do a great deal of it. I mean, there is a lot of work to be done. This current yearly salary is one I would absolutely clean up, because it's ranges, with a dash and a 'k' and all these numbers; it's
something I would be cleaning up and using, but we're not going to clean it right now. So anyway, let's see what questions we asked. We have the yearly salary; what industry you work in; favorite programming language, with a lot of different options. Then there was a question where they picked multiple ratings: 'How happy are you in your current position with the following?', covering salary, work-life balance, coworkers, management, upward mobility, and learning new things, each ranked from 0 to 10. Some people ranked upward mobility a 10, some a 0 or a 1; again, they could answer however they wanted. Then: how difficult was it to break into data, from very difficult to very easy; if you're looking for a new job, what would you be looking for (remote work, better salary, etc.); male or female; which country you're from; and then more demographics, like your age. Age was a sliding bar, so you could slide to your exact age, and there are some people who are apparently 92, which, if that's true, I mean, good for you. Actually, really quickly, just while we're here, I'm going to see whether that person is male or female... that's a female from India. Very cool. So we have all this information, and it is a lot of information. When you have something like this, there is so much data cleaning that can be done; I mean, I already see 20-plus different things I would need to do to make it a lot better. We also have Date Taken and Time Taken, as well as how long they spent on it; really, really interesting data. But again, this is a beginner tutorial series and the beginner project, so we're not going to do anything too crazy. I will be using this exact dataset in a future video, doing a lot more data cleaning and creating a
much more advanced visualization with what we have and what we're looking at right here, but for this video we're just going to build a pretty simple visualization and dashboard that you can use to practice with, or put on your portfolio if that's where you're at right now. So let's get out of here and put this into Power BI. Exit out, come right over here to import data from Excel, click on the Power BI final project file, and open it; give it a second, we're doing this all in real time. We only have the one table, so we won't be practicing any joins or anything, but we're not going to Load it; we're going to Transform this data, so let's put it into the Power Query Editor. Now we have all of our data in here, and it should look extremely familiar. When I start looking at this information, I kind of need to know beforehand what I want to get out of it: do I need to clean every single column, just a few of them, or get rid of some columns entirely? That's kind of where my head's at, and right off the bat I can already tell you there are columns we can just delete to get them out of our way, so we'll do that at the beginning rather than later, when they're just in our way. I'm going to click on Browser, hold Shift, go over to Referer, and then go up to Remove Columns. Everything we do goes over here to these Applied Steps; if you've been following this series, you know we can remove and add things, and anything we do shows up right over here so we can track it and go back if we need to. Now, one column I know for sure I'm going to use quite a bit is 'Which title fits you best in your current role?', because I specifically wanted to do a breakdown of different people's roles, how much they make, and different stuff like that. So I know I want to use it, but as we saw before,
there's kind of an issue: it's not very clean. It has Data Analyst, Data Architect, Engineer, Scientist, Database Developer, and then, like, a hundred different options, plus Student or None of These. And so, for the purposes of this video, we are not going to take every single one of those options, because that involves a lot more data cleaning. Let me give you an example: this says Software Engineer, and this also says Software Engineer; these two would typically be combined, or standardized, to 'Software Engineer', but it's not very easy to do that in Power BI. We could do it in Excel, or even in SQL if we pulled this from a SQL database, but not really here, and you can find lots of different examples of that. We have 'Data Manager' and 'Data Manager'; if we left these separated out, they'd be different options when we created our visualizations, and we don't want that. So what we're going to do (and this is going to be kind of an easy way out, just to make sure this is pretty clean and we don't have a thousand different options) is collapse all of those into Other. We're going to simplify this a lot and then use it, so we'll have maybe six or seven options instead of the, let's say, fifty we would have if we actually did the harder work of breaking it out, standardizing it, and cleaning it up. So click on the column right here, and go up to Split Column in this ribbon up top; we want to do it by a delimiter. And if you notice (let me see if I can move this over), we have 'Other' and then this parenthesis, and in no other option is there a parenthesis. So we're going to use Custom with this open parenthesis; that will separate it on the parenthesis, leave the 'Other', and create just
one separate column for each of these. We can do that at each occurrence or at the left-most, and we really only need the left-most, because there's only one of these left-side brackets (or whatever this is called). Click OK, and it creates another column, so we'll have .1 and .2. If we click on it now, we only have these options: Analyst, Architect, Engineer, Data Scientist, Database Developer, Other, and Student/Looking/None. That is what we want; it makes it so much simpler. It's not perfect, but again, I'm trying to show you what we're able to do in Power BI. So now we'll just remove that extra column, and we're going to do the exact same thing to this next one as well, because I know we want to use it, and I really wanted to use this one: favorite programming language. If we look at it, there's a lot here too. I asked 'What is your favorite programming language?' with pre-selected answers like JavaScript, Java, C++, Python, and R, and then there was an Other option, and it was free text, so people could fill it in as they wanted; I mean, there are four, five, six different ways people put SQL. That is something I would standardize, and that's how I would clean it properly, but that's not how we're doing it here. So we'll do the same thing and keep the Other: we'll split this column again by a delimiter, but this delimiter is going to be a colon, left-most only, and click OK. Then we have our options, and it's much simpler. Now, I really would have rather kept all of these, because SQL's in there quite a bit, but a lot of people don't think SQL is even a programming language, so we're going to delete that column. Now, one that I just skipped and kind of wanted to go back to is the current yearly salary.
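Both splits above follow the same pattern: break the text at the left-most delimiter and keep only the first piece. A minimal Python sketch of that behavior (the helper name is hypothetical; in Power Query this is the Split Column > By Delimiter > left-most option):

```python
def collapse_other(title: str, delimiter: str = "(") -> str:
    """Split at the left-most delimiter and keep the first piece,
    so 'Other (Software Engineer)' collapses to just 'Other'."""
    return title.split(delimiter, 1)[0].strip()

print(collapse_other("Other (Software Engineer)"))  # Other
print(collapse_other("Data Analyst"))               # Data Analyst
print(collapse_other("Other: SQL", delimiter=":"))  # Other
```

Values without the delimiter pass through unchanged, which is why the pre-selected titles survive intact while all the free-text variants fold into a single 'Other' bucket.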
I really want to use this one, so let's see if we can. Here's what I want to do with it (and this is not perfect, but for this video I want to try it): break up these numbers, 106 and 125, and take the average of those numbers. We'll use some DAX in there: take '106-125', split it into two separate columns, then create a third column that gives us the average of those two numbers, (106 + 125) / 2. Now, that is not perfect, but it's going to give us at least a rough, roundabout number, because they gave us a range; they said 'my salary is between 106 and 125 thousand', so if we say their salary was about 112,000, it at least makes it usable as a numeric value instead of this text, which we could still use (and I'll show you how, because we're going to keep this column; I'll create a copy and show you the difference between using it and using the average). But for this data cleaning portion, let's just try it and see if we can make it work. So first, let's create a duplicate of the column. The copy appears at the very, very end, and we can use it instead of having to use the original way, way, way back here; we'll leave that one how it is and use this one. So let's split this one up: click the column header, then Split Column, and do it By Digit to Non-Digit. If you look at it right here, it's broken out so that one column now has just numeric values, while the other has 'k' plus a dash plus numbers, or just a dash plus numbers, and now this can be easily cleaned. The column that's only 'k' we can just completely get rid of, so remove it. Then in this remaining one, right-click, click on Replace Values, and if it
just has a 'k', we'll replace it with nothing and click OK. Then for the last one, go to Replace Values again and replace the dash (the minus sign) with nothing. And so now we have our values. Oh, we also have a plus to get rid of; that's where some people answered 250 or 225,000-plus. For those, the average is just going to be 225; we'd have to specify that in our DAX, which I forgot. Actually, if somebody has 225, let me find this plus really quick (let me filter by it, because that's a lot faster). What we actually want, for the purposes of this one, is to put 225 here, so that when we do (225 + 225) / 2 it comes out to 225; that's just what we're going to put it as, and there are only two such people. So I'll do Replace Values, replacing '+' with '225', and click OK. Awesome; we can unfilter these (Select All). Now we're going to go right up to Add Column, choose Custom Column, and call it Average Salary. Insert the first column, add an opening parenthesis and a plus, insert the second column, close the parenthesis, and divide by two. It says no syntax errors have been detected, so click OK... and it gives us an error: 'cannot apply operator + to types Text and Text', which makes perfect sense, because these aren't numbers. So let's make each of them a Whole Number and see if it will actually work now, or maybe we just need to try a whole new one: Add Column, Custom Column, try this all again and see if I can make it work (insert this one, plus this one, divided by two)... and there we go. Now let's get rid of the intermediate columns; we can actually remove those as well, because we now have this Average Salary column.
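The whole salary pipeline above (split at digit boundaries, strip the 'k' and the dash, treat '+' answers as their own midpoint, then average the two endpoints) can be sketched in a few lines of Python. The helper name and sample strings are made up; this mirrors the logic, not the Power Query steps themselves:

```python
import re

def average_salary(raw: str) -> float:
    """Turn a salary-range answer like '106-125k' into its midpoint.
    A '225k+' answer has no upper bound, so its 'average' is just 225
    (the same compromise used in the video)."""
    cleaned = raw.lower().replace("k", "").replace("+", "")
    parts = [float(p) for p in re.split(r"-", cleaned) if p]
    if len(parts) == 1:
        parts.append(parts[0])      # '+' answers: average with itself
    return (parts[0] + parts[1]) / 2

print(average_salary("106-125k"))  # 115.5
print(average_salary("225k+"))     # 225.0
print(average_salary("0-40k"))     # 20.0
```

The key point is the type conversion: the arithmetic only works once the endpoints are numbers, which is exactly the 'Text and Text' error Power BI raised before the columns were set to Whole Number.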
When we look at it, or when we use it... let me see if I can just move this way, way over. All right, I might cut here, because this is taking forever. If you take the average of these two numbers you'll get 53; if you take the average of 0 and 40 you'll get 20. So now we have this Average Salary, and again, when we get to the actual visualization part, I'll show you why the raw column isn't as useful as having this average. And just a reminder: this is not perfect. I wouldn't typically do this; if I had it in Excel, or if I were creating this survey a different way, I would probably have a very specific value they can enter on a slider. But this is how it is, so we've at least made it usable, or more usable in my mind. We have a few other things we can change, like 'What industry do you work in?', which we can break out, so I'm going to break that one out, as well as this one right here, 'Which country do you live in?'. I'm going to break both of those out to where it's the value or Other; I'm not going to keep these other free-text values, although there are a lot of them, because there are a lot of people who live in these different countries, but we can't really handle them super well in here. The same issue kept happening ('Argentina', 'Argentina', 'Argentine', 'Australia'), and we can't normalize those values unless we spend just a copious amount of time doing it. So I'm going to go ahead and do these splits; I'm going to fast-forward this so it goes a lot faster and just go silent, and then we'll get to the end and actually start building our visualizations. All right, so we've split them up, and as you can see, we have all these options as well as Other. And let me tell you, there is so much more that we could do with this, just so many other things, but this is like the bare minimum of what we need for this project. So let's go ahead and Close & Apply this, and
if we need to come back at any point to actually fix or change anything, we can; so it's not like that's permanent. As you can see, we have everything over here: all of our data, as transformed, is in here as well, and now we can start building out our visualizations, so let's go back to our report and start building something out. All right, let's add a title to our dashboard, right at the top: 'Data Professional Survey Breakdown'. Let's make that quite a bit larger, make it bold (why not), and put it in the center. Now let's add some effects: change that background... that's too dark... something like this. And I do not like that bold after all; let's take that off. There we go, something like this, just a quick title for what we're about to build. We're going to start off with the most simple visualizations and kind of work our way toward the harder ones, so the first one we'll start with is a card. Cards are obviously just super, super easy; they usually just display one piece of information. Go right over here to the very bottom, to Unique ID, select it, and choose Count (Distinct), or Count, it doesn't matter here, and it says 630, 'Count of Unique ID'. Now, we're not going to keep that label as is: we'll go right over here, choose Rename for This Visual, and instead of 'Count of Unique ID' we'll say 'Count of Survey Takers'. You can say whatever you want here (maybe 'total survey takers', or 'how many people took the survey'), but in general that is what it is; we're counting how many people took this survey. So let's click out of there, click Card, make it about the same size, drag it up here, and try
to make them about the same we will in a little bit we’ll make them the same size um but for this one we’re going to look at age so we’re going to look at current age so I’m click on that and we’ll say want the average age so our average age taker is almost 30 years old so let’s go right over here we’re going to say rename for this visual we’ll say average age of survey this might be too long average age of survey taker again name it whatever you’d like so again these are meant to be high level numbers so when somebody’s looking at your dashboard they can just really quickly glance at this and know exactly what it is instead of like some of these other visualizations that we’re about to create they don’t really have to dig into it look at the x- axis the y axis the the different uh Legend colors and whatnot they can just see these high numbers and get a really quick glance of the data now let’s create our first visualization and what we’re going to do for that one is a clustered bar chart so let’s go ahead and click on the clustered bar chart and create as small or as large as we’d like and for this one we’re going to be looking at the job titles now remember we kind of change the job titles or you know uh transform those if you want to say that so we’re going to look at Job titles and then we’re going to look at their average salary and if you remember we transformed that one as well we have average salary now this one is it looks like a text right now so it may not work properly and what we’re actually going to do is go over here I want to see the average salary so let’s click on average salary and see if we can change this data type from a text to a decimal number let’s click yes I forgot to do that when we were transforming it and there we go this is perfect um so now we can go back and we can select our average salary and as you can see it has this um this function symbol so now we can click on it and it’ll look a lot better and although this says average 
salary as the title it’s actually doing a count or the sum so we can click average right here and what we want to do is actually break this down by the job title and so now we can see data scientists are making the most by far they’re making average of 93,000 at least from the survey takers that took it then we have our data Engineers making 65,000 data Architects are making 63 and then we the data analysts data analysts are right here making 55 so again we had 630 people take this survey and so the vast majority of them were data analyst so this one’s probably the most accurate out of all of them and I actually don’t like how this looks as the clustered bar chart let’s try the Stacked bar chart and put this as the legend that’s more more what I was going for I don’t know I didn’t want as skinny because when you’re doing this one it typically they have multiple options per um uh x axis and so I think that’s why it was that little skinny line but this one is more what I was looking for but let’s make that smaller and let’s definitely change that title because good night um this is like incredibly long let’s go over here to this format visual we’ll go to the general the title and we’re just going to say average salary by job title just like that and this looks a lot better now we’re not going to kind of format all our whole dashboard yet we’re going to create our visualizations and then we’re going to kind of organize everything and kind of play Tetris with it to make it look the best so we’re just going to minimize this and put it right up here for now um but we will go back and kind of make everything look better at the end and actually while we’re here I also want to change this as well so rename for this we’re GNA say job title Oops why did I do that job title and for this one we’re just going to say name average salary there we go looks much better much cleaner uh took away a lot of the anxiety that I was feeling about two minutes ago when we first put that up 
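As an aside, if you ever want to sanity-check these card and bar-chart numbers outside Power BI, the same aggregations take only a few lines of pandas. This is purely an illustrative sketch, not part of the Power BI workflow: the column names (`unique_id`, `job_title`, `average_salary`) and the toy rows are assumptions, with the salary column arriving as text just as it did in Power Query.

```python
import pandas as pd

# Hypothetical stand-in for the survey data; in Power BI the salary
# column arrived as text, which is why we had to change its data type.
df = pd.DataFrame({
    "unique_id": [1, 2, 3, 4],
    "job_title": ["Data Analyst", "Data Analyst", "Data Scientist", "Data Engineer"],
    "average_salary": ["55000", "56000", "93000", "65000"],  # text, like the raw column
})

# "Count of Survey Takers" card: a distinct count of the ID column.
survey_takers = df["unique_id"].nunique()

# Text -> decimal, then "Average Salary by Job Title" (the bar chart).
df["average_salary"] = pd.to_numeric(df["average_salary"])
salary_by_title = df.groupby("job_title")["average_salary"].mean()

print(survey_takers)       # 4 in this toy example (630 in the real survey)
print(salary_by_title)
```

The `nunique()` call mirrors the card's distinct count, and the type conversion before the `groupby` mirrors the Text-to-Decimal change we just made.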
So, let's move on to our second visualization. There are a thousand different things we could look at in here, salary included, but next I want to know people's favorite programming language. Find Favorite Programming Language and pair it with the unique IDs, so we're counting how many people chose each one. The default columns aren't what we want, so switch to a clustered column chart; that's more like what we're looking for. Rename "Count of Unique ID" to "Count of Voters," tidy up the Favorite Programming Language label, and set the chart title to "Favorite Programming Language" as well. Make it a lot bigger, and at a quick glance you can see Python is by far the most popular, followed by R, Other, C++, JavaScript, and Java.

Right now we're only showing the count, so every column is the same blue. If we want to break it out the way we did with job titles, just drag Job Title down to the legend. The fully broken-out version isn't quite what I was going for; I want the stacked view, where you still see the whole count but can also see who voted for each language. I'm just not a huge fan of the preselected colors or the overall theme of this dashboard; at the very end we'll completely revamp it, change a bunch of colors and the background, and make it look a lot nicer than the plain white we have now. For now, make it a lot smaller and put it into the corner; these won't stay there, we just need room and a cleaner space to create the next visualizations.

The next thing I really want to include is a way to break the data down by country, because something like salary is very dependent on where you live: the average salary for a data analyst in the United States might be around 60,000, while in another country it could be 20,000, and that can pull the overall average down quite a bit, so we need a way to separate them. We could use a filled map, and there's no problem with that at all, but for what we're building it's probably not the best fit beyond sticking it in a corner. Instead I'll use a tree map, which I don't use a lot, but it lets viewers simply click the distinct values (Other, United Kingdom, India, United States) rather than filtering things out on their own or reading geography off a map. For example, click United States and the numbers change quite a bit: the average salary for a data scientist jumps to about 139,000, and for a data analyst to about 80,000. Click India and it's roughly 68,000 for a data scientist and 26,000 for a data analyst. That doesn't mean people in India earn less in real terms; the cost of living is probably lower there, so the same salary in US dollars isn't needed (remember, the whole survey was collected in US dollars). Just something to think about. Click out of the selection; we'll keep this one as well. Now for the next visualization.
This is one I don't get to use enough in my actual job, so we're going to use it in this project: the gauge. Let's add one, and while we're at it, add a second right next to it. Gauges are really good for survey questions like these (I don't get to work with surveys enough). For the first one, add Work Life Balance. Right now it's doing a count, and with no minimum or maximum values set it looks a bit odd, so switch the value to the average score, then drag the field over to Minimum value and again to Maximum value. Now the gauge actually runs from 0 to 10, and it shows that the average person is reasonably happy with their work-life balance, rating it about 5.74 overall. Let's quickly change the ridiculous default title to "Happy with Work Life Balance," or whatever title you want. Then do the same with the salary question: add it, set the minimum and maximum values (just to make sure we know how to use them), and take the average. Not many people are happy with their salary, I'm just finding out; this is a real survey and real data, so it's pretty interesting. Actually, "Happiness with Salary" sounds better, so let's use that, and change the first gauge to "Happiness with Work Life Balance" to match. Some of this I planned out and some I didn't; this part is on the fly. Really interesting results, and we may come back and tweak these a little in the future.

The very last visualization is male versus female; you've kind of got to have that in there. I don't typically like pie charts and donut charts, but I'm feeling it, so let's try one and make it larger. We have male and female, and we could measure anything against them, but let's use average salary again, since we've really only looked at salary once so far, plus a bit of the happiness scores. (I accidentally dragged in Current Age first; I meant average salary.) Set it to Average, and the result is actually really close: about 55 for females versus 53 for males, so the women are a little higher. Congratulations; they're paid just a little bit more.

Now we need to start organizing and cleaning all of this up. It doesn't look bad, but we can do a lot more with it. I'll keep the cards over on the left-hand side, move this one up top (we need to change that title too), and change the theme as we go; for now I just want the layout right. Let's set the tree map's title: go to Title and enter "Country of Survey Takers." I'm not stuck on that name; if you think of something better, go with it, but it definitely doesn't look bad. And where did my other
visualization go? There it goes. I think I want this one taller, so I'll move things around; honestly, I hate having this many visualizations on a canvas, it's just annoying to manage. Let's step these off to the side, resize a few so they match (I didn't want that one cut off), and bring this one over and down a little. I'm not sold on it yet; I added a few visualizations that weren't in my original plan, so I'm doing some of this on the fly, and I may fast-forward the parts where I'm just staring at the layout. I'll bring this one down a bit because I don't like how close it sits to the text above it, put this one up here, and move this one to the very bottom: make some room, stretch it across, and lower it. There's a lot going on, and as we walk through it I'm noticing things I missed, like titles, so let's fix those now. For the donut chart, set the title to "Average Salary by Sex." I don't like the labels where they are, in the middle or on the outside, so under Details let's try Inside. Oh, that looks terrible. Maybe I don't want that after all; I guess we'll do outside. Except then you can't even see the information, and the decimal is crazy long, so let's see whether the Value settings will round it to a whole number or one decimal place.

All right, I think I want to swap this visual out; it just isn't working the way I wanted, and you know that when I make mistakes I keep them in these videos so you can see them. I hoped this would turn out better, but it didn't. What I do want to add, because it makes a nice breakdown, is the difficulty question: "How difficult was it for you to break into data science?" Remove the other fields, drop that in, and now it shows the percentage for each answer: very easy, easy, neither easy nor difficult, difficult, and very difficult. The default ordering and colors make no sense, so let's fix them under Slices: very difficult gets red, difficult gets orange, "neither easy nor difficult" gets a neutral yellow, and easy and very easy get the blues, a darker blue and then a really bright blue for very easy. Look, I'm not a color person, but that reads better to me, and we'll organize it a little more shortly.

We still need to change some things, like the title: "Difficulty to Break Into Data." And rename the field itself to just "Difficulty." Better; not perfect, and there are a thousand ways you could have done it, but that's what we're going with. Let me also sweep through the remaining renames: the job title here, just like the other chart; "Count of Voters" is fine; programming language, difficulty, the happiness gauges, the averages and counts. Okay: what we have now is very close to a finished product. It's not 100% complete, and I want it to look a little nicer than the typical plain white, so go up to the View tab, where there are all these built-in themes, and just play around until you find one you like. This one isn't bad but isn't really my style; this "Frontier" one is pretty neat, I'm kind of digging the natural tones; this other one is too dark for me. You could also change the background colors of the visuals to match whatever you pick. Genuinely, customize this however you want. I kind of like this one; it's groovy. It's not perfect by any means, but you can also customize the current theme: I personally don't love color five, which ended up as the data analyst color. I'm not going to go change it just because, but I don't really like that color per se, and I might want to choose a different one, though it would have to stay muted to match the theme's style. So come in here, customize it, and really mess around with it. For me, I'm going to keep it as it is, because I don't want to fiddle with it and break something. Let me just nudge that up a tiny bit, and this is it; this is the project.

I hope it was helpful. I'm not joking when I say I'm going to do another, much more in-depth project with this data; it'll probably be around two hours, which is crazy long for a YouTube video, but I can see doing a thousand different things with it: building a really great dashboard, and really cleaning the data, which is a large part of actually doing this work and something we barely did here. So really dig into it; see what you like, what you don't, and what you want to clean. You could load it into SQL or Excel and standardize the data to make it a lot more usable. Do whatever you want with it; I ran this survey for you so that we could use it, so go out and make the best dashboard you possibly can. I hope this was helpful and that you enjoyed it. Thank you so much for watching; if you liked this video, be sure to like and subscribe below, and I'll see you in the next one. [Music]
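On the point about loading this data into SQL or Excel: the dashboard's interactions are also easy to reproduce in code. For instance, clicking a country in the tree map just filters the salary averages, which amounts to a two-level group-by. Here is a minimal pandas sketch, with made-up rows and assumed column names (`country`, `job_title`, `average_salary`), echoing the figures read off the dashboard above.

```python
import pandas as pd

# Toy rows mirroring the numbers quoted in the walkthrough; the real
# survey has 630 responses, and these column names are assumptions.
df = pd.DataFrame({
    "country":   ["United States", "United States", "India", "India"],
    "job_title": ["Data Scientist", "Data Analyst", "Data Scientist", "Data Analyst"],
    "average_salary": [139_000, 80_000, 68_000, 26_000],
})

# Clicking "United States" in the tree map is equivalent to filtering on
# country before averaging; a grouped mean covers both countries at once.
by_country = df.groupby(["country", "job_title"])["average_salary"].mean()

print(by_country.loc["United States"])  # per-title averages for that country
print(by_country.loc["India"])
```

Comparing the two `.loc` slices makes the cost-of-living point concrete: the same job title averages very different US-dollar figures depending on country.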

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Iqbal: Faith, Nation, and the Modern Muslim by Maulana Maudoodi

    Iqbal: Faith, Nation, and the Modern Muslim by Maulana Maudoodi

    This text comprises excerpts from an interview and lecture discussing the life and legacy of Allama Iqbal, a prominent Muslim figure in early 20th-century India. The speaker analyzes Iqbal’s impact on Indian Muslims during a tumultuous period marked by political and religious upheaval, highlighting Iqbal’s efforts to combat Western influence and foster a strong sense of Muslim identity and self-reliance. The sources also address misinterpretations of Iqbal’s views, particularly claims that he was a socialist, and emphasize his unwavering commitment to Islam. Furthermore, the text explores Iqbal’s profound spirituality and personal piety, contrasting his public image with his private life of devotion and simplicity. Finally, the speaker urges listeners to uphold Iqbal’s vision of a strong, unified Muslim community.

    A Deep Dive into the Thought of Allama Iqbal: A Study Guide

    Quiz

    Instructions: Answer the following questions in 2-3 sentences each, based on the provided source material.

    1. According to the text, what was the state of Muslims in India between 1924 and 1938, and what caused this state?
    2. How did Muslims react to the failure of the Khilafat movement, according to the source?
    3. What is meant by “Maghribiyat” in the context of the text and why did Iqbal oppose it?
    4. What did Iqbal believe was the root cause of the Muslims’ problems?
    5. What did Iqbal mean when he said that the nation is made by faith?
    6. What was Iqbal’s view on the relationship between religion and politics?
    7. According to the source, what did Iqbal advocate as a solution to the problems faced by the Muslims of his time?
    8. Why does the text assert that Iqbal was not a socialist or a believer in “Islamic socialism”?
    9. According to the text, what was the role of Allama Iqbal and Quaid-e-Azam in the creation of Pakistan?
    10. How did Iqbal’s understanding of Islam deepen over time, as described in the text?

    Quiz Answer Key

    1. The text describes a period of crisis for Muslims in India between 1924 and 1938. Muslims faced disappointment and defeat after the failure of the Khilafat movement. This led to a loss of faith in their leadership and a state of despair.
    2. The failure of the Khilafat movement led to severe disappointment among Muslims who had invested everything in it. Many lost their faith in the leadership that had promoted the movement, and were also left feeling disillusioned and betrayed.
    3. “Maghribiyat” refers to the influence of Western culture and philosophy. Iqbal opposed it because he believed it was causing Muslims to abandon their own traditions and culture.
    4. Iqbal believed the root cause of the Muslims’ problems was their loss of self-recognition. Muslims had become ashamed of their own heritage, culture, religion, and morals, believing instead that the West was superior.
    5. Iqbal emphasized that a nation is made by faith, not by homeland or language. He wanted Muslims to see themselves as a unified community with shared beliefs and culture, distinct from other communities.
    6. Iqbal believed that politics can only be good when guided by God. He stressed that separating politics from faith would lead to barbarism and cruelty.
    7. According to the text, Iqbal advocated for Muslims to follow the Quran and implement the principles of Islam in their lives. He believed that only through Islam could Muslims overcome their problems.
    8. The text emphasizes that Iqbal’s focus was on the implementation of Islam, not on a hybrid of socialism and Islam. According to the source, while he may have used the term “Islamic Socialism,” he did not preach it, and there is no evidence that he believed in it.
    9. The text indicates that Iqbal gave Muslims the vision for Pakistan through his emphasis on Islam and a separate identity. Quaid-e-Azam then brought the vision into reality by creating the actual state.
    10. The text asserts that Iqbal’s understanding of Islam deepened over time and became his sole focus. In the later phases of his life, he became immersed in the Quran. He would not keep any other book in front of him, using it as the basis for all of his thoughts and actions.

    Essay Questions

    Instructions: Answer the following essay questions based on the provided source material. Each essay should demonstrate a comprehensive understanding of the text and be 3-4 paragraphs in length.

    1. Analyze the complex relationship between the Khilafat movement, Hindu-Muslim relations, and the subsequent disillusionment of Muslims in India as described in the provided text. How did these events shape Allama Iqbal’s thinking?
    2. Discuss Allama Iqbal’s critique of Western civilization and the concept of “Maghribiyat.” How did his experiences and perspectives inform this critique, and what solutions did he propose to counteract it?
    3. Explore Iqbal’s concept of Muslim identity and his views on nationalism and faith. How did he advocate for a distinct Muslim identity, and why was it crucial, according to the text, to preserve that identity?
    4. Examine the text’s discussion of Iqbal’s philosophy, particularly his view on the relationship between politics and religion and what he saw as the failings of contemporary Muslim leadership.
    5. Evaluate the text’s portrayal of Allama Iqbal’s evolution as a thinker, from his exposure to Western education to his complete immersion in the Quran. How does this journey inform our understanding of his overall message?

    Glossary of Key Terms

    • Khilafat Movement: A movement in India (1919–1924), led by Indian Muslims, to support the Ottoman Caliphate, which they saw as a symbol of pan-Islamic unity.
    • Maghribiyat: The influence and adoption of Western culture, philosophy, and values. Iqbal saw this as a form of cultural imperialism that Muslims should reject.
    • Nazm: (Often in reference to Iqbal’s writing) Poetry or verse, often used in this text to describe the type of work he produced.
    • Tahrir: In this context, the movement to restore the Caliphate, and liberate Muslim Holy Places from foreign control.
    • Mokama: (Likely a mispronunciation, perhaps of Mecca) The Holy city of Islam.
    • Namazi: One who performs “Namaz,” the Islamic practice of ritual prayer.
    • Roza: Islamic fasting, typically during the month of Ramadan.
    • Shariat: Islamic law, derived from the Quran and the teachings of the Prophet Muhammad.
    • Maulvis: Plural of “Maulvi,” a Muslim religious scholar, particularly one who is well-versed in Islamic law.
    • Ulema: (Also spelled ‘Ulama’) Muslim religious scholars.
    • Hadith: Sayings and actions of the Prophet Muhammad, used as a source of guidance in Islamic law and theology.
    • Agyaats: (Likely a mispronunciation, likely ‘Agni’, which means ‘fire’ or the fire worshippers) A reference to Hindu people in a derogatory way.
    • Kalimi: (Also spelled ‘Kalima’ or ‘Kalime’) An Arabic term referring to Islamic declaration of faith.
    • Faqr: In this context, the state of being devoted to God and independent of worldly desires, in the way that a true fakir lives.
    • Quaid-e-Azam: An honorific title for Muhammad Ali Jinnah, the founder of Pakistan, meaning “Great Leader.”
    • Pakistan: In this context, meaning the creation of a separate and independent Muslim state in India, founded on the concept of distinct Muslim culture and community.
    • Madrasahs: Islamic religious schools.
    • Khatib hazrats: Islamic preachers or orators.
    • Amrit, Naaziyat, First Year: References to specific ideologies that are criticized in the text. They represent Western/European forms of governance that the text argues are not aligned with the principles of Islam.
    • Mustfair: In this context, a place of residence.
    • Akliat: A person’s intellectual ability.
    • Wala Jana: Devotion and affection to the Prophet Muhammad’s family.

    Iqbal: Islamic Revival and the Creation of Pakistan


    Briefing Document: Analysis of Iqbal and His Impact

    Introduction:

    This document analyzes a series of excerpts focusing on the life, works, and impact of Allama Muhammad Iqbal (Rahmatullah Alaih). The sources provide insights into the socio-political context of Iqbal’s era, his intellectual contributions, and his enduring legacy, particularly in relation to the identity and destiny of Muslims in India. The excerpts cover a variety of perspectives on Iqbal, exploring his views on Islam, nationalism, Western influence, and the importance of self-awareness.

    Key Themes and Ideas:

    1. The Critical Period of Muslim History in India (1924-1938):
    • The period was marked by the failure of the Khilafat Movement, which left Muslims disillusioned and vulnerable. Muslims had “invested all their wealth in the Khilafat” and “left no stone unturned in uniting…with those Hindus…only on the hope that somehow we will be able to save the institution of Khilafat.”
    • The Congress and Hindu leaders, with whom Muslims had allied, turned against them, leading to Hindu-Muslim riots and a “double defeat” for the Muslims. They had trusted Gandhi “the most” but he “never had the opportunity to open fire on us Muslims on this issue against the Hindu castes.”
    • This resulted in “severe disappointment” and a loss of faith in the existing leadership, leaving the Muslim community in a state of despair and questioning their future. “Muslims lost their faith in this leadership which had raised the issue of Tahrir and had joined hands with Congress.”
    2. The Rise of Anti-Islamic Trends:
    • The period saw a rise in anti-religious sentiment among Muslims, with open criticism of Islam and its teachings. There was a shift where people felt those who prayed “should be ashamed of his actions, and the one who is not doing so need not be ashamed.”
    • The influence of Communism and Western ideologies impacted Muslim education, promoting secular and anti-religious ideas.
    3. Iqbal as a Force for Islamic Revival:
    • Amidst the turmoil, Iqbal emerged as a powerful force for Islamic revival and preservation of Islamic and religious values. He was seen as the “greatest power…for the Islamic Tariq Islamic Tehri for the call of Islamic passion” during the 14-year period from 1924-1938.
    • He attacked Western culture (“Maghribiyat”), including “female chauvinism”, effectively challenging its dominance over the Muslim mind, while addressing its appeal from the perspective of a man fully familiar with western culture. He “knew more about the west than them and was more aware of the philosophy of the west and the western life than them.”
    • He aimed to break the “mental slavery” of Muslims, encouraging them to recognize their own worth, heritage and the fact that “you are the most powerful person in the whole world.” They had become ashamed of their own traditions, religion, morals, thinking that “if there is anything worth praising in the world, then it has been presented only by the people of the Maghreb.”
    • He emphasized that Islam’s principles are relevant in every era and not an outdated system, stating that “Islam is ancient and the arrival of the prophet, Islam can never become old, its principles are worth implementing in every era.”
    4. Iqbal’s Philosophy of Self-Recognition (Khudi):
    • Iqbal urged Muslims to recognize their own identity, culture, and religious values. He created the feeling that “you have lost yourself and have turned your reality around, understand your comic task, and implement your culture at your home for the sake of its height”.
    • He challenged the notion that Muslims should be ashamed of their heritage, emphasizing the uniqueness and strength of Islamic culture. He taught that “nation is made by faith and our country” not “nation and language”.
    • He aimed to counter the feeling that “the work of the people of the world is to just chant Allah Allah or read the Quran and Hadith in mosques and madrasas” and instead, asserted that there should be no separation of “politics from day,” because “the result of this is there can be no other explanation except barbarism and cruelty.”
    3. Iqbal’s Critique of Nationalism and Patriotism:
    • He critiqued the concept of nationalism, arguing that it could lead to the dissolution of Muslim identity, calling the territorial nation “a ghost” whose condition is “doubtful.” He rejected the assurance that nationalism posed no threat to the Muslim community.
    • He emphasized the importance of Islamic unity, countering communalism and the conflicts that divided Muslims.
    • He instilled a sense of “Islamic community” (Ummah) in Indian Muslims, laying the groundwork for the creation of Pakistan: “If this firm stand had not been taken at the time…then this Pakistan would not have existed today.”
    4. Iqbal’s Views on Politics and Religion:
    • He argued for the integration of religion and politics, suggesting that politics without a moral compass is destructive, “politics can be good only when God is present with it as a guide to keep it on the right path.”
    • He rejected the idea that Islam was a source of backwardness, arguing that the problems of the era arose from a flawed understanding and application of Islam, not from Islam itself; he refuted the accusation that all the oppression, tyranny, deceit, and poverty over which humanity wept were the work of Islam.
    • He believed that the solution to the problems faced by Muslims lay in the implementation of Islamic principles: “If there is any solution to the problems of the Muslims, then it is only in the implementation of the Islamic principles.”
    5. Iqbal’s Stance Against Socialism:
    • The source addresses the claim that Iqbal was a socialist. It argues that such an interpretation is a misrepresentation of his work, which was consistently focused on Islamic principles. He was never convinced that Muslims could be saved by mixing anything into Islam or adopting anything alongside it.
    • It explains that his use of the term “Islamic Socialism” was incidental and not an endorsement of the political system, but rather an assertion that Islam encompasses social justice: “There is no need to go towards any socialism for equality and justice; all this is present in Islam as well; rather, it would be more correct to say that it is present only in Islam.”
    • The source argues that Iqbal’s poetry and writings were often interpreted incorrectly, specifically citing his couplet about burning fields as a metaphor for divine justice, not a call to action for humans. The context of the words was that Allah is ordering His angels that “the oppression and cruelty that is going on in the world is inviting Our punishment.”
    6. Iqbal’s Devotion to Islam and the Quran:
    • The document emphasizes Iqbal’s deep devotion to Islam, particularly during the final phase of his life, noting his shift towards a more Quran-centric approach: in his last phase, Iqbal set aside all books other than the Quran and would keep no other book in front of him.
    • He saw the Quran as the ultimate source of wisdom and guidance, and he approached life and philosophy through its lens. “Whatever he thought, whatever he saw, he saw it from the point of view of the Quran.”
    • His devotion to the Prophet Muhammad was profound and unquestioning.
    7. Iqbal’s Legacy and Pakistan:
    • Iqbal’s vision was instrumental in the creation of Pakistan, which was founded on the idea of a separate Islamic identity. It is said that “Iqbal (may Allah have mercy on him) gave you a country on the basis of this; he gave you thought and vision.”
    • The document warns against deviating from the founding principles of Pakistan, emphasizing the importance of maintaining its Islamic foundation: if the basic vision of this country, in other words its ideological foundation, is removed, then the country cannot survive.
    • It calls on the Muslim community to unite and uphold the principles of Islam.

    Conclusion:

    These sources present a multifaceted view of Allama Iqbal, emphasizing his role as a catalyst for Islamic revival and self-awareness among Muslims in India. The text stresses that he fought Western cultural dominance, promoted the idea of a separate Muslim identity and community, and laid the intellectual foundation for the creation of Pakistan. The sources also highlight the importance of understanding Iqbal in his full complexity and of not reducing his message through simplistic interpretations. His deep love of the Quran and his devotion to Islam are emphasized, as is his rejection of socialism as a doctrine separate from Islam. The enduring significance of his vision for Muslims globally is also noted.

    Allama Iqbal: Life, Thought, and Legacy

    Frequently Asked Questions about Allama Iqbal

    1. What were the key challenges faced by Muslims in India between 1924 and 1938, the period during which Allama Iqbal was particularly active?
    During this period, Indian Muslims experienced significant disillusionment and challenges. They had invested heavily in the Khilafat Movement, hoping to preserve the institution of the Caliphate and protect Muslim holy sites. However, their efforts were ultimately unsuccessful. Furthermore, they faced increasing hostility from Hindus and the Congress party, with whom they had previously cooperated, leading to a series of Hindu-Muslim riots. This resulted in a sense of betrayal and a loss of faith in their leadership, coupled with rising internal discord, a perceived threat of Hindu dominance, and the spread of Western and communist ideas which challenged traditional religious practices and beliefs.
    2. How did Allama Iqbal respond to the challenges faced by the Muslims of India?
    Allama Iqbal emerged as a powerful voice against the prevailing despair. He actively worked to revive Islamic fervor and self-respect among Muslims, primarily through his poetry and philosophical writings, attacking Western culture and its influence on Muslims, which he saw as a form of mental slavery. He sought to reawaken a sense of Islamic identity, pride in their heritage, and the belief that Islam was a viable and relevant way of life for the modern era. He emphasized that a Muslim’s strength lay in their own culture, religion, and morals, not in emulating the West, and he stressed that Islam was not an outdated system but a timeless truth relevant to any era.
    3. What was Allama Iqbal’s view on nationalism and how did it relate to his concept of the Muslim community?
    Iqbal strongly critiqued the concept of territorial nationalism, arguing that it was a “ghost” and a “doubtful condition.” He asserted that a nation is not defined by territory or language, but by faith and shared culture. He emphasized that Muslims, due to their shared beliefs and culture, formed a distinct community (or Ummah) separate from other communities, including Hindus. This viewpoint was meant to counter the idea of Muslims being absorbed into a larger Indian national identity and is often seen as a key step towards the eventual demand for a separate Muslim state.
    4. How did Allama Iqbal view the relationship between Islam and politics?
    Iqbal believed that politics divorced from religion was dangerous, leading to barbarism and cruelty. He argued that politics must be guided by God and that the contemporary problems plaguing humanity were a result of such separation of politics and faith. He rejected the notion that Muslims should confine themselves to religious practices alone, with no engagement in political matters, as he saw Islamic principles as applicable to all aspects of life, including governance. In essence, he advocated for a political order guided by Islamic principles and values.
    5. What was Allama Iqbal’s view of Western thought and philosophy and why did he criticize it?
    While deeply knowledgeable about Western philosophy and culture, Iqbal strongly critiqued it. He believed that its dominance over Muslims was leading to a loss of their own cultural identity and values, in turn causing their mental and spiritual enslavement. He specifically criticized Western materialism, secularism, and what he viewed as their corrupting influence on morality. He sought to expose the flaws of Western civilization and its incompatibility with Islamic values, motivating Muslims to return to their own heritage for solutions. He believed that a society based solely on secularism was doomed to fail.
    6. How did Allama Iqbal’s view of Islam influence the idea of Pakistan?
    Allama Iqbal is considered a key intellectual figure behind the idea of Pakistan. He believed that Muslims could not preserve their culture and identity within a united India where the Hindu majority was increasingly dominant. His 1930 speech, while not explicitly using the word “Pakistan,” laid out the foundation for a separate Muslim state where Islamic principles could guide society, providing Muslims with the space needed to safeguard their identity and culture.
    7. Was Allama Iqbal a socialist, and what does the source say about this claim?
    The sources strongly refute the idea that Allama Iqbal was a socialist, whether of a Western or an Islamic variety. While he occasionally used terms like “Islamic Socialism,” this was to make the point that the justice and social concern socialism claims to address are found within Islam, and on a superior footing, since God is their basis. The sources argue that attributing socialism to him is a misrepresentation of his lifelong commitment to promoting Islam. He did not develop or preach a systematic socialist ideology but rather emphasized Islamic principles and values as the solution to the issues of his time. His criticisms of injustice should not be confused with advocacy of socialism.
    8. What was the importance of the Quran in Allama Iqbal’s life and thought?
    The sources depict the Quran as the absolute center of Iqbal’s life and thought, especially towards the end of his life. He is described as having distanced himself from all other books, finding that the Quran contained all wisdom, and he interpreted everything from a Quranic perspective. His actions were seen as an attempt to live according to its principles, and he held deep devotion and unwavering faith in the teachings of the Quran and of the Prophet Muhammad, even when this went against the conventions of his era. His approach was to live and act in line with Quranic teaching and the example of the Prophet.

    Iqbal: Reviving Islamic Identity

    Allama Iqbal’s life was marked by his efforts to revitalize Islamic thought and identity in the face of various challenges, particularly during the period of British colonial rule in India.

    • Historical Context: From 1924 to 1938, Muslims in India experienced a critical period, marked by the failure of the Khilafat movement and increasing Hindu-Muslim tensions. Muslims faced a “double defeat” with the collapse of the Khilafat and attacks from those they had allied with. This period also saw a rise in Western cultural influence and criticism of Islam, leading to a sense of despair and a loss of faith among Muslims.
    • Iqbal’s Response to the Crisis: In response to this, Iqbal emerged as a powerful force for the revival of Islamic spirit and values. He aimed to combat the mental slavery and feelings of shame that had gripped the Muslim community, encouraging them to recognize their own worth and the value of their culture, religion, and morals. He emphasized the timeless relevance of Islam and its principles, and challenged the notion that it was outdated or incompatible with the modern world.
    • Iqbal’s Critique of Western Culture: Iqbal was critical of the influence of Western culture (“Maghribiyat”), which he saw as a threat to the Muslim identity. He attacked what he perceived as the negative aspects of Western civilization, including materialism and a focus on nationalism at the expense of religious identity. He also criticized Western politics.
    • Iqbal’s Focus on Islamic Identity: Iqbal emphasized the importance of a distinct Muslim identity based on faith and culture. He argued that Muslims were a unique community with their own beliefs and traditions, separate from other groups in India. He stressed the concept of Islamic unity, countering communalism and divisions within the Muslim world. He worked to instill a sense of Islamic pride and purpose in Muslims, particularly the youth.
    • Iqbal’s Philosophy and Vision:
    • Iqbal’s philosophy was centered on the idea of self-realization for Muslims, urging them to understand their true selves and their potential. He believed that Muslims had lost sight of their own heritage and had become overly influenced by Western thought.
    • He advocated for the implementation of Islamic principles in all aspects of life. He believed that the solution to the problems faced by Muslims was in adhering to the Quran and the teachings of Islam.
    • He emphasized that political freedom was not the ultimate goal, but rather the protection of Islam and the ability for Muslims to live according to its principles. He was a proponent of a separate and independent Muslim state, which ultimately led to the idea of Pakistan. He believed that Muslims could not maintain their culture while living with Hindus.
    • Iqbal’s Later Life: In his later years, Iqbal increasingly focused on the Quran, using it as his primary source of knowledge and guidance. He rejected any form of non-Islamic viewpoints. He also emphasized the importance of following the example of the Prophet Muhammad. He was critical of those who saw Islam as a source of sorrow and instead believed it to be a source of guidance and truth.
    • Iqbal’s Legacy:
    • Iqbal’s work was instrumental in shaping the intellectual and political landscape of the Muslim community in India. He is credited with inspiring the creation of Pakistan, the vision of the country having preceded its actual formation.
    • His poetry and writings are known for their depth and powerful articulation of Islamic ideals. He used his art to promote Islamic values and challenge the status quo.
    • He is considered a key figure in the revival of Islamic thought and the development of a modern Muslim identity. He believed in the importance of action and the implementation of Islamic principles in the world.

    Iqbal’s life can be seen as a struggle against cultural and political subjugation, and his lasting legacy lies in his passionate defense of Islamic values and his vision for a vibrant and self-aware Muslim community. He is seen as a figure who used his education, including his knowledge of Western thought, to advocate for the importance of Islam and Muslim identity.

    Muslim Disillusionment in India (1924-1938)

    The sources describe a period of significant disappointment for Muslims in India, particularly between 1924 and 1938. This disappointment stemmed from a combination of political setbacks, social challenges, and a perceived crisis of faith.

    • Failure of the Khilafat Movement: Muslims had invested considerable resources and effort in the Khilafat movement, aiming to protect the institution of the Caliphate and Muslim holy places. The ultimate failure of this movement was a major blow, leading to a sense of disillusionment. The Khilafat, which they had tried to save, was ruined, and the residents of the holy places became divided and engaged in conflict.
    • Betrayal by Allies: Muslims had allied with Hindus and the Congress party during the Khilafat movement. However, after the movement’s collapse, they faced attacks from their former allies, leading to Hindu-Muslim riots. This betrayal contributed to their disappointment, as they had trusted leaders like Gandhi, who did not stand up for the Muslims against oppression.
    • Double Defeat: Muslims experienced a “double defeat,” having failed to achieve their goals in the Khilafat movement and facing hostility from those with whom they had allied. This left them in a state of despair and broke their courage.
    • Loss of Faith in Leadership: The disappointment led to a loss of faith in the leadership that had advocated for the Khilafat and allied with Congress. Muslims felt that their leaders had failed them, contributing to a sense of being lost and without direction.
    • Fear for the Future: There was a widespread fear that non-Muslims were working to completely occupy India, while Muslims were ill-prepared to face the situation. This fear further intensified their sense of disappointment and helplessness.
    • Internal Crisis: In addition to the political and social challenges, Muslims also faced an internal crisis. There was a rise in open criticism of Islam and a decline in religious observance. People began to question the value of traditional practices like prayer and fasting, and some felt ashamed of their religious identity.
    • Influence of Western Culture: The rise of Western culture and communism influenced the education of Muslims, and religious texts began to be openly challenged. This further contributed to the sense of crisis and the weakening of traditional values and faith.
    • Political Disunity: Muslim leaders were also in disarray. Those who had previously defended Islam either fell silent or became opponents of the Muslims, and some abandoned the call to Islam in favor of calls to community and nation. This lack of unified and effective leadership added to the community’s challenges.

    In the midst of this widespread disappointment and despair, Allama Iqbal emerged as a powerful figure, working to revive the Islamic spirit and address the root causes of Muslim disillusionment. He challenged the mental slavery imposed on Muslims and urged them to recognize their own value and potential, aiming to restore their faith in themselves and their religion.

    The Khilafat Movement: Failure and Disillusionment

    The Khilafat Movement was a significant effort by Muslims in India to protect the institution of the Caliphate and Muslim holy places, but it ultimately ended in disappointment. The movement’s failure, coupled with other factors, led to a period of disillusionment and crisis for the Muslim community.

    Here are the key aspects of the Khilafat Movement:

    • Goal: The primary goal of the Khilafat Movement was to save the institution of the Caliphate (Khilafat) and to liberate Muslim holy places from what they perceived as the clutches of the enemy. Muslims invested significant resources and efforts into this cause.
    • Muslim Investment: Muslims dedicated their wealth and lives to the Khilafat movement. They spared no effort in their attempt to save the Khilafat and free their holy places. They united with Hindus, despite historical differences, hoping that this alliance would help them achieve their goals.
    • Alliance with Hindus: Setting aside centuries of experience and feeling regarding Hindus and their relationship with Islam, Muslims united with them in the hope of saving the Khilafat and freeing their holy places. They even trusted leaders like Gandhi and made him their leader.
    • Failure and Disappointment: Despite their efforts, the Khilafat Movement ultimately failed. The institution of the Khilafat, which they had fought to protect, was ruined. The residents of the holy places became divided, engaging in conflict and animosity among themselves.
    • Double Defeat: The failure of the Khilafat Movement was a major blow to the Muslims, leading to what is described as a “double defeat”. Not only did they fail to achieve their goals, but they also faced attacks from the Hindus and the Congress party with whom they had allied.
    • Betrayal and Riots: After the collapse of the Khilafat movement, the Congress and Hindus, with whom the Muslims had allied and fought, turned against them, leading to a series of Hindu-Muslim riots beginning in 1924. The leaders of the Congress did not address the oppression faced by the Muslims.
    • Loss of Faith: The movement’s failure led to a significant loss of faith among Muslims, both in their leadership and in the alliances they had formed. They were disappointed by the outcome of their efforts and by the betrayal of their former allies. This left them in a state of despair and broke their courage.

    The Khilafat Movement’s failure was a major factor contributing to the disappointment and disillusionment experienced by Muslims in India during the 1924-1938 period. The collapse of the movement, along with the subsequent betrayal by former allies, created a crisis of faith and identity among Muslims, which Allama Iqbal sought to address through his work.

    Iqbal’s Islamic Revival in India

    The sources describe an Islamic revival led by Allama Iqbal in response to a period of significant disappointment and crisis for Muslims in India. This revival was marked by a renewed emphasis on Islamic identity, values, and principles, and a rejection of Western cultural and political dominance.

    Key aspects of this Islamic revival include:

    • Context of Crisis: The revival occurred in the context of the failure of the Khilafat Movement, which left Muslims disillusioned and facing attacks from former allies. There was a widespread sense of despair, a loss of faith in leadership, and a fear for the future. Additionally, Western cultural influence and criticism of Islam led to a questioning of traditional values and practices.
    • Iqbal’s Role: Allama Iqbal emerged as a key figure in this revival, working to counter the mental and spiritual decline of the Muslim community. He aimed to restore their sense of self-worth, pride in their heritage, and faith in Islam. He used his knowledge of both Islamic and Western thought to address the challenges faced by Muslims.
    • Emphasis on Self-Realization: Iqbal’s philosophy focused on the idea of self-realization for Muslims, encouraging them to recognize their true potential and identity. He argued that Muslims had lost sight of their own heritage and had become overly influenced by Western thought and culture.
    • Rejection of Western Culture: Iqbal was critical of Western culture (“Maghribiyat”), which he saw as a threat to Muslim identity. He attacked the materialism and perceived negative aspects of Western civilization, including Western politics. He also spoke out against what he saw as the negative influence of Western ideas on Muslim women.
    • Focus on Islamic Identity: Iqbal emphasized the importance of a distinct Muslim identity based on faith and culture. He argued that Muslims were a unique community with their own beliefs and traditions, separate from other groups in India. He stressed the concept of Islamic unity, countering communalism and divisions within the Muslim world. He worked to instill a sense of Islamic pride and purpose, particularly in the youth.
    • Timeless Relevance of Islam: Iqbal stressed the timeless relevance of Islam and its principles, challenging the idea that it was outdated. He argued that Islam’s principles were applicable in every era. He believed that the solution to the problems faced by Muslims lay in adhering to the Quran and the teachings of Islam.
    • Political Vision: Iqbal also had a political vision. He believed that Muslims could not maintain their culture while living with Hindus in India. This view led to his advocacy for a separate and independent Muslim state, which ultimately contributed to the idea of Pakistan. He saw the need for a country where Muslims could live according to the principles of Islam.
    • Critique of Nationalism: He challenged the concept of nationalism, arguing that it was a “ghost” that could dissolve Muslims into the larger Hindu community. He emphasized that the basis of a nation should be faith, not language or territory.
    • Return to the Quran: In his later life, Iqbal increasingly focused on the Quran, using it as his primary source of knowledge and guidance. He is described as having separated all other books from the Quran, dedicating himself to understanding and living by its teachings.
    • Legacy of Revival: Iqbal’s work was instrumental in shaping the intellectual and political landscape of the Muslim community in India. He is credited with inspiring the creation of Pakistan, and his work is viewed as essential to the formation and survival of the country. His legacy is viewed as a passionate defense of Islamic values and a call for a vibrant and self-aware Muslim community.

    Overall, the Islamic revival led by Iqbal was a comprehensive movement that sought to address the challenges faced by Muslims in India through a renewed focus on their faith, culture, and identity. His emphasis on self-realization, Islamic unity, and the timeless relevance of Islam had a profound impact on the Muslim community, and his ideas continue to be influential today.

    Iqbal’s Philosophy: Self-Realization and Islamic Revival

    Allama Iqbal’s philosophy was a comprehensive response to the challenges faced by Muslims in India during a period of significant crisis and disappointment. His philosophy aimed to revitalize the Muslim community by emphasizing self-realization, a return to Islamic principles, and a rejection of Western cultural dominance.

    Here are the key components of Iqbal’s philosophy:

    • Self-Realization (“Khudi”): A central theme in Iqbal’s philosophy is the idea of self-realization. He believed that Muslims had lost sight of their true potential and had become ashamed of their own culture, religion, and morals. He argued that Muslims had been subjected to a form of “mental slavery” by adopting Western ideas and values, and he called on them to recognize their own inherent worth and strength. He encouraged them to take pride in their Islamic heritage and to understand their unique role in the world. He stressed that a nation is made by faith and not by language or territory.
    • Rejection of Western Culture (“Maghribiyat”): Iqbal was a sharp critic of Western culture, which he saw as a major threat to Muslim identity and values. He attacked the materialism and moral decay that he associated with the West. He argued that Muslims should not blindly adopt Western ways but should instead draw strength from their own traditions and principles. He believed that the dominance of Western culture was a form of slavery that prevented Muslims from recognizing their own worth.
    • Timeless Relevance of Islam: Iqbal emphasized the timeless nature of Islam and its principles. He argued that Islam was not an outdated or irrelevant system but a source of guidance and strength that was applicable to all eras. He believed that the solution to the problems faced by Muslims lay in adhering to the Quran and the teachings of Islam. He saw the Islamic system as providing the framework for a just and prosperous society.
    • Emphasis on Islamic Identity and Unity: Iqbal stressed the importance of a distinct Muslim identity based on faith and culture. He argued that Muslims were a unique community with their own beliefs and traditions, and they should not be absorbed into other communities. He called for unity among Muslims worldwide, countering divisions and communalism. He also advocated for a political structure that would allow Muslims to live according to Islamic principles.
    • Critique of Nationalism: Iqbal was critical of the concept of nationalism, which he saw as a threat to Muslim unity. He believed that nationalism could lead to the dissolution of the Muslim community into the larger Hindu community. He argued that faith should be the basis of a nation, not language or territory.
    • Political Vision: Iqbal believed that Muslims could not maintain their culture while living as a minority in India. He advocated for a separate and independent Muslim state where Muslims could live according to Islamic principles. This vision ultimately led to the idea of Pakistan.
    • Return to the Quran: In his later life, Iqbal increasingly focused on the Quran as his primary source of knowledge and guidance. He is described as having separated himself from all other books, dedicating himself to understanding and living by its teachings. He believed the Quran contained all the answers for the problems of his time.
    • Concept of “Faqr”: Iqbal used the word “Faqr” extensively; according to him, it does not mean poverty or asceticism, but having faith in Allah in all circumstances, being self-respecting before others, and being humble only before God.

    Iqbal’s philosophy was not just a theoretical framework but a call to action. He sought to inspire a sense of purpose and pride among Muslims, urging them to take control of their own destiny and to create a just and prosperous society based on Islamic principles. His work had a profound impact on the Muslim community in India, shaping both the intellectual and political landscape of the time. He is credited with inspiring the creation of Pakistan and is viewed as a key figure in the Islamic revival of the 20th century.

    Iqbaaliyaat Audiobook By Maulana Maududi || اقبالیات از مولانا مودودی
    Zindagi Baad A Maut book by Maulana Syed Abul-Ala Maududi – Audiobook
    Touhid o Risalat by Syed Abul Aala Maududi – توحید و رسالت – Audio Book in Urdu
    Deeniyat book by Maulana Syed Abul-Ala Maududi – Audiobook دینیات – سید ابو الاعلىٰ مودودی
    Al Jihad Fil Islam by Abul Aala Maududi Chapter 1/7
    Al-Jihad Fil Islam by Abul Aala Maududi Chapter 2/7
    Al-Jihad Fil Islam by Abul Aala Maududi Chapter 3/7
    Al-Jihad Fil Islam by Abul Aala Maududi Chapter 4/7
    Al-Jihad fil Islam by Abul Aala Maududi Chapter 5/7
    Al Jihad fil Islam by Abul Aala Maududi Chapter 6/7
    Al Jihad Fil Islam by Abul Aala Maududi Last Chapter 7/7

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog