Author: Amjad Izhar

  • Angular .NET Full Stack: Inventory and Customer CRUD Tutorial

    Angular .NET Full Stack: Inventory and Customer CRUD Tutorial

    The tutorial provides a comprehensive guide to building a full-stack web application using Angular 17 and .NET Core. It focuses on implementing CRUD (Create, Read, Update, Delete) operations for inventory and customer management. The tutorial demonstrates integrating Bootstrap for styling and utilizes SQL Server for data persistence. Key concepts covered include routing, data binding, HTTP requests, and the use of stored procedures. The explanation offers step-by-step instructions, from setting up the Angular components to creating the .NET Core API and interacting with the database, including UI enhancements for an improved user experience. The final implementation focuses on achieving real time updates for data being shown in the UI.

    Angular 17 & .NET Core Full Stack Application Study Guide

    Quiz

    1. What are the two prerequisites needed to begin building an Angular application according to the video?
    2. Explain what the command npm install -g @angular/cli does and why it’s important.
    3. When creating a new Angular application using the command ng new [application-name], what is the first question asked and why is this relevant to web development?
    4. What is the command code . used for in the context of this tutorial?
    5. Describe the significance of the src folder in an Angular application, particularly mentioning the index.html and app folders.
    6. What does the command ng serve do, and what benefit does it provide during development?
    7. Explain how bootstrap can be integrated into an angular app.
    8. Explain how the NG generate component command can add to the folder structure of an app.
    9. Explain how routing works in an angular app.
    10. Explain how to submit data from an Angular app to a .NET Core API.

    Quiz – Answer Key

    1. The two prerequisites are Node.js, which can be downloaded from nodejs.org, and Visual Studio Code, which can be downloaded from the Visual Studio Code website; these tools are required for running and developing Angular applications.
    2. The command npm install -g @angular/cli installs the Angular CLI (Command Line Interface) globally on the system; this allows you to use Angular commands like ng new to create and manage Angular projects.
    3. The first question asked is which stylesheet format you want to use (e.g., CSS, Sass); this is important because it determines how you’ll style your Angular application.
    4. The command code . opens Visual Studio Code from the command prompt, automatically loading the current directory (the Angular application folder) into the editor.
    5. The src folder contains the source code of the Angular application; index.html is the top-level HTML file, and the app folder contains the components and logic for the application, the code actually displayed.
    6. The command ng serve builds the Angular application and starts a development server, automatically reloading the browser whenever changes are made to the code.
    7. Bootstrap is installed through the command npm install bootstrap –save. The Javascript and CSS files of bootstrap are integrated into the angular.js file.
    8. The command NG generate component [path]/[component name] creates a new folder within the folder designated by the path variable, and then adds a folder with all of the component files into this folder.
    9. Routing allows for traversing to particular pages without reloading the entire application, keeping the main template intact; it requires defining routes in app.routes.ts and using routerLink in HTML templates.
    10. Data is submitted from an Angular app to a .NET Core API using the HttpClient module; this involves creating a data structure, defining the API URL, and using the post method to send the data, subscribing to the result to handle the response.

    Essay Questions

    1. Describe the process of setting up a full-stack application using Angular 17 for the front end and .NET Core for the back end, including installation of prerequisites, project creation, and initial configuration. Explain common problems and what to do when errors occur.
    2. Explain the significance of components and routing in Angular applications, detailing how they contribute to creating a modular and navigable user interface.
    3. Outline the steps involved in integrating Bootstrap into an Angular application and utilizing its CSS classes to style the user interface, focusing on the benefits of using a CSS framework.
    4. Discuss the implementation of CRUD (Create, Read, Update, Delete) operations in an Angular application, covering the front-end logic, back-end API calls, and database interactions.
    5. Compare and contrast the different approaches used for creating and managing dialog boxes in the Angular application described in the tutorial, highlighting the advantages and disadvantages of each method.

    Glossary of Key Terms

    • Angular CLI: A command-line tool for creating, managing, and building Angular applications.
    • Node.js: A JavaScript runtime environment that allows you to run JavaScript on the server-side.
    • npm (Node Package Manager): A package manager for JavaScript that is used to install and manage dependencies for Node.js projects.
    • Component: A reusable building block of an Angular application that encapsulates HTML, CSS, and TypeScript code.
    • Routing: A mechanism for navigating between different views or pages in an Angular application without reloading the entire page.
    • Bootstrap: A popular CSS framework that provides pre-built styles and components for creating responsive and visually appealing user interfaces.
    • .NET Core: A cross-platform, open-source framework for building modern web applications and APIs.
    • API (Application Programming Interface): A set of rules and specifications that allows different software systems to communicate with each other.
    • CRUD Operations: The four basic operations performed on data: Create, Read, Update, and Delete.
    • Model: A class that defines the structure and behavior of data in an application.
    • Data Binding: The process of connecting data between the component class and the HTML template in an Angular application.
    • Two-Way Binding: An ability to have data flow two ways, from HTML to the component code, or from the component code to the HTML.
    • HttpClient: An Angular module that allows you to make HTTP requests to a server to retrieve or send data.
    • JSON (JavaScript Object Notation): A lightweight data-interchange format that is easy for humans to read and write and easy for machines to parse and generate.
    • Dependency Injection: A design pattern in which objects receive other objects that they depend on (dependencies) instead of creating them themselves.
    • Intellisense: A code completion feature in IDEs like Visual Studio Code that provides suggestions and assistance while coding.
    • TypeScript: A superset of JavaScript that adds optional static typing, classes, and interfaces to improve code maintainability and scalability.
    • SPA (Single-Page Application): A web application that loads a single HTML page and dynamically updates the content as the user interacts with the application.
    • Localhost: A hostname that refers to the current computer being used.
    • REST API: An API that conforms to the constraints of REST (Representational State Transfer) architecture, enabling interaction with resources through standard HTTP methods.
    • Course: A set of HTTP response headers that are required to allow cross-domain communications.
    • JavaScript: A high-level, often just-in-time compiled programming language that conforms to the ECMAScript specification.
    • HTML: The standard markup language for documents designed to be displayed in a web browser.
    • CSS: A style sheet language used for describing the presentation of a document written in a markup language such as HTML.
    • DOM: A programming interface for HTML and XML documents.
    • Stringify: Converting objects to string format.
    • ASync: Describes one or more processes that do not occur at the same time.
    • Varchar: Variable length character string data type.
    • Integer: A numerical data type representing an integer.
    • Void: When a function does not return a value.
    • Json: A minimal, readable format for structuring data.
    • DB: Short for database.
    • TS File: The Typescript file for an angular component, where program logic is kept.
    • HTML Template: The HTML code associated with a component that is displayed in the application.
    • Module: The imports and exports used in a component.

    Angular 17 and .NET Core Full Stack Tutorial

    Okay, here’s a detailed briefing document summarizing the provided transcript, focusing on the key themes, ideas, and important facts, along with relevant quotes.

    Briefing Document: Angular 17 and .NET Core Full Stack Application Tutorial

    I. Overview

    The transcript details the construction of a full-stack web application using Angular 17 for the front-end and .NET Core for the back-end API. The tutorial walks through setting up the development environment, creating Angular components, integrating Bootstrap for styling, implementing routing, performing CRUD (Create, Read, Update, Delete) operations, and connecting the front-end to the back-end API.

    II. Main Themes and Key Concepts

    • Setting up the Development Environment: The tutorial begins by emphasizing the necessary prerequisites for Angular development. This involves installing Node.js and Visual Studio Code. “For building the angular application you have to make sure that you have prerequisites installed on your system so first of all you’ll have to go to this particular website nodejs.org and download the execut ible file for installing nodejs on your system”. The Angular CLI (Command Line Interface) is installed globally using npm install -g @angular/cli.
    • Angular Project Creation and Structure: The tutorial demonstrates how to create a new Angular project using the ng new command, specifying the project name (e.g., ng new hoc-gadget-shop) and stylesheet format (CSS). The importance of the src folder and index.html file as the top-level HTML template is highlighted. “If you take a closer look over here you’ll see that SRC folder is there and this is the folder that you want to focus on right now we have index.html file so how you can understand is that this is the top tier file of HTML template whatever is rendered in the browser this it is being rendered at the top level from here”.
    • Component Creation: The tutorial explains the concept of components in Angular and how to generate new components using the ng generate component command. It demonstrates creating an “inventory” component within an “app-components” folder. “For generating the new inventory component what we’ll have have to do is we’ll go to the solution and stop the running process with control c as I told before we’ll give the command NG generate component app components slash inventory”
    • Routing: The routing module is discussed as a mechanism for navigating between different components in the application without full page reloads. The tutorial uses routerLink in the HTML templates and configures routes in app.routes.ts to associate paths with specific components.
    • Bootstrap Integration: The tutorial demonstrates how to integrate Bootstrap for styling the application. This involves installing Bootstrap using npm install bootstrap –save and including the Bootstrap CSS and JavaScript files in the angular.json file. The tutorial then uses Bootstrap components like navbar and form elements. “npm install bootstrap save by using this command I should be able to successfully integrate bootstrap in my application”.
    • Data Binding: The tutorial explains how to bind input fields to variables using two-way binding (ngModel) and how to display data on the UI. Two way data binding is used to access the input fields. “You might already be aware about two-way binding and that is what we are going to perform here for that so I’m just giving the data structure just giving it some default values we have product ID product name available quantity it could be zero and then we have reorder Point”
    • CRUD Operations: The tutorial covers the implementation of CRUD operations:
    • Create: Submitting a form to create new inventory items or customers and saving the data to the database.
    • Read: Fetching data from the database and displaying it in tables on the UI.
    • Update: Editing existing data and updating the database with the modified values.
    • Delete: Removing data from the database.
    • .NET Core API Development: The tutorial guides the creation of a .NET Core Web API to handle the back-end logic. It covers creating API controllers, defining models, connecting to a SQL Server database, and implementing stored procedures for data access.
    • Database Interaction: The tutorial covers saving the inventory data into the database table through a store procedure.
    • Front-End to Back-End Communication: The tutorial demonstrates how to connect the Angular front-end to the .NET Core API using HttpClient. It shows how to make HTTP POST, GET, PUT, and DELETE requests to interact with the API endpoints. The integration of HTTP client to post the data for the HTTP method post to the server. “In this we’ll use this HTTP client post the data and we have to pass the URL so we’ll have to define the URL over here as well”.
    • Asynchronous Operations: The tutorial utilizes RxJS observables and the subscribe method to handle asynchronous HTTP requests.
    • CORS Configuration: The tutorial addresses Cross-Origin Resource Sharing (CORS) issues by configuring the .NET Core API to allow requests from the Angular development server (localhost:4200).
    • NG Bootstrap: Demonstrates usage of NG Bootstrap to create Dialog boxes and other components for user input. Includes installing ng-bootstrap using NPM. “using NG bootstrap for creating a dialog box that would be displayed once the user clicks on delete button”.
    • Dialog Box/Modal Implementation: Implements using NG Bootstrap to create Modal Dialogs, for user confirmation on Delete operations or inputting data to Add/Edit records.
    • Disabling Form Fields: Provides examples of how to disable form input fields using the disabled attribute in the Angular HTML template.

    III. Important Facts and Code Snippets

    • Angular CLI Command: ng new <app-name> (to create a new Angular project).
    • Install Bootstrap: npm install bootstrap –save
    • Command to generate new Angular components: NG generate component app components slash inventory
    • Run Angular application: NG serve
    • Include Bootstrap CSS in angular.json: (Example path) “node_modules/bootstrap/dist/css/bootstrap.min.css”
    • Include Bootstrap JS in angular.json: (Example path) “node_modules/bootstrap/dist/js/bootstrap.bundle.min.js”
    • RouterLink Syntax: <a routerLink=”/inventory”>Inventory</a>
    • HTTP Post: this.http.post(API_URL, body, httpOptions).subscribe(…)
    • CORS Configuration in .NET Core: (Snippet from Program.cs)
    • builder.Services.AddCors(options => {
    • options.AddPolicy(MyAllowSpecificOrigins,
    • policy =>
    • {
    • policy.WithOrigins(“http://localhost:4200&#8221;)
    • .AllowAnyHeader()
    • .AllowAnyMethod();
    • });
    • });
    • Example SQL Insert Statement in .NET Core:SqlCommand cmd = new SqlCommand(“SP_saveInventoryData”, conn);
    • cmd.CommandType = CommandType.StoredProcedure;
    • cmd.Parameters.AddWithValue(“@productID”, requestDto.ProductID);
    • // … other parameters
    • cmd.ExecuteNonQuery();

    IV. Potential Gaps/Areas for Further Exploration

    • Error Handling: The tutorial provides basic console.log for error handling. A more robust error handling strategy, including displaying user-friendly error messages, could be explored.
    • Validation: Form validation is mentioned, but the tutorial does not dive deep into implementing specific validation rules. Client-side and server-side validation techniques could be explored.
    • Authentication/Authorization: The tutorial lacks any discussion of authentication or authorization. Implementing user authentication and role-based access control would enhance the security of the application.
    • Testing: The tutorial does not cover unit testing or end-to-end testing. Adding tests would improve the reliability and maintainability of the application.
    • State Management: For larger applications, a more sophisticated state management solution (e.g., NgRx, Akita) might be beneficial.
    • More Complex UI Elements: The tutorial uses basic Bootstrap components. Integrating more advanced UI libraries or custom components could be explored.

    V. Conclusion

    The transcript provides a comprehensive guide to building a full-stack web application using Angular 17 and .NET Core. It covers the fundamental concepts, tools, and techniques necessary to create a functional CRUD application. While there are potential areas for further exploration, the tutorial provides a solid foundation for developers looking to learn these technologies.

    Angular 17 and .NET Core Full Stack Guide

    FAQ on Building a Full Stack Application with Angular 17 and .NET Core

    • Question 1: What are the prerequisites for building an Angular 17 application?
    • Answer: Before building an Angular 17 application, you need to have Node.js installed on your system. You can download the installer from nodejs.org. Additionally, you need a code editor such as Visual Studio Code. The tutorial uses Visual Studio Code, and you can download the installer for your operating system from the Visual Studio Code website.
    • Question 2: How do I install Angular on my system?
    • Answer: Once you have Node.js and Visual Studio Code installed, you can install the Angular CLI (Command Line Interface) globally using the command npm install -g @angular/cli in your command prompt or terminal. The tutorial mentions using Angular 17.
    • Question 3: How do I create a new Angular application?
    • Answer: To create a new Angular application, navigate to the desired folder in your command prompt or terminal and use the command ng new <application-name>. Replace <application-name> with the desired name for your application (e.g., ng new HGadgetShop). During the creation process, you’ll be prompted to choose a stylesheet format (CSS is recommended), and whether to enable server-side rendering.
    • Question 4: What is the basic structure of an Angular application?
    • Answer: A newly generated Angular application has a specific structure. The src folder is where most of your application code resides. The index.html file is the top-level HTML file that is rendered in the browser, and it contains the <app-root> tag. The application component is the core component with its template (.html), TypeScript file (.ts), and CSS file. The app.component.html file initially renders the content you see in the browser.
    • Question 5: How do I run an Angular application?
    • Answer: To run an Angular application, navigate to the application’s directory in your command prompt or terminal and use the command ng serve. This command builds the application and starts a development server. It will provide a URL (usually localhost:4200) that you can open in your browser to view the application. The ng serve command also watches for changes in your code and automatically reloads the browser when changes are detected.
    • Question 6: How do I integrate Bootstrap into an Angular application?
    • Answer: To integrate Bootstrap, first stop the running Angular application with Ctrl+C. Then, install Bootstrap using the command npm install bootstrap –save. After the package is installed, you need to include the Bootstrap CSS and JavaScript files in the angular.json file. In the styles array, add the path to node_modules/bootstrap/dist/css/bootstrap.min.css. In the scripts array, add the path to node_modules/bootstrap/dist/js/bootstrap.bundle.min.js.
    • Question 7: What is routing in Angular and how does it work?
    • Answer: Routing in Angular allows you to navigate between different “pages” or components within your application without reloading the entire page. To implement routing, you need to use the <router-outlet> tag in your main template (e.g., app.component.html). This tag acts as a placeholder where the content of the routed components will be displayed. You define the routes in the app.routes.ts file, specifying the path and the corresponding component. You then use <a routerLink=”<path>”> in your templates to create links that navigate to the specified routes. Additionally, you need to import RouterModule in your app.component.ts file.
    • Question 8: How do I perform CRUD (Create, Read, Update, Delete) operations in a full-stack Angular and .NET Core application?
    • Answer: To perform CRUD operations, you’ll need to create corresponding APIs in your .NET Core backend (e.g., using HTTP POST for create, GET for read, PUT for update, and DELETE for delete). In Angular, you’ll use the HttpClient service to make HTTP requests to these APIs. For example, to create a new record, you’ll use HttpClient.post() to send data to the create API. To read data, you’ll use HttpClient.get() to fetch data from the read API. The responses from these APIs can then be used to update the UI. Stored procedures are generally created for these processes in SQL Server.

    Angular Routing: Navigation in Single-Page Applications

    Angular routing is a concept that allows navigation between different pages or components in an Angular application without reloading the page. Here’s a breakdown of how it works, according to the sources:

    • Purpose Routing enables traversal from one page to another while keeping the main template intact. It helps in displaying HTML structures provided in a particular component upon clicking a specific item, like navigating to an inventory component.
    • Key Components:
    • Router Outlet: In the main template (e.g., app.component.html), the <router-outlet> tag designates where the content of the navigated component should be displayed. After the nav bar, any HTML template can be displayed through this outlet.
    • Router Link: In the HTML template, routerLink is used to specify the path to navigate to when a link is clicked. Instead of href, routerLink is used, along with the path.
    • Route Configuration: The app.routes.ts file defines the paths and their corresponding components. It consists of an array where each object maps a path to a component.
    • Configuration Steps:
    1. Set up the main template: Ensure app.component.html is the main template. After the nav bar, include <router-outlet>.
    2. Define paths: In app.routes.ts, specify the path and the component to be loaded for that path. For example:
    3. // Example route configuration
    4. { path: ‘inventory’, component: InventoryComponent }
    5. Link to routes: In the HTML, use routerLink to point to the defined paths. For example, in the HTML structure, point the routerLink to the path defined in the routes file.
    6. Import Modules: Import RouterModule in app.component.ts. Also, import the specific component.
    • Benefits Angular routing offers the benefit of traversing from one page to another without reloading the page.

    .NET API Development with ASP.NET Core

    A .NET API, as described in the sources, involves creating a web API using ASP.NET Core to handle requests from an Angular front end and interact with a database. The process includes several key steps:

    • Project Setup Create an ASP.NET Core Web API project, potentially within the same directory as the Angular project. The source mentions using Visual Studio for this.
    • Controller Creation Add an API controller to handle incoming requests. The source uses an empty API controller template and names it InventoryController.
    • Function to Process Requests Define a function within the controller to accept and process requests from the Angular application. This function should:
    • Return an ActionResult.
    • Have a descriptive name, such as SaveInventoryData.
    • Use the [HttpPost] attribute to handle HTTP POST requests.
    • Include a return statement (e.g., return Ok(“inventory details saved successfully”)).
    • Database Connection Establish a SQL connection within the function to interact with the database. This involves:
    • Importing necessary libraries.
    • Defining a connection string.
    • Creating a SQL connection object.
    • Stored Procedure Execution Call a stored procedure to save or retrieve data. This involves:
    • Defining a SQL command object.
    • Specifying the command type as a stored procedure.
    • Setting the command text to the name of the stored procedure.
    • Opening and closing the connection.
    • Executing the command using ExecuteNonQuery for data manipulation or ExecuteReader for data retrieval.
    • Model Definition Create a model (class) to represent the data being passed between the Angular application and the API. This class should:
    • Reside in a “Models” folder.
    • Have properties that match the fields being sent from the front end.
    • Be used as a parameter in the API function to accept the incoming data.
    • Parameter Passing Pass the data from the model to the stored procedure. This involves:
    • Specifying the parameter name in the SQL command.
    • Providing the value from the model for each parameter.
    • CORS Configuration Configure Cross-Origin Resource Sharing (CORS) to allow requests from the Angular application, which typically runs on a different port. This involves:
    • Adding the AddCors service in Program.cs.
    • Defining a CORS policy that allows requests from the Angular application’s URL (e.g., http://localhost:4200).
    • Using the UseCors middleware before UseAuthorization in Program.cs.
    • Testing Test the API using tools like Swagger to ensure it’s working correctly. This involves:
    • Running the application and navigating to the Swagger endpoint.
    • Providing sample data and executing the API.
    • Verifying the response code and data.
    • Checking the database to confirm that the data has been saved or retrieved correctly.
    • Data Retrieval To retrieve data, a similar process is followed using HTTPGet and ExecuteReader. The data from the database is mapped to a Data Transfer Object (DTO) and serialized into JSON format for the response.
    • Data Updates and Deletion For updating and deleting data, HTTPPut and HTTPDelete methods are used, respectively, along with appropriate stored procedures and parameter passing.

    SQL Database Integration in Full-Stack Applications

    Based on the sources, here’s an overview of working with a SQL database in the context of building a full-stack application:

    • Database Choice: The tutorial uses SQL Server as the relational database management system.
    • Database Creation: A new database (e.g., GadgetShopDB) is created to store the application’s data.
    • Table Creation: Tables are created within the database to hold specific data entities. For example, an InventoryTable is created with columns like ProductID, ProductName, AvailableQuantity, and ReorderPoint. It is also mentioned to create CustomerDetails table.
    • Data Types: Appropriate data types are assigned to each column based on the type of data they will store (e.g., INT, VARCHAR, DATE).
    • Primary Key: A primary key is defined for each table, usually with an identity (auto-incrementing) property to ensure unique and non-null values.
    • Stored Procedures: Stored procedures are created to perform specific database operations.
    • Purpose: Stored procedures encapsulate SQL queries and provide a more secure and efficient way to interact with the database.
    • Naming Convention: A naming convention (e.g., SP_SaveInventoryData) can be followed to define stored procedure names.
    • Parameters: Stored procedures accept parameters as inputs, which are used in the SQL queries within the procedure.
    • Types: Different stored procedures are created for different operations such as:
    • Saving data (e.g., SP_SaveInventoryData)
    • Getting data (e.g., SP_GetInventoryData)
    • Updating data (e.g., SP_UpdateInventoryData)
    • Deleting data (e.g., SP_DeleteInventoryDetails)
    • Refresh: It might be necessary to refresh the database after creating a table for the objects to be valid.
    • SQL Queries: SQL queries are used within stored procedures to perform the actual database operations.
    • INSERT: Inserts new records into a table.
    • SELECT: Retrieves data from a table.
    • UPDATE: Modifies existing records in a table.
    • DELETE: Removes records from a table.
    • WHERE Clause: Used to filter records based on specific conditions.
    • Data Retrieval: When retrieving data, the data types from the database might need to be cast into appropriate types for use in the application.
    • Integration with .NET API: The .NET API interacts with the SQL database by calling stored procedures and passing parameters.
    • Connection String: A connection string is used to establish a connection to the database.
    • SQL Connection: The .NET code uses SQL connection objects to connect to the database and execute commands.
    • Data Transfer Objects (DTOs): DTOs are used to transfer data between the API and the database.
    • Key Considerations:
    • Security: Using stored procedures helps prevent SQL injection attacks and improves database security.
    • Performance: Stored procedures can improve performance by reducing network traffic and pre-compiling SQL queries.
    • Data Integrity: Constraints and data types enforced at the database level help ensure data integrity.

    Data Binding with ngModel Directive

    Based on the provided source, data binding is a technique used to link input fields in an HTML template to variables in a component’s TypeScript (TS) file. Two-way binding, in particular, allows changes in the input fields to update the corresponding variables in the TS file, and vice versa.

    Here’s how data binding is implemented and used, according to the sources:

    • Two-Way Binding Implementation Implement two-way binding by binding input fields to variables that are accessed on the component’s .ts file.
    • Variables: Create variables in the component’s .ts file to hold the data for the input fields. These variables can be part of a data structure or a model.
    • ngModel Directive: Use the ngModel directive in the HTML template to bind the input fields to the variables in the .ts file.
    • Name Attribute: It is also useful to give the input field a name attribute.
    • For example:
    • <input type=”text” [(ngModel)]=”inventoryData.productID” name=”productID”>
    • In this example, the input field is bound to the productID property of the inventoryData object in the component’s .ts file. Any changes made in the input field will automatically update the inventoryData.productID property, and any changes to inventoryData.productID in the .ts file will be reflected in the input field.
    • Displaying Bound Data: To display the data that is bound to the input fields, you can use string interpolation or other data binding techniques in the HTML template. For example:
    • <p>Product ID: {{ inventoryData.productID }}</p>
    • This will display the value of the productID property in the HTML. The stringify function converts the JavaScript value to a JSON string.
    • Considerations
    • For the data binding to work, the FormsModule must be imported into the component.
    • When binding to properties from the back end with different naming conventions (e.g., camel case vs. capital letters), ensure the property names match exactly in the HTML template to avoid binding issues.
    • Benefits
    • Data binding simplifies the process of synchronizing data between the UI and the application logic.
    • Two-way binding makes it easy to capture user input and update the UI in real-time.

    Angular, .NET, and SQL: Implementing CRUD Operations

    CRUD operations refer to the four basic functions of persistent storage: Create, Read, Update, and Delete. The sources describe the implementation of these operations in the context of building a full-stack Angular and .NET Core application.

    General Workflow

    1. Front-End (Angular):
    • The user interacts with the UI (e.g., filling out a form, clicking a button).
    • Data is bound to the UI elements using two-way data binding (ngModel).
    • An HTTP request is sent to the back-end API.
    1. Back-End (.NET API):
    • The API receives the request and processes it.
    • The API interacts with the SQL database.
    • A stored procedure is executed to perform the requested operation.
    • The API sends a response back to the front end.
    1. Database (SQL Server):
    • The database receives the request from the API.
    • The appropriate CRUD operation is performed on the data.
    • The database returns a result to the API.

    Operation Details

    • Create (Save)Angular: A form is used to collect user input. On form submission, the data is sent to the .NET API via an HTTP POST request.
    • .NET API: The API receives the data, creates a SQL connection, and executes a stored procedure (e.g., SP_SaveInventoryData, SP_SaveCustomerDetails) to insert the data into the database. A model is used to map the data to the stored procedure parameters.
    • SQL Database: The stored procedure inserts a new row into the specified table.
    • Once the data is saved, a success message may be displayed on the UI.
    • Read (Get)Angular: An HTTP GET request is sent to the .NET API to retrieve data. This is often done on component initialization (ngOnInit).
    • .NET API: The API receives the request, connects to the database, and executes a stored procedure (e.g., SP_GetInventoryData, SP_GetCustomerDetails) to retrieve the data. The data is mapped to a Data Transfer Object (DTO) and serialized into JSON format.
    • SQL Database: The stored procedure selects data from the specified table.
    • Angular: The JSON response is received and bound to the UI, displaying the data in a table or other format.
    • Update (Edit)Angular: A row is selected for editing, and its data is pre-populated in a form. An HTTP PUT request is sent to the .NET API with the updated data.
    • .NET API: The API receives the data, connects to the database, and executes a stored procedure (e.g., SP_UpdateInventoryData, SP_UpdateCustomerDetails) to update the data in the database. The stored procedure uses a WHERE clause to identify the record to update (e.g., based on ProductID or CustomerID).
    • SQL Database: The stored procedure updates the specified columns in the table for the matching record.
    • After a successful update, the table is refreshed to show the modified data.
    • DeleteAngular: A row is selected for deletion, and a confirmation dialog is displayed. Upon confirmation, an HTTP DELETE request is sent to the .NET API, including the ID of the record to be deleted.
    • .NET API: The API receives the request, connects to the database, and executes a stored procedure (e.g., SP_DeleteInventoryDetails, SP_DeleteCustomerDetails) to delete the data from the database. The stored procedure uses a WHERE clause to identify the record to delete.
    • SQL Database: The stored procedure deletes the specified record from the table.
    • After successful deletion, the table is refreshed to remove the deleted row.

    Additional Considerations

    • Stored Procedures: Using stored procedures is crucial for security, performance, and data integrity. They help prevent SQL injection attacks, improve performance by reducing network traffic, and enforce data constraints.
    • Real-Time Updates: After performing a CRUD operation, the UI is often refreshed to reflect the changes in real-time. This can be achieved by calling the “Read” operation after a “Create”, “Update”, or “Delete” operation.
    • User Experience: Displaying confirmation dialogs before deleting records and providing feedback messages (e.g., using alert boxes) enhances the user experience.
    • Data Validation: Validate data on both the front-end (Angular) and the back-end (.NET API) to ensure data integrity.
    • Error Handling: Implement proper error handling to catch and handle exceptions that may occur during the CRUD operations.
    • Asynchronous Operations: Use asynchronous operations (e.g., async/await) to prevent blocking the UI thread while waiting for the API to respond.

    By implementing these CRUD operations, the application enables users to manage data effectively through a user-friendly interface.

    Complete FULL STACK PROJECT | Angular | .Net Core | SQL

    The Original Text

    hi friends welcome back to our channel in this video I’m going to show you how to start building a full stack application for the front end we’re going to use angular 17 and for the back end it will be net code now first of all for building the angular application you have to make sure that you have prerequisites installed on your system so first of all you’ll have to go to this particular website nodejs.org and download the execut ible file for installing nodejs on your system so once you have it on your [Music] system you’ll open this file and as I already have node.js install this is the particular message that I’m getting but you’ll not get it you if you don’t have it already installed on your system you’ll have the setup completed and after the installation we’ll move on to the next prerequisite you’ll go to this particular website Visual Studio code and download for your particular OS the installer for visual studio code once you have this particular installation completed we are going to install angular on the system and how we do that we go to command prompt and give this particular command I press enter and the installation is in progress npm install Hy at angular CLI so this will make sure that the latest version of angular is installed on your system in this particular video when you are seeing it angular 17 is the version that we are using now this says is that 225 packages have been changed in 18 seconds so if you do not already have angular installed on your system it could have said the packages are added instead of changed once this is done what you can do is I already have a folder created over here in my system and in this particular folder I’m going to create the application that I require for my gadget shop and I’ll be giv giving it the name with this particular command NG new H gadget shop this is the application name that I am giving I press enter now what’s being asked is which style sheet format do I want to use and I want to use CSS so yeah basics of HTML and CSS You’ always want to know before beginning any angular project do I want to enable server side rering and static side generation no I do not want that so I press no the installation of the packages is in progress that means my application is being created great so I have got package installed successfully message over here that means the application generation has been completed and how I can make sure is that this is the folder that is created for my new application H gadget shop and what I’ll do is I’ll just change the directory over here that means that I have opened the command prompt inside that directory now I can give the command code dot this is the command that we use to open the visual studio code in from this particular application through command prompt now Visual Studio code should open and the application will be opened automatically in this so over here we have it the entire structure is ready for a newly generated application if you take a closer look over here you’ll see that SRC folder is there and this is the folder that you want to focus on right now we have index.html file so how you can understand is that this is the top tier file of HTML template whatever is rendered in the browser this it is being rendered at the top level from here and inside the body tag we have app hyphen root so this contains all of the HTML information that will be shown on the screen how do I make sure that what I am saying is correct I can show you over here by running the application so if I just open the terminal over here this is also open the power shell terminal we can see over here it says Powershell terminal so I’ll close this one and we can start working on this particular terminal which is inside the visual studio code it’s kind of easier to access I’ll run the application what I am doing is running the application through this particular command NG serve it is going to build the application first make sure that there are are no syntactical errors and then it’s going to give me a URL this is the Local Host URL that I have to open control and click the URL is opened Local Host colon 4200 and this is the entire information that is being rendered through index.html file but I see over here that I don’t have any tag related to what is being shown on the screen so how do I make sure that this is getting R rendered through app hyphen route through this particular app folder this is the component folder and this is the HTML structure related to it so if you’re already aware about the basics of angular it’s pretty easy for you but for the beginners who are going to learn angular this is the part where they can focus on and we see over here that this is the content that has to be replaced so whatever we are seeing on the screen is being rendered through tags from here from this particular app component and this is the TS file related to it so whatever typescript code we’ll be writing we’ll be writing over here in this file and whatever HTML tags we want to give whatever HTML template we need to provide to the application we’ll do it from here and the top tier file is index.html so I hope you have like understood the basic structure of the angular application and now to make sure that uh this is the correct file that we are referring to over here I’ll make a change I just type over here hello we already saw that in the browser we were seeing this your app is running I have written hello over here and saved it so after I saved it it says that page reload sent to client so this is a benefit of using NG serve command that it will automatically transfer whatever information you have changed whatever file you have changed to the browser you don’t have to reload the browser or rerun the application again and again after making your changes I’ll just open the browser again and over here we can see the changes so that has made sure that whatever information we are seeing over here it is being rendered through app. component right now but we want to make changes to this particular HTML template we don’t want it for our hoc gadget shop we’ll be using some bootstrap to enhance the look of the page so we’ll see how we are going to do that so first of all we’ll have to install bootstrap in angular application and how we’ll do that I press control+ C that will end the session that is running right now and now we have to type the command npm install bootstrap save by using this command I should be able to successfully integrate bootstrap in my application see this is the node modules folder and if I I expand it you’ll see that there is no folder named bootstrap I have to install it in the application to use it so that is why I have to run this command I press enter okay after running the query it says that bootstrap is integrated successfully into the application over here we can see that we have the folder now bootstrap in nodecore modules folder and we have to include the Javascript file and the CSS file of the bootstrap into the application how will we do that this is the angular.js file and over here if you scroll down you have online number 29 for me Styles mentioned and this is taking the styles from this particular file currently this is styles. CSS in the source folder so over here we have to mention the Styles as well as scripts and what will be the scripts it will include the JS file from the bootstrap folder this is the JS file that we want to include over here we’ll take the relative path nodecore modules bootstrap then we have this folder jsst strap. bundle. min.js so we have mentioned the JS file in the scripts so this is kind of global setting that we are making in the application and we’ll also mention the CSS file this is not JS it will be CSS strap do men. CSS in the CSS folder you can see this file over here so I have included in angular.js and now what we’ll do is we’ll make some changes in app. component. HTML we want to remove everything that is currently existing and I’ll just type this keyword hello over here to make sure that new changes are getting reflected I’ll do an NG serve to run the application again okay great so the application is running now and we see that all of the content that we were seeing earlier on the browser we’re not seeing now we are only seeing hello keyword that we have typed in app. component. HTML so what I want to do is I already have some HTML template for the landing page first of all what we’ll do we’ll add the header for our landing page so the template that I already have for the header I have referenced it from bootstrap website only so this is the one that we’ll be using and I’ll make the changes I’ll save it it says page reload send to C great I don’t have to reun the application again and again and you see over here that the new nav bar is coming this is the header that we want to have in our application what we can do is make some changes on this this is hoc gadget shop so I have replaced that nav bar with hoc gadget shop we’ll keep home as it is and then we could have admin over here this could be inventory and this could be customers I’ll just save it and we can check it also yes these are getting reflected inventory customers so you see this all of the changes are getting real time updated on our application whenever we are changing it and what I want to do is I can remove link and disabled as of now from the HTML template because right now we don’t need those I have kept the drop- down option available this is to make sure that you also understand how we use dropdowns in the Navar and saving this so I come over here we have inventory and customers so these are the options that we are primarily going to change so I hope you have all understood how we have integrated bootstrap successfully into the application and utilize the CSS over here for our landing page now what we’ll do is We’ll add a component for inventory so for generating the new inventory component what we’ll have have to do is we’ll go to the solution and stop the running process with control c as I told before we’ll give the command NG generate component app components slash inventory so why I have given Here app component SL inventory because what I want the command to do is create a new folder over here because I want to keep the newly created components in a different folder so here you can see that app components is generated through this command NG generate component app component / inventory and if you open this folder inventory is the new component over here we can see that it is created now we’ll start working in HTML file of the inventory component now in this inventory. component. HTML file we’ll be giving the HTML structure of the inventory component and how do we travel to this component because we know that currently app. component is working the HTML structure is showing whatever we have given over there but what I want to do is that on this particular item click the drop down that we have on the browser we want to go to this particular component inventory component right so how do we travel to this page and for that what we’ll be doing is we’ll be using routing so this is a very important concept for you to learn if you are not aware about routing concept already so please Focus here for the routing to work over here what we have to do is make this app. component. HT HTML like the main template of our solution so you have to understand that is app. component is like the main page which will be shown on all of the pages it will be uniform throughout the pages in the application and after our Navar we are going to write router hyphen Outlet so what we want the solution to do is whenever we are changing the pages after the nav bar whatever we are giving whatever HTML template we want to give it should be displayed through this particular router what I mean to say is this is the nav bar and we have multiple options we have inventory we have customers and then we have home so if I click on any of these whatever template I want to show it should be displayed through this router outlet and the router Outlet will be displaying whatever HTML structure I am providing in the particular component that is inventory component over here because this is the one that we are focusing on right now so what we want to do is when we click on this particular item it should show whatever content is being mentioned in this file in this inventory. component. HTML now for that what we have to do is we can remove the href from here we’ll make mention router link and then we have to give the path so over here I have mentioned the path but now what I have to do is Define this particular path and how do I do that we have to open ab. routes. TS so in the routes array we’ll be giving the path mention the path name this is the path that we gave in the HTML structure and then mention the component so we are going to use inventory component so the intellisense is working and that is why I am going to automatically use it and this has automatically imported it because the visual studio intelligence is working over here if it is not working for you then you have to import it manually through the given particular file structure that you have and then use it over here so now I’m saving it and then I am saving this particular file also app. component. HTML I have given the router link over here and I have mentioned router Outlet in this particular file only so what it should display is it should display whatever is mentioned in this particular file so let’s check if it is working or not we have have to run the solution for that okay so the solution is running but before we go to the browser through this particular URL what we have to do is we also have to include the component the inventory component and the router module in app. component.ts file so we’ll open that and over here we’ll write router modules that we are importing from angular / router and import it over here also and then we are also importing inventory component so the intellisense works and the import is all already mentioned here automatically if it is not working for you you can mention it manually also as I have said before I’ll just save this after saving we going to this particular URL opened I go to inventory great so the page says inventory works that was already mentioned in the HTML template of inventory component and we have come to this particular part that we gave in the routes file this is happening after we are clicking over here on the inventory option so I hope all of you have understood how routing is working in angular through routing we can go to any particular page we are not reloading the page as it’s the benefit of using angular and we are traversing from one page to another keeping our main template intact because you see over here that we have the nav bar and we are keeping our main template intact and still traversing from one page to another the main template is mentioned in app. component. HTML so after this what we can do is we can give the correct HTML template over here that we want want to have on this particular page so I’ll just paste that particular HTML template over here remove this inventory works and I’ll be saving this HTML file go to the browser on the link so this is the basic HTML template that I have used because um the whole project series is regarding angular and then at the back end we using net core and SQL for the apis we don’t want to focus on bootstrap as much but I do want all of you to understand how it works and how you can use the bootstrap integrated into your angular application you can use this HTML template if you’re building a basic project and even if you want to enhance the project you can use the advanced learnings of some CSS so over here we have the submit form for the inventory details that we want to capture and you see over here that I have already mentioned some static data in the HTML template in the inventory details table we are going to populate this particular data dynamically I’m going to do it with you through the series and all of this data will be shown dynamically and not directly from the static HTML template so that we are going to do but firstly what we are going to see is how to add a new row how to submit this particular form for the inventory details and add this Row in our table that is the first operation of the cred operations is to create create a row in the table so this is what we’ll be seeing now because if I am just choosing over here I’m mentioning something and this is the product ID that I have or I could use any particular product from the dropdown I want to save it so if I click on it currently nothing is going to happen that we have to build the functionality we have to build along the way and also what I want to do is add a row for this particular form where I’m mentioning the details of the text box also because currently we are seeing the text box but we are not seeing what it stands for so that is one thing that I’m going to add in the HTML template and then start with the create operation for this form okay so let me just go to inventory. component. HTML file and over here I’ll add the code for a row which will just describe the fields that we have on the form and I’ll save it and come back to the browser over here we are seeing the labels also for the fields and now what I want is an action being performed when I click on this submit button after filling some values so for that we have to consider this form and take all of this data and submit the form once the data is filled for that what we’ll do is I’ll have a function created on the TS file okay so here is the TS file related to this component now what I am saying is on submit this should be a function return type is void I don’t want anything to be returned and I just want to alert form submitted successfully and now how will the form know that this function is to be called when I am clicking on the button from the HTML file that link has to be done right so for that what we’ll do is just have an NG submit attribute and call the function so this is the function and currently we are not validating any field so I’ll just add a no validate attribute also saving it one more thing that we have to do is add an import of the form module so what we are saying is we are asking the this particular comp component to import the forms module and perform the submit action we are having it in Imports saving the file going to the browser now and now I click on submit so just an alert should come and here it is it says form submitted successfully that is the message that we had given over here if you see form submitted successfully right so So currently we are not showing any data which was entered in the fields we are not just considering any data that was entered in the fields I just wanted you to show how the form submission would work and now step by step we’ll go we’ll take the fields also into consideration so after that what we are doing okay so now for that what I’m going to do is bind this input field that we have not just one but all of the input fields to the variables that we’ll access on the component.ts file so you might already be aware about two-way binding and that is what we are going to perform here for that so I’m just giving the data structure just giving it some default values we have product ID product name available quantity it could be zero and then we have reorder Point okay so the values with colons instead of equal to okay so saving it and I want to take these variables for the two-way binding so this is the first input field that we have and I want to provide the model for it inventory data dot product ID and also give the name that is product ID I think this should suffice the next one we have is product name just confirming the name from the DS file yes it’s product name so using the third variable here available quantity and then there is the last one that is reorder point so just I’ll copy the name from here and use it okay so saving it now what I want is I just want to show this data also okay what I’m saying is that this is a Json right all of the data would be available in Json format for that I just want to use the stringify function and pass the in inventory data in it saving it so we’ll go to the browser over here what I’m doing is giving some random values I click on submit and you see that the alert box is successfully showing all of the fields that are present with their correct values in the Json format so this part is successfully achieved after this we’ll move on to the backend code for the API and create a link to submit the form and send it to the web API request okay so for that I have opened visual studio and we are going to select asp.net code web API clicking on next over here and what I am doing is in this particular location where we had created our angular project I’m going to create the web API project as well I want to give it a name so it could be it you see gadget shop API next and now what I want to do is just keep the selected options as such and click on create so it’s created the project for me and opened the solution over here we see the files also that are created under the controller folder this is the default controllers file that is coming what we are going to do we are going to add our controller file and that is going to be an API controller so I’m right clicking on the controllers folder click on ADD and add a controller over here so from the template we are going to select the API and we are going to select empty API controller over here I click on ADD and now I should be giving the name of the controller as well so over here it would be inventory controller clicking on ADD so here I have the f created and what we want to do is add a function over here what that function will do it is going to be an function which will accept the request from the angular project so so from the front end the request will come with the inventory details and this particular function will process it so it should be returning the action result it’s going to be index or let’s have it save inventory data that could be the name of the function and this is going to be an HTTP post request so I’ll give the attribute as such and let me just add a return statement otherwise the syntax tax error would keep on coming over here it’s going to return okay inventory details saved successfully so the basic purpose of this particular function is to process the web request that is coming from the angular application now for that we’ll have to create a SQL Connection in this function because the meth method is going to save the inventory data into the database table through a store procedure for that we’ll be creating a SQL connection over here we are getting a syntax error I’ll just import the required libraries it’s going to install the new package automatically Al and now in the SQL connection we can also mention the connection string here only while initializing so I’m giving the connection string over here and I already have the value I can give it over here okay so after this what we want to do is just log to our DB why I’m doing that before is because I want to be ready with my table and St procedure before I make a call from the net to the database now over here we have our databases and gadget shop database is the one that we are going to use for our application let’s create a new table in this database I’m just opening a new query and we’ll give the command create table it could be inventory table with all the fields that we are sending from the front end product ID giving it integer data type then we can have product name and it could obviously be a v care the reason is that we’ll be passing some string value from the front end although we are passing it as a drop down index but yes we should be saving it as a string only because the name Will justify the data type over here and the reorder point would be int again and we also had available quantity okay so these are going to be the fields and we can also have an index kind of field which will be identity integer of course and it could be a primary key so that it is never null and always having a unique value we’ll create the table and now we can also create a stored procedure what we want to do is we want to save the data and we can give it exact name that we have over here for the functionality Spore save inventory data so this is the kind of nomenclature that I follow to define a stored procedure and what parameters it will take is it’s going to take these parameters and we know that when we are passing a parameter or it is a variable we are defining a variable in the store procedure we’ll be giving it at theate prefix and then giving the name so this is the parameter that we are using basically four parameters these are the parameters that we are using and what we want to do is just take the parameters and save it in the table going to give the column names that we are saving into so we have written a insert statement mention the column names and then we’ll give the values that we are inserting what are the values over here we have these parameters so it’s going to insert a single Row in the table because we are not inserting multiple rows at the same time we’re just passing single rows and entering over here in the table and I think I have to add a parenthesis yes so it should work fine so you might be seeing that there is a syntax error that it it is giving it says that invalid object name the reason is because currently the database is not refreshed I have just created the table and then open the window for the SQL Editor to create a stored procedure so the database is not refreshed as such if I open it again it would work fine if I make a connection to the DB server again but uh I’ll just execute this particular store procedure if it says successfully completed or executed the command it’s working all fine and there is no error in the store procedure otherwise we’ll see if you’re getting an error so I’ll just execute it okay so command completed successfully that means the store procedure is created successfully and all is working fine there is no error as such so you could also face this issue that you have already created the table but it’s still saying the object is invalid in the DB but not to worry it might take some time it doesn’t take some time to refresh you have to make the connection again and then it works properly fine if the next time you’re opening the SQL Server management Studio again it will not show any error as such okay so let me just go to the net code and over here we have already defined the connection string now the next step is to define the SQL command we’ve already mentioned the SQL connection now we’re going to give the SQL command and how we are going to do it we have have to mention the command text it’s going to be the store procedure name then we have to give the command type command type dot store procedure and give the SQL connection as well connection here and then take the variable that we have already defined now for the command text we’ll have to give the stor procedure name exact store procedure name that we have over here and mention it in the string so now this is completed after this what we’ll be doing is opening the connection so let me just open the connection let me execute the command now it’s going to be an execute non-query because we are not performing any retrieval from the database right now we are just inserting the data and we also have to make sure that we are closing the connection because that is very important it’s connection do close now this is all fine we have mentioned the command text we have opened the connection and we have executed our store procedure and we have closed the connection but the main task was to pass the request to the database now for taking the request we’ll have to create a model that is passed to this function from the front end like we had created a data structure in the Json format on the angular application the same way we’ll have to define a class in this application also that will accept the request through this function so let me just add a folder in the solution and we’ll give it the name models over here I can add a class it is going to be inventory okay it just got loaded inventory request D2 so this is what we are giving the name to our model now we have this class and we can give the properties over here what parameters will be passed through this class first is going to be product ID second is going to be product name so it’s the benefit of having intelligence because it’s giving you suggestions as well now we don’t have product type but we have available quantity so that is the name that we are giving and then we have reorder Point okay so let me just mention this last property and now we can use this class I’m saving it we can do a save all and use this class in the request okay so for this also we have to import this library now we have to pass it to the SP as well so let me just add it to the parameters of the SP now here in this function we have to mention the name of the parameter and then give the value that we are trying to pass so first is the product ID this is the value that we are trying to pass to this parameter and then we have three other parameters as well product name available quantity and reorder point so the reorder point was this yes just confirming that the names are matching exactly with our store procedure because if they don’t match will’ll get an error while executing the function so let me just save all of it now we’ll just run the application to check if the web API is working fine I’ll just close this dialogue box because currently we’re not setting up slsl certificate we just want it to run on the local okay so the application is running now and over here we see the web API that we have created this was the default that was given by the net code and now we have this function as well this is a post method and over here it’s taking these parameters we can try it out also what I’ll do is I’ll just give some random values over here and and give values for all the parameters and click on execute 200 is the success Response Code and that is what we are getting the body has the custom message that we had given in the function so it’s all working absolutely fine by executing it we can see that but how can we make sure that it is working absolutely fine we have to go to this particular table and check if the data that we had passed from the request is it saved or not in the table because that is the whole motive over here to save the data into this particular table through the store procedure by making the connection from net core I on execute and over here we see one row is successfully inserted into the table so we have made the connection from the net code to the database now the next step is to make the connection from the angular application to the net code web API now in the angular code in inventory. component.ts where we are submitting the request we have to pose the data to the net code web API for that what we’ll do is just import HTTP client and we can have an HTTP client injection in this class now this we’ll be using to post the data for that what are the requirements ments to post the data we want to do it on the submission correct so here on submit function in this we’ll use this HTTP client post the data and we have to pass the URL so we’ll have to define the URL over here as well it’s going to be the API URL let me just Define it in the function itself for now and we’ll give it the value the value that we’ll be taking from and paste it in Visual Studio code okay so we have given the value now yeah it’s going to be let API URL then we have to mention the body correct so for the body we have to pass the inventory data so after passing the body of the request this is not mandatory but we can give the HTTP options as well so if you have some specific header values you can include it here okay let me just Define the HTTP options close this let HTTP options so let me give the headers here and and it’s going to be new HTTP headers and then we can give the UN content type as well value could be application SL Json because this is what we are going to pass I’ll save it and over here what we are going to do is once we have the post method ready we have to subscribe to it this is how it works we’re going to subscribe to this post method because this is not going to execute simultaneously as we are hitting on submit function we don’t want to wait for it to execute this is an asyn call that we are making and in the Subscribe we give the actions that we want to perform what should happen if there is an error the main task is to Define what should happen once it is completed so once it is completed we want to give the function here and this alert should only happen once the post request is completed so we’ll move it here now we have defined the post call in our angular code let me just save it and we have to make a change in the app.config as well over here we have to provide HTTP client this function we are defining in the providers and saving so what I’m going to do is just put a debugger over here okay to check if the call is being made to the backend API or not what request is coming but uh before that there is one more change that we’ll have to make if you focus over here on the URL you’ll see that this is Local Host colon 7019 is the port that we are using and in the angular application that is running it is 4200 so we’ll have to make changes for this net core web API to handle the course request now for that I’ll just stop the execution once and open the program.cs file to make the change as we see over here we already have some methods being called for the application so we also have to make the changes for different servers request that are being passed to this web API so I’ll begin making the changes Define a variable and give it a value underscore same name I’m taking over here for the text value and then we have the Builder right various functions of services are getting called and over here only we’ll have to call the add course function also so I’ll make the changes for that options do art policy give the name that will be the variable that we have just created and then for the Builder we have Builder with Origins okay so over here we are going to give the URL of the Local Host that we have so I’ll give the Local Host 4200 we had okay just switching on my numlock and then we can have I think one without the server name also this way we have defined we want to allow any header we want to allow any method and set is origin allowed some domains so after this one more thing we have to do before authorization we have to call another function that is use course and over here also we’ll be passing the variable that we have created now I’ll just save this over here we had the syntax error after this what we’ll do is we’ll run the application okay but one more thing as of now we don’t want to return any message from here we are just returning the okay status so I’ll run the solution now why I have removed the message I’ll be picking it up later in the lecture that we will be sending a Json rather than a string value from here so that is one thing that I’ll be explaining to you but as of now let’s just try to save the inventory details I’ll go to the application once go to the inventory page now I’m putting some random values okay I will select mobiles from here that is another product rather than airphones because that is the one that we had already entered now I click on submit what it should do is exactly it should come over here on the debugger because now you see that we have made the connection from the front end to the net application to the net core web API and when we are clicking on submit on the angular application it is pointing to this debugger now the flow is created and now I click on continue so you see we have got the alert that this particular form has been submitted successfully that means that once the post post call is completed that we had subscribed to the alert is showing this is what we were trying to do and it is successfully achieved so I click on okay how to confirm that our data is getting saved successfully we’ll need to go to the database and over here I’ll just execute it okay so the select statement is executed and we see over here that the data is successfully saved and over here for the product name you see that we are passing the index from the dropdown so that is what is getting saved but the row is successfully created from the application so we are able to achieve the save flow from the angular to the net core web API into the database after this what we are going to do is we’ll be getting the data from the database on our angular application because what you see over here in the table this is some static data correct this is not the data that we have saved in the table so we need to bring the data from the database to the application on the front end in angular so that is the get flow that is what we are going to achieve now but now what we’ll be doing is we’ll be creating the flow from the database to the angular to get the data that is in the inventory table to this table on the screen so let’s go to the database first over here we see that this is the table that we are trying to fetch the details for let me just create a procedure and the name will be SP get inventory data so after this in this sto procedures we simply have to add this select query because we are only trying to get the data from this table and there is no other query that is required over here I’ll click on execute it says commands completed successfully that the SP is now successfully created in this database we can also check it by going onto store procedures we have another SP over here that is SP Spore get inventory data now we’ll utilize it in the net code so let’s go to the visual studio and we can just take this code and create another API that is going to be a get API earlier one was HTTP post and this is HTTP get now what we are trying to do is get inventory data and we don’t want any parameters because we’ll be selecting the data from the table without any filter also so let me just zoom it a bit and after this we’ll also have to change the name of the SP we’ll take the name over here and use it on this command text we can remove all of this code because we are not trying to pass any parameter to the store procedure now what we are trying to do is we are trying to run the command that is going to be execute Reader Command and what we can do is we could have a SQL data reader give it a name and using this SQL data reader we will be creating a object so basically what we want to do is create a list of inventory dto from this SQL data reader and return it as as a result of this action so let me create another model because to avoid any confusion in the name I am creating another model even though the same properties would be there in the new model as well that were there in inventory request dto so the name can be inventory dto I’m saving it take the properties from inventory request Dil only and just have it in this So to avoid the confusion we have created a separate model because if we want to add some properties in the class in one of the classes we could do it in future without modifying another class so let me save this and use it over here we are trying to create a list of inventory D and when we are traversing through this particular loop I want to create a object as well that will be added to this list let let me just give it the name response and this would be added over here so what we are trying to do is inventory D do product ID is equal to convert dot to reader that is SQL data reader and the name of the column would be from here so if we just run this query we see over here that product ID is returned and product name available quantity reorder point so what we want to do is give the name of this column in our code because that is what this code is going to read the the whole table would be red in the code and we want to just take the name of particular column in single line so right now I’m binding the product ID so I will give it the name for each of the columns we have different name so I’ll give a separate name over here product name then over here this would be two string because product name is a string variable if you see over here the data type is string then we have available quantity and then we have reorder Point let me just match it reorder point and available quantity so yes you have to make sure that the names of the columns that you are giving through the reader they have to be the exact names that are there in the SQL because otherwise the data would not be mapped from the SQL data table to your object otherwise it would create an issue but if you give the exact name then it’s not going to be an issue and the code is going to work perfectly fine so over here you see that intelligence is giving me the suggestion as well that we want to add the dto into this list so we’ll just take the suggestion and we have successfully added our D to the response and now what we want to do is we simply want to use a Json convert serialize object and what we want to serialize is the response so it will send send the response that is the list of inventory dto as a Json now and let me just save it do a save all we’ll run the code again we can build it also once we’re trying to rebuild the solution so that there is no error in the code that could be checked from here if the rebuild is completed successfully it says one succeeded because we have only one project in the solution so it means that our code is going to work fine syntactically logically it’s going to work fine or not we will be able to know once we execute it so let me now just execute the solution and the API the get API that we added we are seeing over here on the screen as well it’s added over here and to verify if the db2 net core web API all that connection is working fine or not we just do a try it out from here and execute the API it’s not going to take any parameters so you see that parameters is also mentioned over here that we are not taking any parameters and now we see the response body this is a proper Json that we are getting we have two products one was the product name earphones and then second one was the drop down index that was added from the front end code from the angular application both of those records that were there in the table we see over here those are coming on the screen successfully over here in the response body so this connection is successfully established now we just need to add the code in the angular to get it on the front end application so we’ll go to the visual studio code and in this particular component inventory. component.ts what we have to do is whenever the component is initialized on that point only we are trying to fetch the details so over here we have to add a new function that is going to be NG on in it once the function is getting initialized what we want to do is we want to get some data right from the back end so we already have the URL of inventory API we can just make sure that the get API also has the same request URL so over here we see that the request URL Remains the Same if we are not not passing any parameters it would be a get call so let’s make the changes for get call using this HTTP client that we had used earlier and this is going to be a get call we don’t need to pass any headers or the body we just have to mention the URL and over here we’ll only need to have the response of this get method so what we can do is just have the data and create a function around it this data is going to be the response of this get method let me just add a new object that is going to be inventory details of type any now whatever is being returned from the API from the get API I want to just map it to this new object that I have created over here once this is done what I want to do is utilize it on the HTML page but before that let me just do one thing I’ll do a console log of whatever is being returned from the API on the web angular application let me just save the details so let’s go to the browser and just open the console window we’ll do a refresh of the page and now we see over here that we are getting the response from the database on the web application as well so whatever is being passed from the get method on the web API that we have created over here we are successfully getting it on the front end as well so that connection is successfully established now the main task is to just bind it to the table because currently this is all static data we just need to bind it to this particular table and then we’ll be getting the rows dynamically from the database and not from the static HTML template so we’ll go to the inventory. component. HTML file and over here I’ll just take this row and create a dynamic row over here then we’ll comment the static data that we have first let us just create the static row we have to use an ng4 for this let inventory of invent details so what is inventory details it’s this particular object that we have created on the component.ts file I’ll just paste it so that the exact name is matching over here and in this inventory details for each of the object we have to bind the property name the name that we are getting over over here so this is the product ID so we’ll just map it and after this product ID we can bind the product name as well and then we have available quantity with a capital A so you have to make sure that what whatever property name you are giving it is exactly matching with whatever data you are getting over here because this is the response from the web API and it is there in your inventory details object so let’s go to this HTML file and yes we have made the changes binded whatever dynamically we are getting on the row and how many rows will come the number of rows that are there in the inventory details the number of objects that are there those many rows will come in the table of this tag that we have so let’s save the changes but yes first let’s just comment out whatever static data that we had given earlier commenting it out and now saving the changes let’s go to the page again it should be reloaded yes because just after saving the changes we are able to see that the reload is done and over here we are getting whatever we had saved in the table that is successfully being fetched on the screen as well this is all Dynamic data this is not the data that we had given in the HTML template it’s all dynamically being fetched from the table to the angular application on the screen so we have successfully established the get flow as well I hope you have understood what whatever we have covered till now now what we’ll do is we’ll create the functionality of this delete button what if we want to delete a particular record from this inventory details table for that we’ll have to write what action we want to perform on this delete button it has to take the selected record and delete it from the table so let’s get to the flow of creating the delete functionality we’ll go to the database and and over here we’ll be writing the store procedure First We are following the same approach that we did for the get flow it is going to be delete inventory details and over here if we just check that in the table inventory table what key should we take to delete the record product ID would be the appropriate candidate to delete the record from this particular table that we are sending from the front end to back end what we’ll do over here is just take this query and it’s not going to be a select query it’s going to be a delete query we want to delete the record from inventory table where product ID this is going to be a filter so we should Define the filter as well the stored procedure is going to accept a parameter product ID of data type in and wherever the SQL is able to find a record according to this filter where product ID is equal to at theate product ID that is being passed to the store procedure that records should be deleted so this is the store procedure that we are creating for deleting the record let me just execute this it says commands completed successfully that means that the store procedure is created over here if I just refresh it you’ll be able to see a third store procedure for deleting the inventory details so now we’ll use this store procedure in our net core web API I’ll just stop the exec ution and take the help of the existing method to create a new method now added an extra curly braces now this is going to be a HTTP delete method and we don’t want to get inventory details we are simply trying going to delete them the stor procedure name we’ll be copying from here and we want to pass the parameter now to this sto procedure so we’ll just remove all of this reading code that we earlier had for get inventory details we don’t want to send any response just send the okay code as of now and the command would be execute non-query because we are not trying to read any data but before executing the command we have to pass the parameters it’s going to be the similar approach that we had followed while saving the data the name is going to be atate product ID this is the exact name that the SP is taking as a parameter and what should we pass over here this particular web API method would be accepting a product ID so this is what will be passed to the store procedure as well we’ll just save the changes and now I’ll execute the solution so closing the dialog box and we have to see if our new web API method is added or not while the solution is running so over here it is added this is a delete API that we have created and we can try it out also we have to pass the product ID over here so if we just check what data is existing currently in the table that we also can can see over here that these are the records in the table right so we want to delete let’s say the second one we can create it also from the front end so no problem as such I’ll just pass this particular value and click on execute so over here we are getting a code 200 that means the API is successfully executed and if we see now in the table so we are a able to see that the second record is deleted so that means that our API is working perfectly and now what we have to do is just add the flow from the front end to the net code web API so that we are able to delete the record from this particular page if I just refresh this now it should have only one record so yes it is having one record only and what we’ll do is just add some value over here so that we are able to see multiple records instead of only one so let me just click on submit it says form submitted successfully I click on okay and check the table if the data is inserted or not so yes we are able to see the record inserted over here and what you’ll be able to see is that the ID column is having value three the second row was deleted right and and after that as we have set it as an identity column the next row that would be inserted would take the sequential key only so the identity would just increment it by one and set it to three it will not be two so that is one thing I hope you all are aware about for the identity columns when they are set in the SQL and over here we are able to see that we have successfully inserted the record but what is not happening is that this table is not getting refreshed so this is a very good point that we have found while inserting the record again in the table that the table is not refreshing on the go when we are clicking on submit for that what we need to do is this is the NG on init function right this is getting the records from the table this is the second flow that we had created now when I submit the record in the table from the screen what I want is the table should also get refreshed on the go so for that we’ll take this method and once the post is completed once we have the alert we also want to refresh the page so that any new data that is inserted in the table it should reflect on the screen as well so now if we go over here and I just add add some random values again I click on submit okay and you see that the table is getting refreshed now because we have called the NG on init function and as it is only fetching the records as of now we are able to call it successfully otherwise we should have called another function which was just getting the details so so what we can do over here is for the reusability approach we’ll just add another function inventory details and this function will Define it over here because NG on in it was only calling this code that is why we could easily use it in the onsubmit functionality as well but if you have more code in the NG on in it if there are some more functionalities that you are performing over here then it is a better approach to have a separate function for getting the inventory details that you could use in NG onate and then this function can be called on submit as well over here so now I’ll just save the change over here and we’ll go to the screen make some changes that is add a new record and click on submit so everything seems to be perfectly working fine over here we are able to get the records while we are submitting we are able to get the latest table details as well after this now what we have to do is add the delete flow for that what we’ll be doing over here so this is an interesting point now we have already included bootstrap in our application and now we’ll be using NG bootstrap for creating a dialog box that would be displayed once the user clicks on delete button so when the user clicks on delete button what should happen is a message should be displayed over here on the screen so as to have a confirmation from the user that they really want to delete the record because it is a possibility that they clicked on it by mistake and we don’t want to just directly delete the record we want to confirm from the user that they are trying to delete this particular record okay so let us now add the functionality of this button what do we want to do in the HTML file on the delete button click we want to call a function whenever this function is clicked we want to open confirm dialogue this function can be called and over here we’ll Define the function so which dialog box will it open now for that what we’ll do is will create a new component and in the dialogue box we are going to utilize the model feature of NG bootstrap so let us just stop the execution for now and create a new component that can be named dialog box because I want to keep the HTML structure separate from this particular component that is why we are creating a dialog box component that could be utilized at multiple places according to the HTML structure that we have okay the HTML structure would go over here but we have to install the NG bootstrap as well so let us install NG bootstrap for this and this is the particular command that we are going to use so over here we are doing a save with Legacy Pi depths it is because we want to eliminate any risk of conflicts that might arise due to peer dependency that NG bootstrap would have and also we require another installation for this that’s that is for poer JS / code after we have installed both of them we’ll be using the HTML template that I have already reference from the NG bootstrap documentation over here in this component. HTML file and save it now we want to open it from here is that correct from open conf firm dialogue so for that we’ll be using model service we are going to inject model service using the NG B model and you see that the import is automatically done through intellisense I’ll save the change changes we are going to utilize this model service to open the model and in the function we have to pass the content that we want to open so this is going to be the dialog box component and once we just mention it over here it is automatically imported as well I’ll save the changes over here and in the dialog box component what we have to do is import NGB active model from NG bootstrap and then there is a function that we have called while we are clicking on okay so let me just give it the name confirm over here and this function we’ll just mention here over here we don’t need to perform any function but we do need to close the model so for that what we’ll be doing is on the confirm button on the confirm function that is being called from here we’ll just have a model do close so rather than directly closing it over here like this we are closing it in the TS file because later on we might need to add some functionality in this particular function so let’s just go to the browser now and over here we click on delete so you see the model is opening we have a dialogue box for the confirmation of deletion so this is working perfectly fine now how do we delete the data that is selected it’s not selected through a checkbox we are directly clicking on the button of a particular record so how do I make sure that that particular record is deleted so we’ll need to pass the product ID correct the product ID that we have over here we’ll need to pass it okay and we see that this product ID is the product name so let us just change that also in the inventory. component. HTML the product ID is the hash that we have over here and this is the product name so you see as you keep on making the changes you’re able to see what Perfections you can make to your application right so this is the product ID that we need to pass to theet code web API when I click on delete this particular record should be passed to the web API this 21 31 and how do I do that I come over here we already have the inventory. product ID right that is being displayed so we can simply take this and pass it on this function why I’m doing this because when we are opening the confirmation dialogue box a record should be made there should be a record of which product ID I had clicked on which product ID’s delete button I had clicked on so for that I’m passing the inventory. product ID as well from here and I have to capture it I have to capture it okay it’s not going to be int like in the net it’s going to be like this and this product ID can be maintained over here let me give it product ID to delete so as to clarify what is the use of this particular variable now as soon as I’m passing the product ID from the HTML page to this function I want to keep a record of it so when I’m opening the model ser service what should happen is the product ID to delete should be set as product ID that is passed from the HTML page I click on Save now once I have saved it let us just confirm if it is working all fine once I have made a record of which product ID is getting deleted so we can do a console log as well I come over here do an inspect go to console window I click on delete so you see we are getting the product ID 2131 and this is going to be passed to the backend API to be deleted now it should only be deleted once I click on okay if I click on cancel it should not be deleted am I right so for that what we’ll do is we’ll make a record of what is being sent from this D dialog box so how do we do that in this particular dialog box once we are closing the dialog box and we are only closing it on the confirm function and the confirm function is only called on the click of okay when we are clicking on cancel model. dismiss that is directly called from here and we are not confirming anything so on the click on confirm only we are performing this particular function and from here we want to pass some result that result would be let’s say confirm this is the result that we are passing from the okay button click I’ll save it and we need to capture this result as well so how do we do that over here we have to capture the result once we have got the result I want to perform some action what kind of action am I performing I’m going to delete the data so how do I do that but the question is not how there should be a condition that once we have got the result we are checking also that the result is is what we had expected so if data that we are taking from the result this data is being passed from here if the data that we are getting in the result and the property of the data that is event as I have mentioned here its value is equal to confirm so only if the prop property of the event is equal to confirm then something should happen so let’s do a console log over here as well confirmed to delete else delete not required so I saved it and let click on okay so you see this is being sent from the dialogue box event is confirmed and that is how we are able to get into this condition and it says confirm to delete also now that we think of it it will never go to the else condition why because we are not sending any other data any other result from here so this this particular condition would not be even satisfied because we have only called confirm functionality from the okay button and that is sending this particular response so it’s never going to go into the lse statement so that is how we will be able to specify whatever we want to do when the event is equal to confirm so now over here what we want to do is we want to delete the inventory details so this particular function let’s mention over here what we want to do in the delete functionality and this is going to be a this. delete inventory in this particular function we want to have the API URL that we’ll take from here okay let me just give some value as well so that we are able to check the URL okay so this is the URL of the request and I want to pass it in the HTTP client delete function API URL and then we could also utilize the HTTP options or I think we can skip it for now because this is just a delete function we want to subscribe to It Whatever data we have got we just want to do some action once we get the result but before that over here you see that this is a static value that is being passed in the URL but we cannot pass the static value we have to pass whichever row was selected for the deletion from here so how do we do that we are already saving it in this particular variable so we we will utilize this variable and because this is under the class scope and not just the function scope we can directly use it over here so whatever value is set over here when this function is called delete inventory we’ll be able to utilize the value in this URL okay so now once we are able to delete the data successfully after that what we want to do is get the refresh table from the database so we had created a function earlier that is what we’ll be utilizing over here and this particular function will be called once the data is successfully deleted so let me just save the changes and uh we’ll go to the browser now okay I click on delete for this particular record it says are you sure you want to delete it I click on okay and you see that the event was confirmed the data is deleted new refresh data we have got from the database as well that means the delete functionality is also working fine if I just add something over here I click on submit this is it and now I click on delete this is also working fine if I click on submit let me just click on submit again add a new row click on delete and click on cancel so you see that nothing is happening if we click on cancel it’s only deleting once I click on okay so this is how we have integrated the dialogue model from NG bootstrap into our application created the delete functionality with end to end flow from angular to net Cod web API till SQL and all of the data is being refreshed On The Go only okay so the save functionality and delete we have just seen that it is working all fine and we are able to read the data from the table as well let’s move on to the next functionality that is edit so we are going to update the data in the table for that let’s first move on to our net code we we create another web API I’ll just stop the execution of the solution and take the help of the existing code to write another web API this is going to be a HTTP put API and we want to update inventory data over here we are going to use inventory request dto so in this particular web API we’re going to accept the whole inventory request dto we just don’t want the product ID or the product name we are going to accept whole of the inventory request dto and save whatever is coming from the front end to the backend now this particular SP we’re going to create in database for first over here we have the table inventory table and this is the data in the table right so to select which of the particular column we need to use to update the data we’ll follow the same process that we had done in the deletion of the record we are going to use the product ID column over this column only we’ll be putting the wear clause and then update the record in the table let’s create a new store procedure create procedure the name is going to be update inventory data so if this particular table is to be used just let me select something from this table over here these are the columns in this table so what I’ll do is give the parameters names as well because we are accepting a single row only just like we did in save inventory data this is going to be integer product name it’s going to be Vare according to our table these are the data types that we are using then it’s available quantity integer again and then we have reorder point that is another integer column after this what we are doing over here is we are not selecting anything in this St procedure we are trying to update the table and in this particular store procedure we want to set the columns with new values right that are coming as input from the net code to the store procedure we have product name then we have available quantity then we have reorder Point these are the three columns that we are updating and which particular record we want to update that condition we’ll give through we clause on this particular column we have already selected this column we use this column while deleting the record as well so we are updating the record through this column only in the wear condition wherever the product ID is matching with the one that is being sent from the net code to the store procedure in DB we want to update that particular record so it is possible that uh one of the column is only changed with the new value we are trying to update all of the columns in one go because even if one of the columns is not changing we are still updating it with the old value only this is the product ID column that we are using for the wear Clause okay let me execute and now over here it says commands completed successfully that means the stor procedure is successfully created in the database we could also check from here so now you are able to see that we have this fourth stor procedure update inventory data let me use the name I’ll just copy and paste it over here this is the sto procedure that we are trying to call now and now what parameters do we want to pass to this particular sto procedure we have are passing product ID but it is not from a single input we want to use the inventory request and that product ID we are trying to pass to the store procedure then we have another three columns the one is product name we’ll use product name over here from the inventory request then we have available quantity the exact name we’ll use over here then we have reorder point so these are the four parameters that we are passing to this store procedure and we’ll keep the command as execute non-query because we are not trying to read any data from the DB we are only trying to insert the record also you have to make sure that whatever parameter names you giving over here these have to exactly match with the ones that are mentioned in the sto procedure because otherwise it will give you an error while the execution is in progress so the parameter names should exactly match with the parameters name over here in the store procedure now after saving let’s execute this solution okay now you see that we already had three existing web apis we have another one that is for put we have created it to update the record in the table and we can try it out also from here now let me just check yeah this is the current table data okay we already have some record so I’ll take this product ID and update it and the product name is three available quantity is this so let me just take available quantity and the reorder point also from the table now what I want to do is I want to update the product name it is three as we have seen in the table I want to update it to mobiles I’ll click on execute and you see we are getting 200 success code we can also verify from the table now I’ll run this query again okay so I hope that you are able to see that we have successfully updated this particular column product name all of the other values we had also given as they were old only those are coming as such but the product name is updated with the new value mobiles which was three earlier so our net core web API is working absolutely fine now we need to concentrate on the front end angular part to make the connection and execute the update API okay so let’s just go to the browser first over here what should happen when we click on edit from the front end application when I’m clicking on edit for this particular row all of these fields should be populated with the row data so the product ID text box the name drop down quantity available reorder Point all of these should be populated with whichever rows edit button has been clicked on so let’s just go to the solution for that ones okay we have to go to visual studio code and over here we see that the form which is there and the text box the drop down the two other text boxes both of these are binded with this NG model inventory data that is there on our component.ts file over here we have already made The Binding from this particular structure on this form so what should happen when we click on edit over here on this edit button click this particular inventory data should be populated with the row that is selected and how it would happen we’ll just write the code for that now so over here we are defining a new function when I click on edit it should populate form for edit now in this function which particular data structure should be sent to the component.ts file from HTML see over here we are traversing through inventory details in the for Loop right and each of the row is binded through inventory so this is the inventory detail that we will send to the component.ts file I’ll take the name from here for the function and in this particular component TS file we’ll write the implementation of this function this inventory we’ll just give it the type any as of now and in this component.ts file we have inventory data that we B to bind with this inventory we cannot simply bind this inventory data to the inventory that we are getting from the HTML we have to assign each of the fields in inventory data the correct value from inventory so how do we do that in inventory data the field names are like this product ID and in inventory we see on HTML that the names are like this product ID with a capital P so we’ll take the names from here and assign each of the fields this way after product ID we have product name and similarly we have product name in HTML like this next we’ll take the available quantity and bind it to available quantity in in inventory data then we have reorder point so you see that we have simply binded the inventory data from the HTML inventory that we are getting as input in this particular function I’ll just save the changes and on HTML also we are saving the changes now let’s go to the browser and check if our code is working fine in the second row I’m clicking on edit so what happened was I click on edit for the second row correct and all of the text boxes are correctly binded but it is not happening for the drop down and why is that because we were sending index to save at the back end from the HTML page we have an index associated in our drop-down with each of the values so this is an interesting point over here for the drop-down we have indexes given to our values in the each of the option associated with the text that we are showing in the drop down so now I’m just simply giving the value same as the text that we are displaying and then we’ll see if this is also getting binded correctly when we click on edit I come over here the page is refreshed I click on edit okay so the drop-down value is also getting perfectly binded now all of the fields are getting populated when I click on edit from this particular row if I just click on edit now here on the first row this is also working perfectly fine you see that all of the form data is getting prepopulated when I click on edit so I hope that uh this was an interesting point for all of you to learn about the drop- down control on the form that for the drop down options we have a value and the text that is displayed associated with each of the options we can keep it same also and we can give it indexes also that we had given earlier and earlier when we were trying to save the data value would only get saved to the backend table now we have kept the value same as the text of the options so that would get saved now as a string value now over here on the form you see that we are able to populate the data correctly but what happens if I just change thect product ID from here because currently in the table these two are the product ID values existing right and if I change the product ID only then what will happen I’ll not be able to update any of the record in the table because that product ID would not exist correct we already have the wear condition which is checking if the product ID is matching with the one that is being sent from the front end and if I’m able to change the product ID then there is no point of sending any data to the backend to update it will not have any record to update for that particular reason we’ll have to disable the product ID text box over here when we are trying to update the data and how we’ll do that we’ll come over here on the input that we have for product ID we’ll give the disabled attribute and this particular attribute will be set according to a variable that we’ll Define in the component.ts file so I’ll give it the name disable product ID input and we’ll take this variable and Define it over here in the component.ts file this would be set to call we don’t want to disable initially and then what we want to do is whenever I’m clicking on edit at that particular time I want to set it to true this is what should happen so whenever I click on edit from this page and the row is getting populated over here at that time I want to disable it so let’s just save this and we’ll save this also now we’ll go to the browser once I click on edit and you see this is not at all editable this is disabled I’m not able to update any value in the product ID text box so after this what we want to do is we want to change some values over here let’s say I want to just up update the quantity available I want to update the reorder Point as well and I want to submit it if I submit it the backend update web API should be called and for that we’ll have to change the submit functionality over here we already have a function for submit button now this is currently saving the data this is currently doing what it is calling ing the post method of the HTTP client but we want to update the data now so for that we’ll have to do an HTTP put but when should we do an HTTP put we should only do it in case we are trying to submit when the text box product ID is disabled so that is when we should update the data otherwise when it is enabled we should be saving the data as we were doing earlier so that is why we’ll have to put a check over here this dot disabled product ID input is equal to equal to True at that time we should simply call the put method otherwise we should save the data so this should work as it is let me just format the document okay so whenever the product ID field is disabled it would call the put method otherwise it would call the Post method so let me just save it and now we’ll go to the screen we’ll click on edit making the changes again and you see the values are different from which were given earlier in the table so I click on submit now and it says form submitted successfully I click on okay and you see as soon as I clicked on okay this is showing me the new values over here so this is working perfectly fine we can see it in the table also the values are updated but as we have already put a fetch call on this particular table when we are submitting so it shows me the new data in real time because when I click on okay the fetch call is instantly made to from the back end now we’ll just do some minor changes on the application just to enhance the look and feel of the application what we want is when we click on okay on the alert box we want that the form should be reset so for that what we’ll do is we’ll come over here in the code what we want to do is we see that we are calling this particular function get inventory details whenever we are deleting the data whenever we are saving the data right so over here in this function we want to make a change whenever we are getting the fresh data from the table at that time our inventory data this particular data structure that is being binded to the form control it should be just reset so we simply want to set it to the new values and after this we also have this disabled product ID input right this should also be set to false because in case we have edited some data this would be set to True from this particular function right and it would only be set to false when we have explicitly mentioned in some function and as we are calling this function every time we are trying to refresh the table we should set it to false so this should work now we’ll go to the browser I’m just adding some new data now okay and I click on submit so this should add a new row and you see that the form controls are also getting reset let’s just try to edit something as soon as I clicked on edit product ID is getting disabled I’m only able to change the values from here I click on submit and you see that the product ID field is also getting enabled as soon as I clicked on okay on the alert box because the refresh whenever it is happening for the table it is refreshing the form controls with the product ID input as well so all the crud operations for the inventory page are now completed we are able to save the data we’re reading the data from the table and that is all happening in real time because as soon as we are making any updates like save delete edit we are able to call the read functionality as well so I hope all of you have understood whatever part we have completed till now now after completing all the crowd operations for inventory we’ll move on to the next component that we were going to complete for this particular application we had created two options over here in the drop down inventory and customers so the inventory component is completed with all the crowd operations now we’ll move on to the customers component we’ll be creating a new component for this in our code as of now if I click on this option nothing is being displayed except the header that is given in the app. component. HTML file so what we’ll do is we’ll go to the visual studio code and try to create a new component for the customer I’ll just stop the execution and what I want to do is NG generate component in the app. components folder where we already have inventory component I am creating a customers component as well so you see that the customer component is created over here this is the HTML file for it and as of now it is showing customer works so in the app. component. HTML file we want to have a router link for customer as well we already had for inventory now we want to add for customer over here in the customer option I’ll be mentioning customer over here and in the app component.ts we also want to import customer component and here it’s saying add import I’ll save this after this in the routes. TS file what we want to do is we want to give another p path we already have given a path for inventory now we want to give the path for customer as well so the this path the value of this path is going to be the same that we have mentioned over here in the HTML file so I’ll just take this and paste it over here so that there are no differences in the two values and after that we have to mention the component that we want to use so over here it is going to be customer component so sometimes the import is just done automatically I have saved this and after mentioning the New Path what I want to do is I’ll just do an NG serve again so that our application is running and we’ll go to the browser through the link okay now I’m going to the customers option so you see over here it says customer works that means we have successfully added a new component in our application for customer and we have added the router link for it as well after this what we are going to do is take a static HTML template for the customer I already have it referenced from the bootstrap and we’ll use it over here we’ll be pasting our static HTML template and after this I’ll just save it to check if the changes are getting reflected in the browser or not so we have come to our browser and we see over here that now the page is showing the new template on this page we see that we have a button to add a new customer then pertaining to each of the row we have edit and delete actions so now what we’ll do is we’ll give the functionality for this button first for adding a new customer because if you see over here in the inventory we had given the form like this on the inventory page itself that we could add the new row from here through this form and submit it right but in the customers we have added a little modification the reason is that you’ll be able to choose whichever template you want to and we just have a variation in our application so now what we want to do is when we are trying to add a new customer a new popup should be shown with the customer form that is the basic approach that we are going to follow over here and for that what we want to do is add a new dialogue component in our application so let’s go to the visual studio code and I’ll just stop the execution again I want to add a new component in the app components it is going to be customer dialog so this is what I want to add to this application this is a new customer dialog box and over here we have the HTML file for it now in this customer dialog box called component. HTML file we’ll be giving the form template for the new customer I’ll be using the one that I have already reference from the bootstrap and let me just paste it over here okay so you see that we have some input controls for customer information and then we have a add button over here let me just save this now when should this dialog box open it should open when we are trying to add a new customer right so in this customer. component. HTML we have the button over here when we click on this button it should open customer dialogue this is the function that we want to give in the customer.com component.ts over here we are mentioning the function and for this we’ll be using the model service that we had used for inventory. component. HTML file when we were trying to delete the record so let me just have a private model service variable created for it and we are trying to inject NGB model and for the inject we want to import the core module the options would be mentioned over here now after this we want to use our model service to open customer dialog box component this is what we are trying to acheve let me just save it and let me save here also the changes that we have made for opening the okay let me just give it the name customer not custom it and over here also the name should be customer okay after this let’s just run our app application okay go to the URL and now when we are going to customers when we click on this add a new customer button you see that the dialog box is opening so this is working perfectly fine that we are trying to open a new customer form through this button this is is a different approach that we have used to open the customer form over here we can enter the information and then click on ADD after this what we’ll do is we’ll be working on the functionality for this add button that should be saving the new row in our database so let’s just go to the database and over here we’ll try to create a new table for customer and I can give it the name customer details okay now after this The Columns that we have on our HTML one would be customer ID then first name we’ll give it 5050 okay and then we have last name email date of registration so let me just give it registration date and it could be of date data type I think we had another the customer HTML we can just check phone number yeah this is the one that we want to add okay got something over here the email spelling I’ll just save the changes so you see when you just keep on going through your Cod again and again you’ll just find some changes that you want to make to make it better phone right it was phone we’ll give it Vare only and could be off let’s keep it 15 digit however we want to save later on with the code or not maximum is 10 though now we’ll create the table and now what I want to do is I want to create a stored procedure it should be save customer details we are trying to insert the data in this table and these are the columns right so we’ll just mention over here we have to remove the data type in the insert statement and after this we want to just take these values and have them as parameters over here so but here want to have the parameter names but these should have the data types so let me just copy from here once again your code helps you to write the further code faster and now I’m giving the parameter names yeah just a syntax error that over here they should come the parameters before as keyword and we want to insert all of these in the table so I’m taking the names from here only and let me just give the character and to add the brackets for the values okay and also you might be wondering that how did I give the at theate character before each of the variables in one go so you press shift and ALT and then you just drag the mouse so you’ll be able to add anything so you could just add any character or delete any character so it would work for all of the rows so this is something interesting that I had learned some time back while working on a project in the office now we just want want to execute this query to create the new procedure okay after this what we want to do is we want to work on the API so over here I’ll just stop the execution of this solution and now you see that we already had the inventory controller correct we were working on this one for doing all the crowd operations for our inventory component now what we want to do is we want to add a new controller and this is going to be an API controller let’s select the empty one and click on ADD now it will request the name so we are going to give the name as customer controller and you see over here it has already mentioned the route as well from the inventory controller we just want to take the help of the first method that we had for saving the inventory data correct now we want to do a similar thing over here we want to save the customer data so this is going to be save customer data and for this want to create a new DET going to be customer D customer request D and after that over here we’ll be mentioning the properties and is going to be then it’s going to be the first name after that it’s going to be last name so you see intelligence is very helpful then we have email so that we have phone and then we have registration date okay uh we can give it date time I’ll just save this and use this customer request D over here the SQL connection would remain the same but we have a different store procedure now this is save customer DET also uh what we can do is we can keep it the naming convention same so I’m just dropping the to procedure once and we’ll create it again and we’ll create it again with the name Spore okay now it says commands completed successfully so we can check over here that in the store procedure we should have this particular store procedure as well so you see that we have got the SP save customer details and now let’s just work on our API the name of the store procedure is this one we have to take it from here and there are multiple parameters that we are working with so those also we can take it from here only one is the customer ID then we have post name then we have last name this is the one that we can use over here after that we have [Music] email and then we have phone and registration date so you see these are the different parameters that we are sending to the store procedure through the request detail that we receive from the front end so that part we’ll be covering later but as of now we have completed the coding for our web API so let’s just save this and we’ll run this solution okay so the solution is running now so you see over here we have got our new controller customer and and we have created a post method for it for saving the new records in our database let’s just try it out we’ll give the customer ID some random value and some random values for the email phone number okay let’s just keep the registration date as it is and we’ll be executing this you see we have got a 200 success response but how do we verify it we’ll go to the database and execute the query to select the data from this table that we have just newly created so you see the new record is successfully inserted into the table and we have completed the flow from the web API to the database successfully now we’ll be working on the flow for creating the connection between the front-end application and the web API let’s go to the visual studio audio code now and over here in the customer dialog box component. HTML file we’ll need to add a new function that is triggered when we click on the add button similar to the one that we had done in inventory component on this form I’m going to mention NG submit and call a function on submit for this function we’ll be writing the code for but before we proceed to write the functionality of this function what we want to make sure is that all the form controls that we have on this page they are matching with the ones that are required to be send to the web API so over here we’ll come and click on add a new customer so you see that we have first name last name phone number registration date at email but we don’t have customer ID over here so I’ll add a new text box for customer ID and one more thing we are going to do we’re going to make the changes so that the date is entered through a date picker and not in the text box so let’s make the changes for that over here I’ll just just add a new column so I’m adding it next to the email for now and this is going to be customer ID or let’s do one thing what we can do is just change this particular form control to phone number over here name should be same as this is email would be email only and what I want is that the customer ID should come over here so I’ll take this I’ll be taking this and add it over here and this should be customer ID the same name is given in the label then this is first name last name phone number and email so this is all fine now we need to change the registration date right so what we’ll do is we’ll give the type for this particular control that is registration date as date and the class would be form control only so I think this is fine I’ll just save it okay it says on submit is not there so for now I’ll just remove this and we’ll add it again once we are writing the functionality okay we’ll go to the browser and over here you see that we have a new form now we have customer ID added because we need to send it to the database for the row to be added for a particular customer then this particular registration date we had a text box in place of the date picker now we have a date picker and through the calendar we can just enter the date over here then we have first name last name name phone number and email so this looks perfectly fine now we’ll be working on the functionality for this add button let’s go to the visual studio code again and over here I’ll just save this and add a new function also over here so that the error is remove on submit I’ll say save it you see the error is gone now so now we’ll just start working on the onsubmit functionality we’ll have a API URL that we need to give and I’ll just mention it over here before that we have to write the HTTP options this is similar to the process that we followed for in entry details HTTP headers then over here we’ll find the properties for it authorization in case you have a token that could be passed through this and then we have content type that should be application slash Json headers are defined and after this what we want to do is we want to call the Post method of customer API so I’ll just take the URL from here itself because I know that I only need to change the controller name the rest of the API URL would remain the same so over here in the API URL I only have to change the controller name that is customer now for calling the post method what we need to do is have HTTP client defined over here we are going to inject the h HTP client and for this particular error we’ll need to do an import of angular code after this what I want is I need to call the HTTP client and we need to post so we are trying to call the Post method that is all good but we still haven’t defined what we need to send to the backend API to be saved into the database for that we’ll need to Define an object over here that would be mapped to our HTML page and would take the data from the form to the backend API so let me just Define customer details over here and we have several properties that we need to send first is customer ID then we have first name last name registration date phone and Emig okay so this is all of the information that we would be sending but how do we bind the data from from the HTML to this particular TS file for that I’ll just save this once and we’ll just take the name of this object over here we’ll come to the customer dialog box component. HTML file here we need to bind it so this is going to be through NG model and the value would be customer details customer ID and then we have name property that would be same as the one that we have in the object so similarly we’ll be giving the NG model for all of the control this is for customer ID then we have have registration date let me give the correct name over here and then we’ll give the NG model binding for registration date so this needs to be changed and the name would be same over here so let’s do it for all of the controls first name then similarly it would be [Music] for first name this is phone number and these are the two have given the property as phone so that is what we are keeping over here let me just show you confirm it once on the TS file phone right correct over here think we have done the bindings for all of the controls I’ll just save it once okay so you see that as soon as I saved it some errors have come we have done the binding but we need to import over here forms module and common module I think this should fix the error yes okay now we’ll be working on the post functionality we want to send this data to our backend API this do customer details then we need to pass the HTP options and we are going to subscribe to this HTTP post method just doing a console log over here for error we again want to do a console log but once this is complete what we want to do is you want to write the functionality for completion of this post method we are going to give an alert box that gives the message customer details SA successfully and which details are saved that we can show through json. stringify by passing the customer details over here so the functionality is all good we’ll just save it once after this we will go to the browser try to enter new customer information give some random values let me just give some random values over here I’ll click on ADD and it says customer details successfully saved saved successfully that is the message that we had given and over here you see this is the object that we are passing to the backend API and now generally when we are developing an application for the end user we don’t show what particular data they have sent to the back end right because they don’t want to see a Json being shown to them in the message box this is for our convenience that we are able to see what all data has been sent and this is just for development purposes so I’ll click on okay but we need to confirm that all of the data that we were trying to send has been already saved in the database once we have got the alert box so for that we’ll just go over here and I’ll click on execute so you see whatever data we sent this was all random data we just sent some random values from the front end and those are all saved successfully into the database so the save functionality is working perfectly for the new component that we have created that is customer we can start working on the get flow of the application but first of all we’ll go to the browser and check at which particular point we want to retrieve the data so the get flow should happen as soon as I click on the add button the first thing that would happen is I would receive a message that the customer details are saved successfully after that the popup should close that is one more thing that we have to add and once once the popup is closed this table that we see on the browser it should be updated with the new record in the table that is there in DB so we have to add the get flow as soon as we click on the add button and after we receive the successful response and whenever this page is loading so let’s say I’m coming to customers this page right currently we are seeing some static data so this should not happen we should be able to see whatever data is there existing in our DB so we have to add the get flow at two places so let’s see how we’ll do that but first of all we have to complete the flow for add functionality once we have gotten the successful response we want to close the dialog box for adding a new customer and we want to do it in the complete function so for that we’ll be giving the model do close functionality but first of all we’ll have to define the model model over here through inject I want to inject NGB active model and this we’ll be using over here this do model Dot close and once this is closed we’ll be taken back to the customer page over here so now let’s try it I’ve saved the record I’ll give some random values marks phone number let me just give a random date I’ll add it says customer detail saved successfully click on okay and you see that the model is closed and we can confirm that the data is saved by executing this particular query so you see a third record has been entered after this let’s try to create the get flow from the table we’ll be creating a procedure get customer details just following our naming convention so adding Spore and then simply we have to select the customer details in this store procedure I click on execute it says commands completed successfully that means the store procedure is created so if I just try to check in the store proced procedures over here it is added now so let’s go to our web API solution stopped the execution and just taking the help of already written function for saving the data now what we are doing is we’re trying to get the data give it the appropriate name we don’t want to pass any request over here and then the store procedure name would be get customer details after this we don’t want to pass any parameters to the store procedures and once the connection is open we want to execute Reader through SQL data reader and while the equal data reader is reading what we want to do is we want to take all of the data that is there in the result set and create an object for it so I’ll just add a new model customer D and I’m taking the help of customer request dto over here to define the properties because these are going to be similar to the ones that we defined in customer request d once this is done over here I want to create a new list of customer video customer details or let me just name it customers yeah and then what we’ll do is create a D2 that would hold the single record and once that is done I want to assign each of the property the value from the table so this is going to be convert do string we have a reader column name is customer ID okay got it customer ID is integer and similar to customer ID we have first name that would be string column name should be over here then we have multiple columns one of them is last name then we have phone we have email and another is registration date got it um this is going to be time this is taking registration overhead email and then okay so I think we’re all done with assigning a single row to this particular object now what we have to do is we need to add the customer D to customers and once this is done need to just convert this object to Jon and send it as our response from the API so this is customers this is the response that we are sending through the get customer data I’ll just save all of it and we’ll run this solution so this is running now and over here you see that we have a get method added to our customer controller and we can just try it out I’ll click on execute and we are getting the response perfectly all of the data that was there in the table we are getting it as a Json in the response body and this is the same URL that we used for post just that we are making a call to different method every time through the same URL so now that this flow is completed to get the data from the DB to the web API we just need to work on the flow from the front end to the web API and we’ll be able to see the exact data that is there in the DB on our UI application so let’s complete the get Flow by going to the visual studio code and over here what we’ll be doing is we need to fetch the data on the load of the customer component right so we’ll go to the customer component.ts file and we’ll be just mentioning what we want to do in NG on in it over here we want to get the data let me create a separate function similar to the one that we had done in inventory component we are going to get the customer details and this function we’ll be calling over here now here we need to specify which particular URL we want to call for getting the customer details so we’ll just take the AP URL from here after mentioning the URL what I need to do is have HTTP client mentioned over here and we’re going to inject HTTP client so this we need to use in our function to call the get method and we’ll pass the URL we are subscribing to the method now when we are subscribing to the get method we will need to assign an object with the result that we receive from the API so let me just Define an object going to be customer details of type any now what I want is that once I received the result from the API this do customer details should be assigned at result oh I think we’re good over here once we receive the data from the get method we are assigning the customer details the same now in our customer. component. HTML we already have static data that is there in the table this will be just replaced with the dynamic data that we are receiving from the web API so this is the static data over here what I am doing is just defining a dynamic TR and we are going to iterate through ng4 let customer of customer details this is the object that we created let me just take the help of the static data and over here as of now I’m removing all of the static values and we’re also removing the static rows that are there in the table as of now I’m keeping the bindings blank because let’s just see which particular properties we’ll be binding on the HTML page so what I’m doing is just doing a console log of customer details saving this and over here also we save it okay now let’s just go to the browser currently you’re able to see that on the browser all of the static data is removed from the table and we’ll just do an inspect and go to the console over here I’m getting an error also can’t bind ng4 okay so to remove this particular error what we need to do is in our customer. component.ts we’ll be import supporting foron module I’ll save it and then we’ll go to the browser again so you see that the error is removed and now we are able to see that the records are there but we are not seeing any value because we have not done The Binding yet but on the UI we are able to see number of rows that are there in the table right you see that edit and delete these are static values so three rows are generated in the table but we are not able to see the values because currently we have not done The Binding right and for The Binding what we need to see is which particular property we need to bind because this is the value of customer details we cannot have the camel casing done in the property this property is starting with the capital c so this is exactly what we need to bind and that is why we we have not done The Binding yet but now that we know that we don’t need to use the customer details similar to this one like we have done in the customer dialogue box we need to bind it directly to what we are receiving from the web API so for that what we’ll do is now we can come over here and just have customer do customer ID so this is the value that we are binding from the back end we are doing this for every column is going to be customer Dot and then we just check our browser then it is first name last name phone registration date and email all starting with the capital laters this is what we need to B customer ID customer name and now over here in the customer name what we are doing is we’re giving the first name and last name together so as to just concatenate and show it on the screen email this is a very important thing that that we just went through to check which particular property you are binding is if we had just used the one that we had done in customer dialog box component these were all with camel case because this is what we had defined over here but we are not binding it as per the TS file we are binding It Whatever response we are receiving from the back end and now the next is is I hope you have understood this particular concept properly and now I’m just saving it going to the browser again and you see that we are able to see the data whatever data was there on the back end in the table you’re able to see it perfectly so the get flow is working perfectly fine if I just try to add a new customer if some random values over here click on ADD so it says that the customer details are saved now when I’m coming over here you see that this is not getting updated so this is the second place that we had already discussed earlier where we need to add the get flow so let’s just go to the visual studio code so in the visual studio code we’ll just go to the flow that is there right we are trying to open the data from the customer. component.ts over here and once the model is open we we enter some data into the form then we just call the Post method of the API once it is successfully saved we are closing the mo model so when we are closing the model what I want to do is send back a result to the customer component and how do I do that over here on the close we’re just sending a result closed so this this is the result that would be sent back to this customer component I’m saving it and now once that is done I want to go to the customer component and over here in the open function I’m taking the result once I received the result I want to perform a function now what I want to perform is I I want to check first of all if the data that we have received and there is event property in it is it equal to the one that we have sent back from here you see that this is the property that we are sending right event value is closed so that is what I want to match over here and if it is matching then we should just call this do customer get details so you able to see that this is the second point we are calling the get customer details function first for that we are calling it on the initialization of the page whenever we are coming to this particular component we are calling the get method and then once we are closing the popup after saving the data successfully at that time also we’re trying to call it so let me just save it and we’ll go to the browser again now this is reloaded so all of the get flow is working fine on the customer initialization but let’s just try to add a new custom entering some random values again I’m clicking on ADD it says successfully saved and you see now the data is getting refreshed as soon as the popup is closed because we have confirmed from the dialog box that we are closing the popup and then once we come back to the customer component it is checking if we have closed the popup so at that time we should just update this particular table with a new record that was entered so I hope all of you have understood that we have completed the get flow for this customer component at the two places that we wanted to after this what we’ll do is we’ll work on the delete functionality for this customer component for the delete functionality let’s go to the database and we’ll try to create a store procedure P underscore delete customer detail and now what we want to do is delete from customer details table but if I just write this query then all of the data from this table would be deleted and we don’t want that we want to delete for a particular record let’s say I have selected this record and trying to click on delete button so only this particular record should be deleted so for that we’ll be choosing customer ID as our constraint to delete the data from this table so this is the check that we are going to put on over here what I want is that particular record should be deleted where customer ID is equal to and the check should be according to this parameter that we would be passing to this procedure so wherever the row is matching with the parameter that is passed to the store procedure that particular record should be deleted I’ll click on execute and now the store procedure is created in the database if I just refresh and we are able to see that this thr procedure is added now we’ll go to the net code for our web API what I’ll do is I’ll just take the help of this function and we are writing an HTTP delete method and the name would be delete customer data now I don’t want that the whole request dto is passed I don’t want the whole customer data to be passed to the backend API I just need the customer ID from the front end to be passed to this web API and we are keeping the connection string same the store procedure name would change over here we have the name so we just use it and I’ll remove all of the other parameters we are going to pass only this parameter which is customer ID now we have our delete customer data function ready once the connection is open we are just trying to execute the store procedure and then we are closing the connection after this we are sending the okay response so I think the whole function is ready we can just try it out by running the solution so over here you see that for the customer controller we have have a new web API that is delete method and if I just click on delete we are seeing that we need to pass the customer ID so if we are able to try it out we’ll have to enter a customer ID over here for that what we’ll do is we’ll just click on execute once and see what all data is existing in the customer details table now we have five records over here and I want that this random values that we had entered earlier right this this should be deleted so I’ll select this customer ID and we’ll go to the browser click on execute we have got the 200 success record and now if I just execute this query again we see that only four records are existing now all of the data that was there earlier it has been deleted and what we can do is we can just refresh this page and we are able to see that only four records are coming over here one we have already deleted from the backend web API so the backend flow is completed for the delete method now we’ll just work on frontend to web API connection and then we can just click on this delete button and we’ll be able to delete the data from the UI as well so for that we’ll need to write the code for this button let’s go to the visual studio and over here if we just check customer. component. HTML file over here on the HTML file this is the delete button that we want to add the functionality to but we don’t directly want to just delete the data because what if the user has mistakenly clicked on the delete button right so we want the user to confirm if they are sure to delete that particular record for that what we’ll do is we have giving a function open this is not going to be customer but confirm dialogue that the user should confirm if they are ready to delete this data and have this function created in the TS file as well now what this function would do is it would open the model and that is going to be dialogue component so I hope you all remember that we have already used the dialog component in the inventory component as well and it would just open it first of all we are just trying to open it okay and then we’ll see what happens save it here and here also we’ll go to the browser and now let me just click click on delete button so we see that we have successfully opened the confirmation dialogue box from this delete button but what should happen if I click on cancel nothing should be deleted so that is working perfectly fine because we have already written the functionality for the cancel button earlier but when we are clicking on okay then the record should be deleted right so that is the pending portion over here that we are going to write the functionality for deleting the record record of the customer when we click on okay and how do we accomplish that when we were clicking on okay we were sending the event as confirm from the dialogue box so that is what we would accept on the customer component now and then delete the record let’s just go to our solution over here on the visual studio code and over here we can see that we are sending a confirm event from the dialogue box and in case we click on cancel at that time what is happening is it is simply closing the dialog box so in the event that the user clicks on cancel nothing would happen but when the user click on okay at that time the confirm function is getting called and we are sending the event as confirm so that is what we would accept over here also we can take the help of inventory component right now and just use that code over here so what I want to do is I just want to have the result now of the dialog box in the customer component if I’m opening the dialogue box we are taking the result and then we will be deleting the customer this is what we are going to do so let me just have a function created for it need customer details and this is the function that we want to call when we are receiving the event as confirm only if the user has confirm then we are deleting the record now let’s just give the functionality for deleting the record which is the URL that we are going to take over here that would be from the browser itself over here you can see that we have this URL and this is the one that we’ll be putting over here now for the delete functionality we have already made sure that a check would be passed for the wear clause and that would be customer ID so this is what we require over here in this function to be passed to the delete method of the HTTP client but how do we get that particular customer ID that would be passed from the HTML this is the customer ID and this needs to be passed to the function that we are trying to call from the TS file so we accept it over here and then can simply pass it to this function we just save here also after this we need to pass this customer ID with this URL for that we’ll be just using HTTP client all the delete method pass the API URL with customer ID and we are subscribing to the method so after saving the changes we’ll just go to the browser and if I click on delete it says are you sure you want to delete the record I can cancel it so nothing is happening now if I just click on okay on this confirm box currently also nothing is happening but if I just check over here in the DB has the record been deleted so yes the record is deleted successfully but on the UI we are not able to see it why is that because we have not refreshed the table after deleting the record from here so for that what we’ll do is in the visual studio code when we have subscribed right when we are getting the result result from the delete API at that particular Point what we want to do is just call get customer details method now I’ll save it we’ll go to the browser again and I just add some random values over here because I want to delete this random data click on uh and now I want to delete it so I click on okay and you see that the table is getting refreshed so the delete functionality is working perfectly because after we have deleted the record we are refreshing the table as well and that is all Happening by just passing the customer ID from the HTML page so I hope all of you have understood the delete functionality that we have covered till now after this what we we’ll be doing is we’ll be working on the edit functionality but before proceeding on it we have to make a minor change over here you see that on the table that we are seeing on the UI the registration date is having the time also that is because we had kept the variable as date time in the net web API so for that we’ll be making the changes over here we need to change it to string so let me just make the registration date as string and then this could be convert. twoing we’ll save the changes and in the store procedure when we are fetching the details we need to make sure that string is being fetched from the table so this is a conversion technique that I am explaining to you and this is very helpful when we are working on the store procedures because generally you would have data in your table in some format but you need to get it in some other format so for that we can cast the data now just to fetch the column names I am selecting the table and doing alt F1 on the keyboard this gives me the data definition for the table we can take the column names from here and what I want is add a comma Now for the registration date as V yeah and we can keep it 12 as of now and the column name can be registration date only so why I have to mention all of the columns separate separately because I want to perform a cast on a particular column so if we are just putting a star with the select keyword we’ll not be able to put cast function on a selected column for that we have to mention the column name separately over here now we are altering the procedure I click on execute and it says commands completed successfully to verify that our changes are working perfectly fine we can just execute this store procedure in the database so now this is giving me records over here in the same format that the table is there but this would be now Vare so what we can do is just go to the solution and run it once and now that it is running over here in the get method can just try it out I click on execute and you are able to see that the registration date that we are patching now from the table it is not having a Time associated with it otherwise it had assigned a default time in the net code so now we’ll go to the browser and just refresh the page over here you see that the registration date is is coming without the time so this is what we were intending to do and we have achieved it now after this we can move on to the edit functionality as well so for the delete functionality let’s first go to the database to create a store procedure now we are creating a procedure to update customer details and what we’ll be doing here have an update query or customer details table now in the table we know that we have some columns right customer details these are the columns that we have so as we have selected earlier customer ID column for deleting the record so this seems to be the appropriate candidate to edit the records as well so we will be using this in the constraint when we are trying to update the record and these are the remaining columns that we’ll be updating the values for so whatever data we are getting from the front end all of that we’ll be updating in the store procedure over here I want to update first name but what do I want to update the first name with I will have to pass a parameter for the first name to this store procedure and similarly for all the remaining columns those parameters would be passed I’ll just take the column names from this table these are the column names that would be passed we do need to pass the customer ID as well because how would we update the record without passing the customer ID this is of type in then we have B care this was I think 50 then the same we can keep for the last name email registration date can be date only and then we have phone which would be just vard for now okay after this what we are going to do is set the first name with the first name that we have received as a parameter last name then we have email registration date and then the last is phone so all of these we are updating but which particular record we want to update it for it is only when customer ID is matching with the one that is pass from the web API customer ID so the parameter that we receive in the store procedure if the records value is matching with it then we want to update all of these columns so this is the update query and that we’ll be using in our store procedure for updating the customer details I’ll click on execute so the store procedure has been created now let’s go to the web API I’ll stop the execution so we can take the help of inventory controller and just use the update function to write the same for customer as well over here it’s going to be HTTP put request and then we have update customer data here we are passing customer request DT and the store procedure name will be taking from here now we want to pass all the parameters here for the customer details so those can just take from the Save method and this was D can be changed to customer request all of the parameters are also set now we just have to execute the store procedure and whatever values we are passing those will be updated accordingly through this SP now let me just save it and we’ll run the solution now that the solution is running we can just go over to our database once and check over here which particular record we want to update so I’ll just select this one and we’ll go to the browser so you see over here that a put request method has been added can try it out now these are the values that we need to pass so the customer ID would be the one that we have selected we’re keeping the names same only and for the email what we’ll do is we’ll change the email over here let me just copy the phone number or what we can do is just change the phone number rather than the email and this date also can remain the same and I’ll execute it we have got a 200 success response now we can verify our data for this particular customer in the DB you see that these are the updated values that we have sent we have changed the phone number only here and in the DB if you see this is a old phone number that we had so I’ll just have a query on a different query editor and execute it over here we are able to see that this is the new phone number that we have given from the web API and this is updated correctly through the update method over here this is the old one that we had in the table so we have built the connection from the web API to the DB and the update method is working perfectly fine what we’ll do is we’ll go to the visual studio code to make the changes for the front end to web API connection but before that let me just go to the UI once again we have the edit button over here and for the inventory what we had done is we had a form on the page itself which we were populating when we were clicking on the edit button but now we have a dialog box when we are adding a new customer so what we want to do is this is a very interesting portion we want to populate this particular dialog box when we click on edit Button as soon as I click on edit the popup should open and this popup should have the values prepopulated from the record so I can change the values from here by entering some new values and then I’ll click on save or update that should update the record in the DB and I should have a fresh data over here on the table so this is what we are trying to achieve let’s see how we can do that so we’ll go to the visual studio and over here you see that this is the customer component.ts the customer component. HTML is having that edit button right so we need to have a button action related to this when we are clicking it we want to open the edit dialogue box for the delete what we were doing is opening the confirmation dialogue box now we want to open the edit dialog box and this is the function that we’ll have to create over here what this should do is now in this function we are using the model service to open customer dialog box component now I’m able to open the customer dialog box component through this particular statement but how do I pass the customer information over here on the HTML you see that we have a row associated with the customer and when I click on edit this particular information should be passed so this is the customer data that I would need to pass to the customer dialog box component and when we are passing it first of all we need to be able to receive the data in customer dialog box component so for that what we’ll have to do is over here we are taking it as input defining a private customer variable of type any and just import angular core now I’m saving it once done what we have to do is when I’m clicking on edit and folling this function I should be passing the customer whole information needs to be passed for the delete we had only passed the customer ID but now when I am updating I would need to require to pass the whole customer object save it okay so it’s giving error because we need to accept it also in the function I’ll save it again error is gone so I’m able to receive the information in this function but now I need to pass it to this component and for that what we’ll do is just Define this as a const model reference giving it a random name and model reference component instance property this is what we need to Define over here in the component instance we have to give the exact name of the property that we defined over here so as soon as the dialog box is initialized customer value would be set from here and over here we need to give the object first is the customer ID property then we have first name first name name em registration date the last is phone so we have set all of the values all the properties have been defined over here you could wonder that why we have not taken the names over here of the properties as camel case because over here we are accepting it because this is being accepted over here and even though it is of type any I am just making sure that it is matching with the object that we have already used in the customer dialog box component but over here we have to use this name with the capital letter in the starting because this is what is being used on the HTML page this is the one that we have been receiving from the web API and we have directly mapped it on the HTML so you have to make sure that all of the property names you are giving correctly for the application to be working perfectly then after this what we’ll need to do is I’ll just save this once over here when we have got the data of the customer we need to map it to the HTML right when we go to the HTML we see that we already have a binding in place for customer details so what we can do is we can utilize this customer details object only and initialize it with this value if we are updating a record and how do we do that we have okay we don’t have NG in it in this class currently so I’ll add it in the NG on init function what we need to do is this do customer details it should be set to customer but should it be set every time when the object is getting initialized no it should not be it should only be initialized in case we are receiving any data in this particular object that is customer so that means when we are trying to edit only that time it should be initialized with this value now what I can do is I can give an if condition customer is not equal to null at that time we need to just set this customer details with this value I’ll save it and now we’ll go to the browser ones over here it’s refreshed already after saving I’ll click on edit so you see that we are able to perfectly get the data from that table into this particular popup that record is populating on the dialog box but this add label is not looking very appealing over here because we are not trying to add any data we’re trying to update the record so what we can do is we can update the text as well for the button and one more thing we’ll need to do is just similar to inventory that we need to disable the editing of this particular text box when we are trying to update the record because this is the constraint that we are using at the back end to update the record we cannot change its value this is similar to the inventory component that we have achieved now we’ll be disabling it over here as well when we are trying to open the dialog box for updating a record so let me just go back to the visual studio code and over here in this customer dialog box component when we are setting the customer details with customer at that time we are trying to update the record right so over here we’ll make the changes but uh we’ll check the HTML first in this the label is set as static currently right so we can give some value over here dynamically for that let me just Define a variable button text string so it can be initially it can be add okay so this is the initial value that we are giving that is a default value and this is what we can use over here and just save it now button text should be set as dat I’ll just save it go back to the browser over here when I am adding a new customer at that time it is ADD so it is taking the default value but when I’m editing it it’s coming as update so this is working now we’ll make the changes for disabling over here where we have defined button text can give AOL value for disable customer ID input so initially it will be false I don’t want to disable it initially but as soon as I’m trying to update it should be said to true and we’ll take this over here on the HTML page customer ID disabled is being set through this variable I’ll save it now we can check the browser again so when I click on edit you see that this is not editable right now so this is also working perfectly now we have made the required UI changes for our update dialogue box and the add dialog box is also working perfectly now over here we have to make the changes to send this data to the back backend now we’ll go to the visual studio now in this component we need to make the changes to update the data right if we check the button for editing this is the button that we are using right but this is the button that we are using for saving the data as well so what we’ll need to do is we’ll need to make the changes on this button functionality only on submit is the function that is being used for that button click and now I need to add a check which can make sure that either the user is trying to edit the record or save the record so we already have made the changes for these two variables to verify if the user is trying to edit or update so we can take any of these I’ll take the bo one and over here if this is true at that time I should edit otherwise I should just save the record and that save functionality is already working through post method okay now for here in this if we need to have the functionality for or put so I can just take the help of this code we are trying to call the put method the API URL Remains the Same we’ll be passing the customer details only because this is set through this function when we are trying to update the data and we can just update this message and then we are closing using the dialog box as soon as we click on the okay for the alert box so this is also perfect I’ll just save it and we’ll go to the browser now on the UI I’ll click on edit for this record and I am updating the phone number again let me give it a different value so that it is easier to remember okay it starts with 3625 I click on update and it’s not updating over here but let’s just verify on the DB ones this is the old value I click on execute again so we are getting the new value over here that means the API is working perfectly but it’s not showing on the UI over here let’s just check the code now C customer dialog box component we are sending the event as closed in the customer component when we were opening for adding we were receiving it got it that over here we are not receiving the result because in open customer dialogue we have been receiving the result we are checking if the event is closed then we are refreshing the table so this statement we have to just take and add over here in model reference result then and we have got the result from customer dialog box component then we are checking if the event is closed then we are refreshing the table save it we’ll go to the browser again now over here we are seeing the refresh data because after saving the page always reloads right so now just let me click on edit and give it a new value this one only and then click on update so it says this is updated now and now we are seeing the new value over here so this is working perfectly fine now if I have the old value I click on edit enter some new information and it is updated successfully over here so the table is also getting refreshed as soon as I am clicking on update over here and for the ad also if we are adding a new customer then we don’t have this text box disabled and the button displays add only so I hope all of you have understood the crud operations that we have completed for inventory and Custom customer component we did some UI enhancements as well and some additional features that we have given to the application to make it perfect for the end user now this was the last video for this particular application will be coming out with the new application soon thank you

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Principia Mathematica: Evolution, Revisions, and Analysis

    Principia Mathematica: Evolution, Revisions, and Analysis

    The provided text extensively examines Bertrand Russell’s work on Principia Mathematica (PM), particularly focusing on revisions and manuscripts related to the second edition. It explores the changes made, Russell’s motivations, and criticisms from logicians like Gödel and Ramsey. The evolution of Russell’s logical system, including the theory of types and the axiom of reducibility, is scrutinized alongside influences from figures like Wittgenstein and Carnap. The analysis investigates modifications related to propositional logic, extensionality, and the handling of classes and relations. Ultimately, the text aims to clarify Russell’s intentions and the impact of these changes on the foundations of mathematics and logic.

    Principia Mathematica, Second Edition: Study Guide

    I. Quiz

    Answer the following questions in 2-3 sentences each.

    1. What does the notation ‘Rν‘a’ represent in the context of multiples and submultiples of vectors?
    2. Explain the meaning of “Prm” as defined in *302.
    3. In *304, what condition defines when X is less than r Y (X <r Y) in the series of ratios?
    4. How are X×s Y and X+s Y defined in terms of R and S in sections *305 and *306, respectively?
    5. Explain what is meant by “FM sr” and “Semi Ded” in the context of multiples and submultiples of vectors.
    6. What is the significance of the expression “(ιτ){(∃ ρ, σ ) . (ρ, σ ) Prmτ (μ, ν)}” in defining the highest common factor (hcf(μ, ν))?
    7. In the context of inductive classes (Cls inductm), what property is being proved in *89.16?
    8. Explain the meaning of the notation α̂{α(S∗|S)α} in the context of Section 4v.
    9. According to 917, what properties can be derived for Cls induct3?
    10. In the context of the summary and related properties, what can we prove directly about the relationship: {(∃x).φx}|{(x).ψx}?

    II. Quiz Answer Key

    1. ‘Rν‘a’ represents the result of applying the relation R, ν times to ‘a’, where ν is a natural number. It signifies a multiple of a vector ‘a’ with respect to the relation R.
    2. “Prm” defines the concept of relative primes within the context of inductive natural numbers. Two numbers, ρ and σ, are considered relatively prime if their only common factor (τ) is 1.
    3. X <r Y is defined by the existence of natural numbers μ, ν, ρ, and σ (excluding 0) such that μ×c σ < ρ×c ν, and X = μ/ν and Y = ρ/σ. This means that X is less than Y if the product of μ and σ is less than the product of ρ and ν.
    4. X×s Y relates R and S based on the product of ratios μ/ν and ρ/σ, while X+s Y relates R and S based on the sum of ratios μ/ν and ρ/ν. Both definitions involve natural numbers μ, ν, ρ, and σ (where ν and σ are not 0) to connect the ratios X and Y to the relations R and S.
    5. “FM sr” likely refers to a “vector-family”, while “Semi Ded” likely refers to a “Semi Dedekind” property. These terms describe specific characteristics of mathematical structures relevant to defining multiples and submultiples of vectors in the context of Principia Mathematica.
    6. The expression “(ιτ){(∃ ρ, σ ) . (ρ, σ ) Prmτ (μ, ν)}” identifies the unique τ that is a common factor of μ and ν, where ρ and σ are relatively prime with respect to τ. This tau corresponds to the highest common factor.
    7. In *89.16, the proof aims to show that if α is not a member of the third-order inductive class (Cls induct3) and γ is a member, then there exists a unique difference between α and γ (α − γ). It implies a certain distinctiveness or separability within the inductive class structure.
    8. The notation α̂{α(S∗|S)α} defines the set of all α such that α is related to itself through the relative product of S∗ and S (S∗|S). In essence, it identifies elements that are in the reflexive domain of the relative product of S∗ with itself.
    9. According to 917, Cls induct3 supports the property that if α is not a member of the third-order inductive class and γ is a member, then there exists a unique α − γ.
    10. Directly we can prove: ∼ (∃x). φx .∨. ∼ (y). ψy ≡ : (x). ∼ φx .∨. (∃y). ∼ ψy

    III. Essay Questions

    Answer the following questions in essay format.

    1. Discuss the significance of numerically defined powers of relations and relative primes in the broader context of Principia Mathematica’s development of number theory. How do these concepts contribute to the formalization of arithmetic?
    2. Explain the role of the Axiom of Archimedes and the Axiom of Divisibility in the development of measurement within Principia Mathematica. How do these axioms ensure the consistency and applicability of measurement in the context of vector families?
    3. Analyze the use of matrices and propositional logic in the proofs presented in the source material. How do these tools contribute to the rigor and generality of the arguments made?
    4. Discuss the significance of inductive classes and their properties in the context of defining mathematical concepts in Principia Mathematica. Provide examples from the text to illustrate your points.
    5. Critically evaluate the notational conventions used in the source material. What are the advantages and disadvantages of these conventions in terms of clarity and precision?

    IV. Glossary of Key Terms

    • NC induct: Natural numbers, inductively defined. Represents the set of natural numbers constructed through inductive principles.
    • RP: A numerically defined power of a relation R. It denotes the application of the relation R to a certain extent, defined numerically.
    • num(R): A function representing the “number” associated with the relation R. The specifics depend on the relation’s properties.
    • Prm: Relative Primes. A relation indicating that two numbers are relatively prime (i.e., their greatest common divisor is 1).
    • hcf(μ, ν): Highest Common Factor (Greatest Common Divisor) of μ and ν.
    • lcm(μ, ν): Least Common Multiple of μ and ν.
    • Rat def: Defined ratios. Refers to the set of ratios constructed from natural numbers.
    • FM sr: Vector-family. A collection of vectors with certain properties relevant to measurement.
    • Semi Ded: Semi-Dedekind property. A property related to completeness and Dedekind cuts.
    • Cls inductm: Inductive Class of order m. A class defined through induction up to a certain order.
    • Potid’R: The potency of the relation R.
    • R0: Identity relation restricted to the domain of R.
    • D’R: The domain of the relation R.
    • C’R: The counter-domain of the relation R.
    • α̂(…): Class abstraction. Defines a class based on a condition.
    • ṡ‘κ∂: The “dot-abstraction” notation, meaning the class of all terms ‘x’ such that ‘x’ belongs to κ.
    • Comp: A class that contains the complements of all its members.
    • R|S: Relative product of relations R and S.
    • R∗: The ancestral relation (transitive closure) of R.
    • ε: Is an element of. Denotes membership in a set or class.
    • ⊃: Logical implication (“implies”).
    • ≡: Logical equivalence (“is equivalent to”).
    • ∃: Existential quantifier (“there exists”).
    • ι‘x: The unit class of x (the set containing only x).
    • ∪: Set union.
    • ∩: Set intersection.
    • ∼: Logical negation (“not”).
    • →: Mapping or function.
    • ∀: Universal quantifier (“for all”).
    • ∂: Denotes the derivative of a class.
    • α ~ε μ: Element α is not an element of μ
    • p|q: p “not-ands” q: both not true.
    • αM∗β: That α is in the ancestral relation of β under the relation M.
    • α Rts β: Alpha is rooted in beta
    • ṡ ‘Potid‘R: Class who’s members are subclasses of Potid’R.
    • ←− R ∗‘x: A formula relating R and x to other values
    • −→ R ∗‘x: A formula relating R and x to other values
    • D* The ancestral of the domain D

    Russell’s Principia Mathematica: Manuscript Analysis

    Okay, here’s a detailed briefing document summarizing the main themes and important ideas from the provided excerpts of Bertrand Russell’s manuscripts and notes for the second edition of “Principia Mathematica.”

    Briefing Document: Analysis of Excerpts from Russell’s Manuscripts for Principia Mathematica, Second Edition

    Overall Theme: These manuscript excerpts provide a glimpse into Russell’s rigorous, formal, and highly symbolic approach to defining fundamental mathematical concepts. The document shows his work at the granular level, filled with definitions, theorems, and proofs relating to numbers, relations, and order. The notes are primarily concerned with building up from basic logical and set-theoretic notions to construct more complex mathematical entities. The overarching goal is the reduction of mathematical truths to logical truths.

    Key Areas and Ideas:

    1. Definitions of Numerical Concepts and Operations: Russell meticulously defines basic arithmetic concepts like numerically defined powers of relations, relative primes, highest common factors (hcf), least common multiples (lcm), and ratios.
    • Example:∗301. Numerically defined powers of relations. ·01 RP = (|R) ‖ (Ŭ1 t3‘R) Dft(∗301)” This defines a power of a relation R.
    • Example:∗302. Relative Primes. ·01 Prm = ρ̂ σ̂ {ρ, σ ε NC induct :ρ = ξ ×c τ . σ = η ×c τ. ⊃ξ,η,τ . τ = 1}Df” This defines what it means for two numbers to be relatively prime.
    • Example:∗304. The Series of Ratios. ·01 X <r Y . = . (∃μ, ν, ρ, σ ). μ, ν, ρ, σ ε NC induct − ι‘0 . σ = 0 . μ×c σ < μ×c ρ ,X = μ/ν . Y = ρ/σ } Df” This formally defines the “less than” relation (<r) for ratios. The document contains formal definitions of multiplication and addition as well. Note the frequent use of set builder notation to define numbers as the set of some objects satisfying certain conditions.
    1. Vectors, Measurement, and the Axiom of Archimedes: The notes delve into the properties of vector families and their relation to ratios. The Axiom of Archimedes is invoked in the context of multiples and submultiples of vectors. An Axiom of Divisibility is also present.
    • Example:∗337. Multiples and Submultiples of vectors. ·13 : . κ ε FM sr . P̆ = ṡ‘κ∂ . P ε Semi Ded . R ε κ∂ . a ε C‘P . ⊃ : x ε C‘P . ⊃ . (∃ν) . ν ε NC induct − ι‘0 . xP (Rν‘a) [Axiom of Archimedes]” This states Archimedes’ axiom formally.
    • Example:If X is a ratio as previously defined, and κ a vector-family, X κ is the ratio X as applied to the family κ .” This explains how a ratio acts on a vector family. This section seems to be preparing the foundation for geometric reasoning.
    1. Logical Proofs and Manipulations of Symbolic Expressions: A significant portion of the manuscript is dedicated to logical proofs, often involving complex symbolic manipulations and the application of previously established theorems or axioms (referenced by numbers like “*8·261”). The proofs often involve quantifiers and logical connectives. Many of the proofs involve complex matrices.
    • Example: The extended section around expression (642) and theorems *8·322, *8·333, *8·341, *8·342, and *8·343 demonstrate the meticulous logical deductions Russell employs. Key logical proof techniques involve defining and manipulating matrices of logical statements and systematically proving various cases.
    1. Set Theory and Class Theory: Set-theoretic operations, notions of inductive classes, and the posterity of a term are prevalent throughout the notes. The notes make abundant use of set-builder notation (e.g., the use of hats or carats above letters as in “ρ̂ σ̂”) to formally specify the membership of a set based on specific conditions. The notes are trying to develop the theoretical basis for inductive proofs.
    • Example:We have Rm+1(x y) ⊂ R(x y) Cls inductm+1 ⊂ Cls inductm.” This relates inductive classes of relations to sets.
    • Example:R0 ⊂· R∗|R ⊂· R∗ where R0 = I ⇁ C‘R Df ∗89·02. R0 = I ⇁C‘R Df The proof is as follows: ∗89· 1. . R0 ⊂· R∗|R ⊂· R∗” Shows the use of definitions and set relations to construct a proof. The concept of “Cls inductm” which means a class that is inductively defined, appears frequently.
    1. Relations, Domains, and Operations on Relations: The notes use relations extensively, defining operations such as relative product, powers of relations, converse of a relation, and domain/range restrictions.
    • Example: Numerous definitions and manipulations of relations illustrate this. Relations are central to many of the theorems and definitions throughout.
    1. Order and Predecessors: The document frequently considers the relationship between an object and its predecessors and successors with respect to a given relation “R”.
    • Example: In section [17v], Russell is attempting to prove that “∼ R̆‘maxR‘γ ε α .∨. y ε α ∪ γ by induction, i.e.23 ∼p ∨ q . ⊃ . ∼r ∨ q ∨ s” and seems to be concerned about proving that some condition holds for all ancestors to some node y.

    Notational Conventions:

    • The manuscript relies heavily on symbolic notation, which would be familiar to readers of “Principia Mathematica.”
    • Df is used to indicate “Definition.”
    • Likely indicates the start of a theorem or proof.
    • References to previous theorems and axioms (e.g., “*8·261”) are common.

    Observations and Potential Insights:

    • Foundation for Mathematical Reasoning: These notes are part of Russell’s broader project to provide a logical foundation for mathematics.
    • Complexity of Reduction: The level of detail and symbolic manipulation highlights the immense complexity of reducing mathematical concepts to purely logical ones.
    • Work in Progress: These are manuscripts, so they contain corrections, revisions, and unresolved issues.
    • Emphasis on Formalism: The heavy use of symbolic notation underscores the emphasis on formalism and rigor in Russell’s approach.

    In summary, the document offers a fascinating glimpse into the intense, formal, and foundational work that went into the creation of “Principia Mathematica.” It shows the level of abstraction and symbolic manipulation required to rigorously define fundamental mathematical notions within a logical framework.

    Principia Mathematica, Second Edition: Manuscript Notes

    FAQ on Principia Mathematica, Second Edition Manuscripts

    Here are some questions and answers based on the provided excerpts from Bertrand Russell’s manuscripts and notes for the second edition of Principia Mathematica.

    Question 1: What are numerically defined powers of relations, and how are they represented in the manuscript?

    The manuscripts introduce numerically defined powers of relations. For a relation R, RP appears to represent a power of that relation, likely in terms of its repetition in the relation (Ŭ1 t3‘R). The function num(R) is defined which produces values that can then be applied to the power of the relation: Rσ = {ṡ‘num(R)}‘σ̇ Df. So, if R represents a relationship, R2 and R3 would then represent the relation applied twice and thrice respecitively.

    Question 2: What are relative primes and how are they defined?

    The manuscripts define relative primes within the context of inductive numbers. Prm is defined as ρ̂ σ̂ {ρ, σ ε NC induct :ρ = ξ ×c τ . σ = η ×c τ. ⊃ξ,η,τ . τ = 1}Df. Then (ρ, σ ) Prmτ (μ, ν) . = *. ρ Prm σ . τ ε NC induct − ι‘0 . μ = ρ ×c τ . ν = σ ×c τ Df

    Essentially, two inductive numbers, rho and sigma, are relatively prime if their only common factor is 1.

    Question 3: How are ratios defined in this context, and what is the series of ratios?

    Ratios are defined in terms of inductive numbers. μ/ν (where μ and ν are inductive numbers and ν is not zero) represent a ratio. The series of ratios is established by defining an ordering relation <r and two classes “Rat def” and “Rat def ∪ ι‘0q”, meaning rational def, and rational def with 0 included, respectively. The relationship H represents X̂ Ŷ {X, Y ε Rat def . X <r Y } Df, meaning H is the relationship of numbers where X and Y are rational numbers and X is less than Y. H ′ is the same, but includes 0.

    Question 4: What are multiples and submultiples of vectors, and how are they related to the Archimedean axiom and divisibility?

    Multiples and submultiples of vectors relate to how ratios can be applied to vector families. If X is a ratio and κ is a vector family, then X κ is the ratio X applied to the family κ. The Archimedean axiom is invoked, stating that for any element ‘a’ in a semi-Dedekind family, any vector R, and any x, there is a multiple of that vector (ν ε NC induct − ι‘0) such that xP (Rν‘a).

    There is also an axiom of divisibility that states : . κ ε FM sr .Cnv‘ṡ‘κ∂ ε comp ∩ Semi Ded . ⊃ : S ε κ . ν ε NC ind − ι‘0 . ⊃ . (∃L) . L ε κ . S = Lν.

    Question 5: What role do matrices and prefixes play in the logical proofs presented in the manuscript?

    Matrices in this context seem to represent complex logical propositions or conditions, and prefixes define the variables and quantifiers involved. The matrix itself describes the relationships between these variables. The manuscript uses matrices to express logical dependencies and implications concisely. For example, the truth or falsehood of a proposition encapsulated in the matrix depends on the truth or falsehood of other propositions (φa, φb, q, etc.). The prefixes indicate which variables are bound by existential or universal quantifiers. The text uses these matrices to build and demonstrate more complex logical arguments, simplifying the representation of intricate logical structures.

    Question 6: What is Cls inductm and how is it used?

    Cls inductm refers to inductive classes, with m likely representing the order of induction. So “γ ε Cls inductm” means gamma is a class of the “m” order for inductive classes. The document explains that given Rm+1(x y) ⊂ R(x y) then Cls inductm+1 ⊂ Cls inductm, meaning the inductive classes are related by order.

    Question 7: How are relationships between classes and operations on classes (such as intersection, union, and removal) explored in the manuscript?

    The manuscript extensively explores relationships between classes using operations like union (∪), intersection (∩), set difference (−), and the application of relations (R̆“μ). Theorems and proofs often revolve around demonstrating how these operations transform classes and how membership in one class affects membership in another after such operations.

    For example, ∗89·16 : α ∼ε Cls induct3 . γ ε Cls induct3 . ⊃ . ∃! α − γ, where given alpha is not in Cls induct3 and gamma is, then there exists an “alpha minus gamma”.

    Question 8: What is the meaning and significance of R∗ in the document, and how does it relate to R0?

    R* typically represents the ancestral or transitive closure of the relation R. That is, if xRy and yRz, then xR*z. R0 is the identity relation within the field of R. The relationship between them is shown by R0 ⊂· R∗|R ⊂· R∗, where R0 is a subset of the transitive closure of R applied to R, which is a subset of the transitive closure of R.

    Principia Mathematica: History, Impact, and Significance

    Principia Mathematica, originally published between 1910 and 1913, is a monumental work in symbolic logic that aimed to deduce much of elementary arithmetic, set theory, and the theory of real numbers from a series of definitions and formal proofs. Written by Alfred N. Whitehead and Bertrand Russell, it became a model for modern analytic philosophy and an important work in the development of mathematical logic and computer science.

    Overview of Principia Mathematica

    • Scope and Content The three volumes of Principia Mathematica lay out a cumulative series of definitions and formal proofs to rigorously deduce much of elementary arithmetic, set theory, and the theory of real numbers.
    • Impact on Logic Principia Mathematica is arguably the most important work in symbolic logic from the early twentieth century. Logic conducted in the style of Principia Mathematica soon became a branch of mathematics called “mathematical logic”.
    • Influence on Computing Principia Mathematica led to the development of mathematical logic and computers and thus to information sciences.

    Revisions and Additions in the Second Edition The second edition of Principia Mathematica, published between 1925 and 1927, included a new Introduction and three Appendices (A, B, and C) written by Russell, along with a List of Definitions. These additions, though comprising only 66 pages, proposed radical changes to the system of Principia Mathematica, necessitating a fundamental rethinking of logic.

    Key changes proposed in the second edition:

    • Sheffer Stroke Russell proposed replacing the logical connectives “or” and “not” with the single “Sheffer stroke” (“not-both”). This change was technically straightforward and didn’t require rewriting the original text.
    • Extensionality The second major change was the adoption of “extensionality,” requiring that all propositional connectives be truth-functional and that co-extensive propositional functions (those true of the same arguments) be identified. According to Russell, functions of propositions are always truth-functions, and a function can only occur in a proposition through its values.
    • Axiom of Reducibility Russell proposed abandoning the axiom of reducibility, a move that faced criticism from logicians. In Appendix B, Russell attempted to prove the principle of induction without relying on this axiom. However, Kurt Gödel later criticized this proof, and it was eventually shown that deriving the principle of induction in certain systems of extensional ramified theory of types without the axiom of reducibility was impossible.

    Impact and Reception

    • Initial Reactions The second edition was seen as Russell’s attempt to keep up with a subject that had surpassed him. However, a closer study reveals deep issues regarding the shift from the intensional logic of propositional functions in the “ramified theory of types” of the first edition to the altered theory of types in an extensional logic.
    • Evolution of Logic The second edition of Principia Mathematica marks the end of logicism as the leading program in the foundations of mathematics, and the rise of the mathematical logic of Gödel and Tarski as its replacement.
    • Obsolescence and Philosophical Significance As a work in mathematics, Principia Mathematica soon became obsolete. However, its study remains significant in the philosophy of logic. The intensional nature of its logic and the potential distinction between co-extensive functions were seen as alien to the extensional account of logic that supplanted it.
    • Influence on Analytic Philosophy Principia Mathematica became a starting point in analytic philosophy, from which progress was made by correcting its errors. It is often viewed as a wrong turn in the progression from Frege’s Grundgesetze der Arithmetik through Wittgenstein’s Tractatus to the logic of Carnap, Gödel, and Tarski.

    Key Concepts and Technical Aspects

    • Type Theory The notion of type theory, extensionality, truth-functionality, the definability of identity, and the primitive notions of set theory all evolved between the two editions. The history of Principia Mathematica reveals important knowledge about the history and philosophy of logic in the early twentieth century.
    • Notation Principia Mathematica employs a system of notation that, while precise, can be challenging for contemporary readers due to its use of patterns of dots for punctuation rather than parentheses and brackets.
    • Axiom of Reducibility The axiom of reducibility states that for any function, there is an equivalent predicative function (one true of all the same arguments).
    • Theory of Descriptions Principia Mathematica introduces a method for indicating the scope of definite descriptions, with the fundamental definition being a “contextual” one.
    • Relations Principia Mathematica presents the “General theory of relations” in extension. In this theory relations are treated as counterparts of classes.
    • Mathematical Induction Appendix B discusses the principle of mathematical induction, which, along with the definition of numbers as classes of equinumerous classes, is central to the logicist account of arithmetic.

    Criticisms and Challenges

    • Technical Crudities Despite its importance, Principia Mathematica has been criticized for its technical crudities and lack of formal precision in its foundations. Gödel noted that its presentation of mathematical logic was a step backward compared to Frege.
    • Intensionality The intensional nature of the logic in Principia Mathematica was seen as a result of confusing use and mention.
    • Axiom of Reducibility Quine argued that the axiom of reducibility cancels out the ramification of types, undermining the distinctive feature of the logic.
    • Notational Excess Quine criticized the “notational excess” in Principia Mathematica, suggesting that its numerous theorems merely link up different ways of writing things. He views this as a stylistic defect, but others argue that the multiple definitions reflect the intensional nature of propositional functions.

    In summary, Principia Mathematica is a complex and influential work that represents a significant stage in the development of modern logic. The second edition, with its proposed revisions and additions, highlights the evolving nature of logical thought and the challenges of establishing a solid foundation for mathematics.

    Principia Mathematica: The Axiom of Reducibility

    The axiom of reducibility is a central concept in Principia Mathematica (PM), and its treatment was a major point of revision in the second edition. The axiom and its revisions have been the subject of considerable discussion and debate.

    Definition and Purpose

    • The axiom of reducibility states that for any function there is an equivalent function (i.e., one true of all the same arguments) which is predicative.
    • A predicative function is of the lowest order applicable to its arguments. In modern notation, these functions are of the first level, with types of the form (…)/1.
    • Whitehead and Russell express doubts about the axiom of reducibility in the first edition of PM, and one of the major “improvements” proposed for the second edition is to do away with the axiom.

    Role in Principia Mathematica

    • The mathematics developed in PM, including elements of analysis, requires frequent use of impredicative definitions of classes.
    • The axiom is needed to define notions that would otherwise violate the theory of types by referring to “all” types, creating an illegitimate totality.

    Identity

    • The definition of identity in PM relies on the axiom of reducibility:
    • x = y .=: (φ) : φ!x . ⊃ . φ!y Df
    • This means x is identical with y if and only if y has every predicative function φ possessed by x.
    • Without the axiom of reducibility, this definition is problematic because it is not possible to state that identity is the sharing of all properties, since there is no “totality” of all properties to be the subject of a quantifier.

    The Second Edition and Abandoning the Axiom

    • One of the major changes proposed for the second edition is to avoid use of the axiom of reducibility whenever possible.
    • Russell was trying to work out the consequences of “abolishing” the axiom of reducibility, to see more clearly what exactly depends on it.
    • In the second edition, the definition of identity remains untouched, even though the axiom of reducibility is abandoned.
    • Russell states that if the axiom of reducibility is dropped and extensionality is added, the theory of inductive cardinals and ordinals survives, but the theory of infinite Dedekindian and well-ordered series largely collapses, so that irrationals and real numbers generally can no longer be adequately dealt with.

    Challenges and Criticisms

    • Circumventing the Axiom Even without the axiom of reducibility, it is possible to prove mathematical induction.
    • Quine’s View Quine argued that the axiom of reducibility cancels out the ramification of types, undermining the distinctive feature of the logic.
    • Wittgenstein’s Challenge Wittgenstein challenges the axiom of reducibility as certainly not a principle of logic.

    Responses to the Abandonment

    • Chwistek Leon Chwistek took the “heroic course” of dispensing with the axiom without adopting any substitute.
    • Ramsey Ramsey agrees with rejecting the axiom of reducibility, on the ground that it is not a logical truth, and because it can be circumvented in practice.

    In conclusion, the axiom of reducibility was a contentious point in Principia Mathematica. Its abandonment in the second edition, while intended as an improvement, raised significant challenges and led to substantial revisions and alternative approaches in the foundations of mathematics and logic.

    Principia Mathematica: Theory of Types

    The theory of types is a pivotal concept within Principia Mathematica (PM), significantly influencing its structure and revisions across editions. It addresses logical paradoxes and imposes a hierarchy on functions and propositions to avoid self-reference and ensure logical consistency.

    Core Principles and Development

    • Vicious Circle Principle: The theory of types is rooted in the “vicious circle principle,” stating that “whatever involves all of a collection must not be one of the collection”. This principle aims to prevent logical paradoxes arising from self-reference.
    • Hierarchy of Functions and Propositions: To adhere to the vicious circle principle, the theory introduces a hierarchy of functions and propositions, categorized into different “types”. This hierarchy ensures that a function cannot apply to itself or to any entity that presupposes it, thereby avoiding logical contradictions.
    • Orders of Functions: Functions are further distinguished by “order,” reflecting the complexity of their definitions in terms of quantification over other functions. A function defined by quantifying over a collection of functions must be of a higher order than the functions within that collection.

    Simple vs. Ramified Theory of Types

    • Ramified Theory: The original theory in the first edition of PM is a “ramified” theory of types, which accounts for both the types of arguments that functions can take and the quantifiers used in the definitions of those functions.
    • Simple Theory: Later, a move toward a “simple” theory of types emerged, particularly with Ramsey’s proposals, where the focus is primarily on the types of arguments, simplifying the hierarchy.
    • Extensionality: The move towards the simple theory of types is connected with the concept of extensionality. With extensionality, functions that are true for the same arguments are identified.

    Technical Aspects and Notation

    • Type Symbols: Various notations have been proposed to symbolize types, with Alonzo Church’s “r-types” being the most fine-grained, capturing distinctions of order and level.
    • ι represents the r-type for an individual.
    • (τ1, . . . , τm)/n denotes the r-type of a propositional function of level n, with arguments of types τ1, . . . , τm.
    • ()/n represents the r-type of a proposition of level n.
    • Variables and Quantification: In PM, statements of theorems use real (free) variables, and bound variables are interpreted within specific logical types to adhere to the vicious circle principle.

    Axiom of Reducibility and Type Theory

    • Axiom of Reducibility Defined: The axiom of reducibility guarantees that for every function, there exists a co-extensive predicative function of the same type, which simplifies the system by allowing higher-order functions to be reduced to first-order ones.
    • Role in PM: The axiom ensures that for any complex function, there is a predicative function that is true for all the same arguments.
    • Criticisms and Abandonment: The axiom has been criticized for various reasons, including by Wittgenstein as not being a principle of logic. The second edition of PM considers abolishing the axiom.

    Classes and Type Theory

    • Classes as Functions: PM identifies classes with propositional functions. The expression x̂ψx denotes the class of things x such that ψx, mirroring modern notation {x : ψx}.
    • No-Classes Theory: The “no-classes” theory aims to eliminate talk of classes in favor of propositional functions, reducing all talk of classes to the theory of propositional functions.

    Challenges and Interpretations

    • Gödel’s Incompleteness Theorem: Gödel’s incompleteness theorem and related concepts challenge the completeness and consistency of formal systems, including those based on type theory.
    • BMT (Appendix B Modified Theory of Types): Gödel identified a new theory of types in Appendix B, known as BMT, which allows any propositional function to take arguments of appropriate type, regardless of the quantifiers used in defining the function.
    • Ramsey’s Modification: Ramsey proposed rs-types, combining simple types with orders for predicates, offering an alternative revision to the ramified theory of types.

    Revisions and Alternative Approaches

    • Chwistek’s Constructive Types: Chwistek advocated for a “theory of constructive types” without the axiom of reducibility, emphasizing that all functions should be definable or constructible.
    • Weyl’s Predicative Analysis: Weyl presented a version of predicative analysis, developing real numbers without invoking vicious circle fallacies, thereby constructing a “predicative” analysis.

    In summary, the theory of types in Principia Mathematica is a complex framework designed to resolve logical paradoxes by imposing a hierarchical structure on functions and propositions. The evolution of this theory, from the ramified approach to simpler, extensional versions, reflects ongoing efforts to refine the foundations of logic and mathematics. The debates surrounding the axiom of reducibility and alternative type systems highlight the intricate challenges in constructing a consistent and comprehensive logical framework.

    Principia Mathematica: Propositional Functions

    Propositional functions are a crucial element in Principia Mathematica (PM), serving as a foundation for both logic and mathematics. They play a significant role in the development of the theory of types and the resolution of logical paradoxes.

    Definition and Nature

    • A propositional function is an expression containing a free variable such that when the variable is replaced by an allowable value, the expression becomes a proposition. For example, ‘x is hurt’ is a propositional function.
    • Expressions for propositional functions, such as ‘x̂ is a natural number’, are distinct from mathematical functions like ‘sin x’. The latter are referred to as “descriptive functions”.
    • Expressions using the circumflex notation, such as φx̂, appear mainly in the introductory material of PM and not in the technical sections, except in sections on class theory.

    Role and Significance

    • Building Blocks of Propositions: Propositional functions serve as a basis for constructing propositions by assigning allowable values to the free variable. The propositions resulting from the formula by assigning allowable values to the free variable ‘x’ are said to be the various “ambiguous values” of the function.
    • Foundation for Classes and Relations: Propositional functions are closely linked to the theory of classes. The expression x̂ψx represents the class of things x such that ψx. In PM’s type theory, the class x̂φx has the same logical type as the function φx̂.
    • Distinguishing Universals from Propositional Functions: Universals are constituents of judgments, while propositional functions are not ultimate constituents of propositions.

    Technical Aspects and Notation

    • Variables:p, q, r, etc., are propositional variables.
    • a, b, c, etc., are individual constants denoting individuals of the lowest type, mainly in the introductions to PM.
    • R, S, T, etc., represent relations.
    • Circumflex (^): When placed over a variable in an open formula (e.g., φx̂), it results in a term for a propositional function.
    • Exclamation Mark (!): Indicates that the function is predicative, meaning it is of the lowest order compatible with its argument. A predicative function φ!x is one which is of the lowest order compatible with its having that argument.

    Type Theory and Propositional Functions

    • Simple Types: Simple types classify propositional functions based on the types of their arguments.
    • If ‘Socrates’ is of type ι, the function ‘x̂ is mortal’ is of type (ι).
    • A relation like ‘x̂ is father of ŷ’ would be of simple type (ι, ι).
    • Ramified Theory: The ramified theory of types in PM tracks both the arguments of functions and the quantifiers used in their definitions.
    • Levels: Functions have levels, and a function defined in terms of quantification over functions of a given level must be of a higher level. For example, if ‘x̂ is brave’ is of type (ι)/1, then ‘x̂ has all the qualities that make a great general’ might be of type (ι)/2 because it involves quantification over functions like ‘x̂ is brave’.

    Axiom of Reducibility and Predicative Functions

    • Predicative Functions: The exclamation mark ‘!’ indicates that the function is predicative, i.e., of the lowest order that can apply to its arguments.
    • Axiom of Reducibility: The axiom asserts that for any function, there exists a co-extensive predicative function. This axiom was debated and ultimately abandoned in later editions of PM.
    • Impact of Abandonment: The abandonment of the axiom of reducibility and the emphasis on extensionality led to revisions in how propositional functions were treated, particularly concerning identity and higher-order functions.

    Extensionality and Truth-Functionality

    • Extensionality: PM’s second edition emphasizes that functions of propositions are always truth-functions and that a function can only occur in a proposition through its values.
    • Truth-Functionality: The argument for extensionality suggests that if a function occurs in a proposition only through its values and these values are truth-functional, then co-extensive functions will be identical.

    Classes and Propositional Functions

    • Contextual Definition: The use of contextual definitions allows for the elimination of class terms in favor of propositional functions. For instance, the expression x ε ẑ(ψz) can be interpreted by eliminating the class term using contextual definitions, yielding x ε ẑ(ψz) . ≡ . ψx.
    • Relations in Extension: From section ∗21 onward, italic capital letters (e.g., R, S, T) are reserved for relations in extension, where xRy denotes that the relation R holds between x and y.

    In summary, propositional functions are fundamental to the logical structure of Principia Mathematica. They are used to construct propositions, define classes and relations, and address logical paradoxes through the theory of types. The treatment of propositional functions, particularly in relation to the axiom of reducibility and the principle of extensionality, reflects the evolving nature of logical and mathematical foundations explored in PM.

    Principia Mathematica: Mathematical Induction and its Logical Foundations

    Mathematical induction is a central topic in Principia Mathematica (PM), particularly concerning its logical foundations and its treatment within the theory of types. The discussion of mathematical induction involves its relation to logicist accounts of arithmetic, the challenges posed by the axiom of reducibility, and the attempts to provide a rigorous basis for inductive proofs.

    Importance and Logicist Foundations

    • Distinctive Method of Proof: Mathematical induction has historically been recognized as a distinctive method of proof in arithmetic.
    • Logicist Achievement: A key achievement of logicism, particularly by Frege, was to demonstrate that induction could be derived from logical truths and definitions alone.
    • Central to Arithmetic: Induction, along with the definition of numbers as classes of equinumerous classes, is fundamental to the logicist account of arithmetic.
    • By 1919, Russell presented induction as central to deriving mathematics from logic. All traditional pure mathematics, including analytic geometry, can be regarded as propositions about natural numbers.

    Principle of Mathematical Induction

    • Two-Part Proof: Proofs by induction involve two main parts:
    • Basis Step: Proving that the property holds for 0.
    • Induction Step: Assuming the property holds for an arbitrary number n (the inductive hypothesis) and then proving it holds for n+1.
    • General Form: The principle of induction appears in a general form for use with an arbitrary ancestral relation:
    • If x bears the ancestral of the relation R to y, and x possesses any R-hereditary property φ, then so does y.
    • Recipe for Proof: To prove that y has a property, show that x does, that x bears the ancestral of the R relation to y, and that the property is R-hereditary.

    Development in Principia Mathematica

    • Part II Focus: Part II of Principia Mathematica, titled “Prolegomena to cardinal arithmetic”, begins with identity and diversity relations.
    • Inductive Cardinals: Inductive cardinals (NC induct) are derived by starting with 0 and repeatedly adding 1.
    • Inductive Class: The inductive class (Cls induct) is one way of thinking about finite classes. Defined this way, inductive cardinals are equinumerous classes of individuals produced by adding one thing at a time to the empty class. The sum or union of all those cardinals will contain all the finite classes.
    • Peano Axioms: With 0 defined as a class, “natural number” defined as NξC induct, and the successor relation as +c1, Whitehead and Russell define and prove the Peano axioms as theorems of their system.
    • Peano’s Axioms and Induction: The principle of induction for natural numbers follows as a special case of induction on arbitrary ancestrals.

    Appendix B and Challenges to Reducibility

    • Limited Induction: Appendix B aimed to demonstrate that a limited form of mathematical induction could be derived even without the axiom of reducibility.
    • Technical Flaw: Gödel identified a technical flaw in the proof within Appendix B. Myhill later proved that the project of Appendix B is impossible in principle.
    • Generality: Appendix B seeks the general result that if y inherits all the level 5 R-hereditary properties of x, then it inherits any R-hereditary properties of x of whatever level. *The most important case of Appendix B shows that any induction on the natural numbers can be carried out with respect to properties of a fixed order, though this is tucked away in the middle of a series of theorems.

    Formalization and Theorems

    • Theorem ∗89·12: A key theorem in Appendix B states that every inductive or finite class of order 3 is identical with some class of order 2. The three-line proof suggests that this holds because of the level of the operation of adding one individual y to a class η, yielding η ∪ ι‘y.
    • Intervals: Intervals are also defined using descendants and ancestors, where the interval from x to y is defined in terms of the descendants of x and the ancestors of y.

    Myhill’s Challenge

    • Undefinability: Myhill argued that the proofs in Appendix B could not have succeeded, citing a generalization of a key result applying to one-many relations as well as many-one relations.
    • Non-Standard Models: Myhill’s argument uses model-theoretic arguments and “non-standard models” of arithmetic, which introduce non-standard numbers.
    • Limitations: Myhill proves that there are instances of induction of a level higher than any given level k which does not follow for properties of levels less than k.

    Gödel’s Critique

    • Mistake Identified: Gödel pointed out a mistake in the proof of ∗89·16, related to applying induction to a property of β involving α.
    • Unsolved Question: Gödel stated that the question of whether the theory of integers can be obtained on the basis of the ramified hierarchy must be considered as unsolved.

    Revised Approaches and Interpretations

    • Davoren and Hazen (1991): This study hints at a liberalization of RTT, allowing propositional functions to hold arguments of appropriate (simple) type but arbitrary order while still maintaining restrictions on the orders of quantified variables in the definition of a propositional function.
    • Wang’s Suggestion: Wang suggests that higher-order induction could prove the consistency of the system with lower-order induction and eliminate more non-standard numbers.
    • Royse’s Development: Royse showed how a truth predicate could be defined for a system of predicative arithmetic of a lower order within a system of higher order, following the model of Tarski.

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Cloud Migration: Convert Clients to Xero

    Cloud Migration: Convert Clients to Xero

    The webinar, “Cloud Migration: Tips to Convert Clients to Xero,” presents strategies for accounting professionals to effectively transition clients to cloud-based accounting. Daniel Hustler, a Xero expert, shares insights on positioning the cloud’s value, delivering impactful product demos, and handling common client objections. Key topics include understanding the client’s needs, demonstrating how Xero addresses those pain points through a customized workflow, and emphasizing the benefits for business owners like improved visibility and faster payments. The aim is to empower advisors to confidently guide clients toward cloud adoption, showcasing how it streamlines operations, enhances collaboration, and ultimately contributes to business growth.

    Cloud Migration to Zero: A Comprehensive Study Guide

    Quiz

    Answer the following questions in 2-3 sentences each.

    1. What are the three key considerations to keep in mind when positioning the cloud and Zero to clients?
    2. Why is it important to know your audience when trying to move them to the cloud?
    3. What are three benefits of Zero that commonly resonate with business owners?
    4. What is the “Rule of Three” and how can it be used in client communication?
    5. Why is a product demo a powerful tool for convincing clients to switch to Zero?
    6. What is a client needs analysis, and how can it help you prepare for a Zero demo?
    7. Describe one workflow that can be showcased during a Zero demo.
    8. What are the key elements of the profit and loss report in Zero?
    9. Name three common objections clients have to moving to the cloud.
    10. How can offering Zero training via Zoom webinars help clients adapt to the new software?

    Quiz Answer Key

    1. Knowing your audience, knowing the benefits Zero can bring, and communicating those benefits effectively. It is important to understand the specific needs and pain points of your clients, and you must be able to articulate the advantages of using Zero in a way that resonates with them. Effective communication can make all the difference in whether a client is convinced to switch.
    2. Because understanding their specific needs and technological capabilities is essential. Some clients might be enthusiastic about online solutions, while others might be more resistant or unfamiliar with them. Tailoring your approach to their level of understanding and comfort will increase the likelihood of a successful transition to the cloud.
    3. Business visibility, collaboration, getting paid faster, saving time, and staying compliant. These benefits address common concerns of business owners, such as gaining clarity over their business performance, working more effectively with their advisors, improving cash flow, reducing administrative burden, and avoiding issues with tax authorities.
    4. The Rule of Three is a persuasive communication technique of structuring content in groups of three. Because the human mind remembers in threes, speaking in threes is more memorable. It can be used to frame benefits, features, or reasons for switching to Zero to enhance memorability and impact.
    5. It allows clients to see the benefits of Zero firsthand and how it addresses their specific pain points. A demo can illustrate how the software works in a real-world scenario, making the value proposition more tangible and convincing than simply describing its features.
    6. A client needs analysis is a document of your client’s pain points and goals. It helps you tailor your Zero demo to focus on the features that will be most relevant and impactful for that specific client, ensuring that the demonstration resonates with their individual needs.
    7. A common and impactful workflow is going from the dashboard to creating invoices, managing bills, reconciling bank transactions, and generating reports. Showing how these tasks flow seamlessly within Zero highlights the efficiency and time-saving benefits of the software.
    8. Income, expenses, and net profit or loss. This report gives a quick snapshot of a client’s income and expenses.
    9. Competition with other software, cost concerns, difficulty learning a new system, unstable internet connections and fear of change are all common objections. These concerns often stem from uncertainty about the value proposition, fear of disruption, or technical limitations.
    10. Zoom webinars allow accountants and bookkeepers to provide scalable training to multiple clients simultaneously. They can address common questions and concerns, showcase Zero’s features, and provide ongoing support, helping clients become more comfortable and confident in using the software.

    Essay Questions

    1. Discuss the importance of tailoring your sales pitch to the individual client, as described in the source material. What are the risks of using a one-size-fits-all approach? Provide examples of how you can adjust your communication based on the client’s needs and concerns.
    2. Evaluate the effectiveness of using a product demo as a sales tool for Zero. What are the key elements of a successful demo? How can you ensure that the demo resonates with the client and addresses their specific pain points?
    3. The source material emphasizes the value that accountants and bookkeepers bring to the table when working with clients on the cloud. Discuss the unique role of the advisor in helping clients transition to Zero. How can you leverage your expertise to support clients and ensure a successful implementation?
    4. Analyze the common objections that clients have to moving to the cloud. Develop a strategy for addressing these objections and overcoming client resistance. How can you proactively anticipate and mitigate potential concerns?
    5. Explain the importance of upskilling and staying current with Zero’s features and capabilities. How can continuous learning benefit both you and your clients? What are some strategies for staying informed and expanding your knowledge of Zero?

    Glossary of Key Terms

    • Cloud Accounting: Accounting software and practices that utilize remote servers to perform accounting tasks. Data is stored on the Internet rather than on a hard drive.
    • Zero: Cloud-based accounting software designed for small businesses and their advisors.
    • Positioning: How a product or service is perceived in the minds of the target market.
    • Client Needs Analysis: A detailed assessment of a client’s specific challenges, goals, and requirements.
    • Demo Company: A sample company within Zero pre-populated with fictitious data for demonstration purposes.
    • Workflow: The sequence of steps involved in a particular business process, such as invoicing or bank reconciliation.
    • Value-Based Pricing: Pricing strategy that focuses on the perceived value of the services provided to the client.
    • Flat Fee Pricing: A fixed price for a specific set of services, regardless of the time spent completing them.
    • Zero Partner Program: A program that offers benefits and resources to accountants and bookkeepers who use and promote Zero.
    • Invoice Reminders: Automated notifications sent to customers to remind them about overdue invoices.
    • Bank Reconciliation: The process of matching the balances in an entity’s accounting records for a cash account to the corresponding information on a bank statement.
    • Profit and Loss Report: A financial statement that summarizes the revenues, costs, and expenses incurred during a specific period of time.
    • Aged Receivables Report: A report that lists outstanding invoices by customer and aging category.
    • Hubdoc: A document management tool that integrates with Zero to automate bill and receipt capture.
    • Open-ended Questions: Questions that require more than a “yes” or “no” answer, designed to encourage clients to share detailed information.
    • Legacy Software: Existing software or systems that are outdated but still in use.
    • Value-Based Pricing: A pricing strategy where the price is determined by the perceived value the client receives from the product or service.
    • ISOC Compliant: Meeting the standards set by the International Organization for Standardization (ISO).

    Cloud Migration: Zero Partner Strategies

    Okay, here’s a briefing document summarizing the key themes and ideas from the provided transcript of the “Cloud Migration: Tips to Convert Clients to Zero” webinar.

    Briefing Document: Cloud Migration Strategies for Zero Partners

    Source: Excerpts from “Cloud Migration: Tips to Convert Clients to Zero” webinar transcript.

    Date: (Implied – likely recent based on content)

    Author: Daniel Hustler, Education at Zero (Asia region)

    Purpose: To provide Zero partners with practical tips and strategies to successfully migrate clients to the cloud, specifically onto the Zero platform.

    Key Themes and Ideas:

    The webinar focuses on three core areas to help partners move clients to the cloud and to Zero:

    1. Positioning the Cloud (and Zero) in the Client’s Mind:
    • Know Your Audience: Understanding your target client base is crucial. Consider targeting clients enthusiastic about online solutions, new businesses (without legacy software), and businesses undergoing significant events (expansion, retirement). “If you are new to moving clients to the cloud then you may want to consider starting with clients that are enthusiastic about online Solutions.”
    • Know the Benefits (and Communicate Effectively): Focus on benefits that resonate with business owners, not just accountants. Highlight business visibility, collaboration (the value of the advisor), faster payment/improved cash flow, time savings, and staying compliant. “Instead focus on the benefits that resonates with them and consider bringing those into the conversation.” The rule of three is introduced as a mnemonic device to make the benefits more memorable for the client, “as humans we remember in threes so if you speak in threes then it’s going to be much more memorable for the Eld audence not two not four but three”.
    • Communication is Key: Ask open-ended questions to understand client pain points and goals. Avoid complex accounting jargon. Consider the setting of the conversation. Don’t make it just a sales pitch; focus on understanding the client’s needs. “Things like what are your pain points what are your goals where are you spending most of your time what is holding you back you can get a large list of open-ended questions from the likes of chat GPT now this will help your client do most of the talking from there you can understand their pain points and understand what type of benefits you might want to highlight.”
    1. Delivering an Effective Zero Product Demo:
    • Demo Power: Hustler emphasizes the power of a well-executed demo. “They say A picture speaks a thousand words and I would say a demo speaks a million words because it allows your audience to see the benefits that they receive with their own eyes.”
    • Client Needs Analysis: Before the demo, create a “client needs analysis” based on conversations to identify specific pain points and goals. This informs which Zero features to highlight.
    • Demo Workflow: Use a workflow-based demo that simulates the client’s daily use of Zero.
    • Value-Driven: Continuously refer back to the value the client will receive and how the software addresses their specific pain points.
    • Analogy: Use analogy’s for clients to relate to current activities like having an invoices owing to me graph, so that they don’t need to search through files.
    • Customization: Every user on Zero has access to their own demo company that has fictitious data, so customize to show clients the features that fit them.
    • Relationship: Be mindful of the advisor/client relationship, “don’t make it a sales pitch.”
    • Key Demo Elements (Illustrated in the Transcript):Dashboard: Shows a snapshot of the business’s financial position (bank balance, cash flow, invoices, bills).
    • Sales Overview: Helps manage and follow up on invoices, speeding up payments.
    • Purchases Overview: Simplifies bill management and payment scheduling.
    • Bank Reconciliation: Explains the process and how it provides financial insight.
    • Profit and Loss Report: Provides a snapshot of income, expenses, and net profit/loss, with customization options (e.g., percentage of trading income, profitability by store).
    • Aged Receivables Report: Helps track outstanding invoices, with added contact information for easier follow-up.
    1. Handling Common Questions and Objections:
    • Competition: Avoid feature comparisons. Focus on Zero’s value and the advisor’s value. Emphasize time savings.
    • Cost: Consider value-based pricing/flat fees to bundle Zero into the advisor’s service. Demonstrate the value proposition clearly, showing how Zero resolves pain points and saves time. Ensure clients are on the right Zero plan for their needs.
    • Difficulty/Fear of Change: Reassure clients with support and training, highlight available Zero resources (courses, help). If doing this with multiple clients at once use Zoom.
    • Unstable Internet: Acknowledge the issue, but focus on the value that can be received.
    • Trusting Cloud: Zero adheres to strong security standards such as isoc compliance and major financial institutions use it as well.

    Important Quotes:

    • “Ultimately positioning zero and your practice correctly could be the difference between moving clients to the cloud and getting new clients versus not moving clients at all.”
    • “Consider the right industry for you as well you might have a wealth of Soul Trader clients you might be familiar with the apps in those Industries generally those clients won’t be super complex so in that case you can set them up with the right tools from the GetGo rather than having to investigate a whole lot of apps that you might not be familiar with.”
    • “Collaboration never understand underestimate the value that you can bring as an advisor and how much your clients value that if you can collaborate together through the cloud”
    • “Again if you can showcase the value and the benefits and how it fixes their pain points through those conversations through the demo then already they’re probably quite often more likely wanting to switch anyway.”
    • “No plan and no action will lead to no results what steps can you take to help clients really realiz the benefit of the cloud is it about bringing in some tips around product positioning is it trying out a product demo or is it taking note of some of those key ideas around objection and question handling that we discussed in this webinar.”

    Actionable Takeaways for Zero Partners:

    • Develop a clear understanding of your target client base and tailor your messaging accordingly.
    • Master the art of demonstrating Zero, focusing on the value proposition for business owners, not just accountants.
    • Anticipate and prepare for common client objections, focusing on the value and support you can provide.
    • Leverage the Zero partner program benefits (advisor directory, training, account manager).
    • Take proactive steps to implement these strategies to achieve tangible results.

    This briefing document provides a comprehensive overview of the webinar’s content, equipping Zero partners with the knowledge and tools to effectively migrate clients to the cloud.

    Zero Adoption: Cloud Migration Strategies for Accounting Clients

    FAQ: Cloud Migration and Zero Adoption for Accounting Clients

    1. Why is it important to position the cloud, and specifically Zero, effectively to clients?

    Positioning Zero correctly is critical because it determines whether clients understand the value proposition and are motivated to switch to the cloud. Effective positioning helps them see how Zero addresses their specific business challenges and improves their operations. Without clear communication of these benefits, clients may be reluctant to change, even if Zero could significantly improve their financial processes. Correct positioning may also attract new clients to your accounting practice.

    2. What are the key considerations when positioning Zero to a potential client?

    The three main considerations are:

    • Know your audience: Understand their current tech adoption level, business needs, and pain points.
    • Know the benefits: Identify the advantages Zero offers that are most relevant and valuable to the client, like improved cash flow, time savings, or compliance. Remember benefits for you as an advisor may not be the same as the client.
    • Communicate effectively: Convey these benefits clearly and simply, avoiding technical jargon and relating them directly to the client’s challenges.

    3. What are some good starting points when targeting clients for cloud migration?

    Begin with clients who are already enthusiastic about online solutions and technology, as they are more likely to embrace cloud-based accounting. Also, consider targeting new businesses that don’t have existing legacy software. Evaluate clients based on events taking place in their business as well as your familiarity with the apps used in their industry. It is helpful to pilot with a small number of easier clients to refine your process.

    4. What are some common benefits of Zero that resonate with business owners?

    Business owners often value:

    • Business Visibility: Clarity and real-time insight into their financial performance.
    • Collaboration: Enhanced ability to work closely with their advisor.
    • Faster Payments & Improved Cash Flow: Tools and processes to accelerate invoicing and payment collection.
    • Time Savings: Reduced administrative burden and more time for core business activities.
    • Staying Compliant: Easier adherence to tax and regulatory requirements.

    5. How can I deliver a more effective product demo of Zero?

    To make your demos more impactful:

    • Create a client needs analysis.
    • Tailor the demo to the client’s specific needs and pain points, focusing on relevant features.
    • Use real-life scenarios and workflows to demonstrate how Zero integrates into their daily operations.
    • Continually highlight the value and benefits Zero provides, not just the features.
    • Don’t make it a sales pitch.

    6. What are some common client objections to switching to Zero, and how can I address them?

    Common objections include:

    • Competition: Instead of comparing feature by feature, emphasize Zero’s value and the advisor’s role in maximizing its benefits. Focus on the advisor being able to bring more value as they are also working on the cloud.
    • Cost: Highlight the value and return on investment (ROI) Zero provides through time savings, improved efficiency, and better financial insights. Also make sure they are on the right pricing plan for their use case.
    • Difficulty: Reassure clients that you will provide ongoing support and training, showcasing available Zero resources and making the transition smooth.
    • Lack of trust in the cloud: Reassure clients that Zero has robust security standards.

    7. How important is it to avoid accounting jargon when speaking with clients?

    Avoiding complex jargon is crucial for effective communication. Clients may not understand accounting terminology, which can create confusion and diminish the impact of your message. Using plain language ensures they understand the benefits of Zero and are more likely to embrace the solution.

    8. How can I get clients to take action and adopt Zero?

    Encourage action by having a personal conversation with each client where you focus on their business goals. Follow this by setting up a meeting to discuss their goals and see if the cloud is a fit. In order to scale quickly, consider setting up Zoom webinars. The most important thing is to highlight the benefits they will see.

    Cloud Migration Strategies for Accounting Clients

    Cloud migration involves moving clients to cloud-based accounting systems like Zero, which can be achieved by understanding the client’s needs, demonstrating the value of the cloud, and addressing their concerns.

    Key aspects of cloud migration:

    • Positioning the cloud Consider how you want your clients to perceive Zero and how it fits into their business.
    • Knowing your audience Identify clients who are enthusiastic about online solutions or new businesses without legacy systems.
    • Highlighting benefits Focus on benefits that resonate with business owners, such as business visibility, collaboration, faster payments, time savings, and compliance.
    • Communication Communicate value effectively by asking open-ended questions to understand client pain points, avoiding complex jargon, and understanding the setting of your conversations.

    Product Demo Tips:

    • A product demo is a powerful tool that allows clients to see the benefits with their own eyes.
    • Client needs analysis Use insights about client pain points and goals to create a client needs analysis.
    • Demo the benefits Focus on features that address the client’s specific needs, such as zero analysis reports or direct payments.
    • Highlight value Continually refer back to the value and how it addresses their pain points throughout the demo.
    • Workflows Use workflows to illustrate how they would use Zero day-to-day.

    Handling Objections:

    • Competition Avoid feature battles and focus on the value that Zero and the advisor can bring.
    • Cost Adopt value-based pricing, focus on time savings, and ensure clients are on the right plan.
    • Difficulty Reassure clients with support, showcase Zero resources, and highlight the value and benefits.
    • Lack of trust in cloud Emphasize security standards, the reliability of Zero, and the prevalence of cloud technology in everyday life.

    Xero: Benefits for Business Owners

    The benefits of Zero relate to how it can improve business operations and address common pain points for business owners. The benefits to an advisor may not be the same as to a business owner. Therefore, it is important to focus on the benefits that resonate with the business owner.

    Benefits of Zero:

    • Business visibility Zero can provide clarity over their business performance.
    • Collaboration Clients value the collaboration they can have with their advisor through the cloud.
    • Faster Payment Using Zero can lead to getting paid faster, improving cash flow. Many clients are struggling with cash flow, so this is an important benefit.
    • Saving Time Zero can reduce the amount of time spent on administrative tasks. Clients can spend more time on other things.
    • Staying Compliant Working on the cloud makes it easier to stay compliant with tax authorities.

    Effective Product Demonstrations: A Guide

    A product demo is a powerful tool that allows clients to see the benefits of a product or service with their own eyes. A demo speaks a million words because it allows the audience to see the benefits that they receive.

    Key steps for an effective product demo:

    • Client Needs Analysis Before the demo, insights about a client’s pain points and goals can be used to create a client needs analysis. For example, a client may not know how their business is performing, struggle to get paid on time, spend too much time tracing invoices, have paper bills piling up, and be unsure if their retail shops are profitable.
    • Demonstration of benefits When performing the demo, continually refer back to the value that the client will receive. Use a workflow that illustrates how they would use Zero day-to-day, referring back to the benefits, the value, and how it addresses their pain points. Customize the demo to suit the client’s needs by showing them features that solve their pain points and that they will use. If they aren’t financially savvy, demonstrate something basic like a profit and loss report and how it gives them insight.
    • Demo example When giving the demo, act as if you are the accountant presenting to the client. Start with the dashboard, which gives a quick snapshot of the business. Show how to raise and follow up on sales invoices in Zero. Show how to remove paper bills by checking out the purchases overview screen. Explain bank reconciliation. Show financial reports, such as the profit and loss report.
    • Highlight value Throughout the demo, highlight the value that the client will receive and how it makes things better for them. Use analogies to highlight the value.
    • Avoid a sales pitch The demo should focus on whether or not Zero will be beneficial to the client.
    • Address questions After the demo, clients may have questions and objections, which is normal when switching systems.

    Addressing Client Objections During Cloud Migration

    When helping clients migrate to the cloud, it’s important to address common objections they may have. Here’s how to handle some typical concerns:

    • Competition Instead of comparing Zero to other products feature by feature, emphasize the unique value Zero brings, along with the added value of having you as their advisor on the cloud. Demonstrating the software can address the objection by showing its value.
    • Cost Some accounting partners use value-based or flat fee pricing that includes the cost of Zero in their service. If passing the cost directly to clients, highlight the value they’ll receive. Show them how Zero resolves their specific pain points and the value that both Zero and you as an advisor provide when working together on the cloud. Focus on time savings from automating manual tasks. Also, make sure they are on the right Zero plan to get the best return on investment. As consumers, clients ask themselves if they can afford a service and if the value is worth the money, so showing value is key.
    • Difficulty Reassure clients that you’ll support them throughout the transition. Point them to available Zero resources and support. If you’re migrating multiple clients, consider using Zoom webinars for training. By showcasing the value and benefits of Zero and how it addresses their pain points, clients may be more willing to switch.
    • Lack of trust in the cloud You might mention that Zero adheres to strong security standards and is ISO compliant. Also, note that major banks trust and connect with Zero, and it’s a publicly listed company on the Australian Securities Exchange. Remind them that they likely already use cloud services like online banking and social media in their daily lives.
    • Unstable internet connection Discuss whether the value they receive from Zero is worth the unstable connection. Explore options like portable Wi-Fi connections.
    • Fearful of change Highlight the benefits of the change.

    Cloud Migration: Client Needs Analysis and Effective Communication

    When positioning cloud migration to clients, it is important to perform a client needs analysis. Consider your audience, highlight the benefits, and communicate effectively.

    Knowing the audience:

    • Consider clients that are enthusiastic about online solutions or new businesses without legacy systems.
    • Consider timing, such as during tax season or expansion.

    Highlighting the benefits:

    • Focus on the benefits that resonate with business owners such as business visibility, collaboration, faster payments, time savings, and compliance.
    • Business visibility can provide clarity over business performance.
    • Clients value the collaboration with their advisor through the cloud.
    • Faster payment can improve cashflow.
    • Saving time reduces time on admin tasks.
    • Staying compliant helps businesses avoid issues with tax authorities.

    Communicating effectively:

    • Ask open-ended questions to understand client pain points, and avoid complex jargon.
    • Determine whether it is a meeting to discuss business goals.
    • Take note of the client’s pain points and needs.
    • It should not be a sales pitch, but about what is best for the client.

    For example, a client might not understand how their business is performing, struggle to get paid on time, spend too much time tracing invoices, have paper bills piling up, and be unsure if their retail shops are profitable. Based on the client’s pain points, highlight the value and benefits of cloud migration.

    Convincing clients to switch to the cloud with ease Xero

    are you looking to get more clients onto the cloud and more clients onto zero if so then you’ve come to the right place good morning good afternoon good evening everyone from wherever you’re joining in the world and welcome to this webinar convincing clients to switch to the cloud my name is Daniel Hustler and I look after education here at zero for the Asia region um now in terms of my background my background is in accounting systems it’s also in sales as well so I do have a sales background where I used to sell and onboard businesses onto the zero platform and in my current role I deliver demos and do lots of presentations to people to Showcase Zero’s value um and make sure that they understand how to use it and love our products now the purpose of this webinar today is to give you some tips and tricks that I’ve learned that you might want to try to get more clients onto the cloud ultimately get more clients coming through your door and ultimately growing your practice we do also have a handy takeaway guide so this covers some of the things we’ll be talking about today my colleague Gracie will post a link to that in the chat box very soon alternatively feel free to pull out your phone scan that QR code and we’ll also send you through that takeaway guide in the chat box so what’s on the agenda for today over the next 45 minutes we’re going to cover some tips and tricks that you could use to get more clients onto the cloud and just get them over the line now this will include how you position the cloud in your client’s mind so that way they understand the value the second is that you’ll learn how to demo zero like a pro in my career I’ve performed demos to quite possibly hundreds of thousands of people so I’ll share a few tips and tricks in this space which you can use to get your client over the line they do say an image speaks a thousand words I would say a demo speaks a million words and it can be a really really effective tool to have in your pocket and finally how to handle some of the common questions and objections that we see I do like to customize my sessions according to what the audience are interested in so I’m going to launch a quick poll I’m Keen to understand what you’re most interested in hearing about today it is multiple choice are you most interested in learning about how to position zero in the cloud so the client understands the benefits do you want to deliver an effective product demo do you want to know how to handle some of those common questions and objections but if there’s something else put it in the Q&A box and I’ll do my best to answer it throughout this webinar at some stage so if there’s anything you’re particularly interested in feel free to leave it in that Q&A box now I’ll just leave that running for a couple more seconds I’ll end it in three five 5 4 3 2 1 I’ll end that poll I’ll share the results and it looks like we’ve got quite an even split so 40 roughly 40% want to know how to position zero and its benefits more than half want to know how to deliver an effective product demo and close to 60% are really Keen to understand some ways on how to handle those objections and questions um and we do have a answer in the Q&A which is around improving the ability and understanding of zero and the benefits so don’t worry we will cover that in the webinar as well so thank you now positioning the cloud positioning a product or service is ultimately about deciding how you want your target market to to think about that product or service in this case it’s how do you want zero to sit in the minds of your clients do they understand the value and you might even want to think about how you position yourself in the minds of your clients as an advisor and when you’re positioning say for example zero the there are a few things to keep in mind the first is to know your audience who are you looking to move to the cloud or who are you speaking to you should also know the benefits and the value that you can bring to the table with zero and on top of that you do need to know how to communicate that value effectively and it’s really important not to ignore any of these considerations for example you might know the value and benefit that zero can bring but if you can’t communicate it in a way that your client will understand and see that value then you might not actually end up with the result that you want ultimately positioning zero and your practice correctly could be the difference between moving clients to the cloud and getting new clients versus not moving clients at all um there is no one correct way for this but I’ll share a few tips and ideas in this space which you might want to consider so the first point was around the audience now if you are new to moving clients to the cloud then you may want to consider starting with clients that are enthusiastic about online Solutions those clients are already using a wealth of technology in their day-to-day lives so generally speaking they’re probably going to be easier to switch new businesses are also great because they don’t have Legacy software built-in so that way they can start fresh ultimately cons of the timing and events that are taking place as well are they in the middle middle of a very busy tax period or maybe are they in time of an expansion where they need realtime financials in which case that might be a really great opportunity um even those businesses which are looking to maybe retire soon and sell their business um moving them onto the cloud could be a good opportunity because that gives them real-time financials to sell their business at a higher price um consider the right industry for you as well you might have a wealth of Soul Trader clients you might be familiar with the apps in those Industries generally those clients won’t be super complex so in that case you can set them up with the right tools from the GetGo rather than having to investigate a whole lot of apps that you might not be familiar with and again if you are new to this pilot with a small number of clients and refine your process over time start with those easier clients before you move on to those more complex ones the second point we saw was the benefits knowing those benefits and this is important because the benefits to you as an advisor may not necessarily be the same to a business owner for example while one of the benefits that I absolutely love is power report customization a business owner may not necessarily resonate with that so instead focus on the benefits that resonates with them and consider bringing those into the conversation now what I found in my experience is that a lot of business owners and clients tend to resonate with some of the following so business visibility being able to get clarity over their business and where they’re at this is especially true for those businesses that currently don’t have that visibility and are currently managing their business based on what they see in their bank statement collaboration never understand underestimate the value that you can bring as an advisor and how much your clients value that if you can collaborate together through the cloud getting paid faster cash flow is top of mind for clients we know many clients are struggling with cash flow especially at the moment with those Rising costs so if you tell your clients that there is a way to help them get paid faster improve their cash flow then you’ve already got them hooked and for a lot of clients that’s probably going to resonate much more than say for example powerful report customization Saving Time people want to spend less time on admin and more time on doing the things that they love whether that be spending time with their friends and their family or maybe just developing a really great product or service for their own customers and also staying compliant I’m sure no one wants a letter from the local tax Authority telling them that they’ve being uncompliant so if they’re on the cloud and you can work with them closely together then this is a really easy way to solve this so have these benefits ready in in your pocket there are some benefit as well and when talking to the to to the benefit when talking to your clients about the benefits consider maybe using three of these and see how it works for you three is a very magic number and it’s me memorable in the human mind so I’m going to ask a question can I get it if you have heard of the rule of three before the rule of if you’ve heard of that then can you click the raise hand Link in the zoom toolbar so if you’ve heard of the rule of three so it looks like a very small number of us have heard of the rule of three basically it’s a persuasive technique used by public speakers writers and even politicians as humans we remember in threes so if you speak in threes then it’s going to be much more memorable for the Eld audence not two not four but three for example life liberty and the pursuit of happiness government of the People by the people for the people location location location stop look and listen friends Roman countrymen these are all used and threes these are examples of the rule of three and it becomes very memorable in the uh in the human’s mind but knowing the audience and kn knowing the value is one thing being able to communicate it is just as important if not more important you might have the greatest product or service in the world but if you don’t communicate it in a way where the audience will understand then it’s going to be a whole lot more difficult to get the outcome that you desire so again there’s no one way to get this right but I’ll share a few tips and ideas that you might want to consider in your client conversations and see how it goes some of these you might be aware of some of these you might not be aware of the first is to ask open-ended questions things like what are your pain points what are your goals where are you spending most of your time what is holding you back you can get a large list of open-ended questions from the likes of chat GPT now this will help your client do most of the talking from there you can understand their pain points and understand what type of benefits you might want to highlight again it’s helpful because you will understand those pain points one of the questions that I asked at a business owner at an event recently was what is keeping you awake late what is keeping you up late at night and what they said is they spent all of their evenings sending out invoices in Microsoft Word and they’re struggling to keep on top of payments um now the reality is is if they were aware of let’s say for example um the ability to send invoices through zero and the ability to accept those online payments um that would have removed that whole pain point and and they were much more receptive when they saw that zero had this particular ability to do that um I would also say avoid complex jargon where possible as we know um a lot of clients don’t understand accounting jargon and terminology things like accounts receivable accounts payable statement of comprehensive income if you add a lot of confusing terminology it’s probably going to take away the message that you the underlying message that you ultimately want to convey but of course it does depend on who you’re speaking to as well if you’re speaking to a financial controller they’ll probably want you to speak their language also consider the setting is it a meeting that you’ve set up with them to discuss their business goals or are you just passing by their office and you say hey move switch to the cloud and when you’re in the conversations take note of their pain points and their needs because that’s going to be very helpful for when you want to talk about the benefits or even when you want to perform a demo and finally really important here it’s not a sales pitch pitching a software can sometimes come across as impersonal so one idea would be to set up a meeting to understand their pain points and goals but again this just depends on your relationship with them and sometimes zero might not be the best option for them and if that’s the case that’s perfectly fine ultimately it’s all about what is best for the client which again may not always necessarily be zero and that’s perfectly fine now after we’ve had those conversations with a client we might even want to consider bringing in a product demo a demo is a very powerful tool to have in your pocket as I mentioned earlier they say A picture speaks a thousand words and I would say a demo speaks a million words because it allows your audience to see the benefits that they receive with their own eyes now again I like to keep my sessions fairly interactive so I’m going to launch a second poll you should see it appear on your screen shortly and don’t worry it’s completely Anonymous um how much do you agree with this statement I am confident in running an effective product demo do you strongly agree with the statement do you somewhat agree with the statement do you somewhat disagree with this statement or do you strongly disagree with this statement so we’ve got a lot of answers flowing through I’ll just give a few more seconds to answer that and I’ll end that poll in 5 4 3 2 1 and I’ll just go ahead and share those results so it looks like about 12% of us strongly agree more than half of us somewhat disagree so somewhat agree and then the rest are about 30% are falling and either somewhat disagree or strongly disagree um thanks for those answers so in my role it’s basically to perform product demos right so I’ve done product demos to hundreds of thousands of people and it’s very very powerful because it really showcases the value that a product or service can bring and it can make people want to use it so no matter what spectrum what what area you’re falling in on that poll hopefully I can bring in a few tips and tricks for you in the space now before we do jump into the demo in the earlier conversations with your clients you may have found a number of pain points or goals that they’re working towards especially if your asking those open-ended questions maybe you’re already familiar with with what their goals are and you can use these insights to create what is known as a client needs analysis in this case let’s assume I’ve had a conversation with a client and these were some of the things that I noted the client has no idea how their business is performing they feel like they’re Flying Blind they struggle to get paid on time they spend too much time tracing invoices they have paper bills piling up on their dining room table and they’re not sure if any of their retail shops are profitable this information is gold for my product demo because I know exactly what I need to show to wow them and get them excited about the product with that in mind um I would also let’s take a look at the time where running with that in mind let’s take actually I would love a couple of suggestions from the crowd actually so in the Q&A box I would love to hear what features you would consider demoing in this in the demo what features would you demo based on this client needs analysis so can I get a few ideas from the Q&A box what features would you demo zero analysis report direct payments hubdoc and online payments tracking reports these are all fantastic answers um p&l absolutely so these are all some really great features that you can highlight in your demos depending on what your client needs are so with that in mind I’ll actually jump into zero and take a look at how I would demo this based on on that information we have at hand when doing this demo I’ll assume that I’m the accountant and I’ll PR present back to you as if you are the client a few things to note here is that I’ll use a workflow which illustrates how they use zero daytoday and what you’ll see is that I’ll continue to refer back to the benefits the value and how it addresses their pain points so we’ll get started with the St demo and again I’ll assume that you’re the client the first screen that you’re going to log on land on when you log to zero is going to be the dashboard and the dashboard is to give is designed to give you a quick snapshot of your business the moment that you log in so that way you have all that information at your hand I can see how much money is in my bank account so that way I don’t need to log into internet banking in order to figure that out I can see how much I can see the total cash in and cash out graph because we do understand that for businesses cash flow is king um we do understand as well that sometimes you might not know where you’re spending your money so we you can also add any accounts you want to add through to the account watch list moving further below we have the invoices ow to you graph so rather than spending your mornings trying to figure out who owes you money and chasing those online chasing those payments I can see that information the moment that I log in and that way I can get paid faster further down below we also have the bills you need to pay graph so again we’ll be able to see all of the bills that we need to pay so that way they’re not piling up on that dining room table now we do understand as well that you spend a lot of time sending those sales invoices so let’s take a look at how you raise and follow up on sales invoices and zero the sales overview screen is designed to give you a snapshot of the status of all of your invoices we can see what’s in draft we can see what’s a waiting payment we can see what’s overdue likewise a list of the customers that owe us the most money so again this is already giving us some great Insight around who we need to follow up on to get paid faster and we don’t have to spend all our evening searching through trying to figure this out and if we want to raise an invoice it’s really easy in this case I just need to select the who which will be Bayside Club I can also see how much they currently owe maybe we might want to follow up with them on those unpaid invoices before we send another the due dates have autop populated for us as well as the invoice number now we know you spend a lot of time chasing those unpaid invoices so why not add a big pay now button to the top of your invoices which your customers can click to pay you now rather than spending all of my time typing up the same detail in Microsoft Word I can simply set up an item that I’ve sent up set up I can preview how the invoice will appear both in the web and the mobile version I can go ahead and email this off the email template will autop populate for me and what that means is I don’t need to continually draft up an email over and over again I can send this off and look how quick and easy that was compared to maybe the current process that we’re using of sending those invoices through Microsoft Word but of course we do also understand that um we want to follow up on those unpaid invoices so in this case we’re going to choose to turn on the friendly little robots in zero called invoice reminders now in this particular case I’m going to choose to automatically email a customer when whenever an invoice is seven days overdue 14 days overdue 21 days overdue or maybe I might want to send them a friendly little reminder when an invoice is due in 3 days not only is this going to help me get paid faster not only is this going to save time but it’s also going to avoid those awkward followup conversations which none of us like to have now in order to remove those paper bills piling up on the dining room table we can check out the purchases overview screen this has a full list of all of the bills that we need to pay and if we want to get these bills into zero it’s actually very easy every single organization in zero has their own unique bills email address if I was to if we were to forward a bill through to this email address then it will be automatically created in zero as we can see here with this forwarded bill now after we’ve approved those bills maybe we might want to schedule to pay a number of these bills say for example on the 20th of the following month so that way on December the 20th we know exactly how much money we need in the bank in order to pay those bills so that way we feel we won’t feel like we’re Flying Blind now we also understand as well that we really want to get oversight over how your business is performing and part of that is through a bank reconciliation which we’ll take a look at now now as a business owner you might not be familiar with what a bank reconciliation is so let me explain on the left hand side we have all of those Bank transactions in that happen in your bank account so your actual physical bank account and on the right hand side we need to say what it is for for example is it for sales is it for maybe some expenses is it for power is it for rent etc etc so let’s take a look at how it works we can see further down below that we have a statement line for Cooper Street Bakery the who in this case is for Cooper Street Bakery the account code which is what we spent it on was for a staff lunch so entertainment and I’ll just say staff lunch and this is what is going to give us insight about our business and on our financial reports but if I scroll further down down below you will be able to see that we have a number of another Cooper Street Bakery line zero has remembered the details from last time it’s autop populated that information for us all we have to do is click okay and that’s going to update our financial reports which you’ll see in a moment and also give you the type of visibility that um you may be wanting um on top of that since we’re both working on the cloud um if you’re not comfortable with this process I can even do that for you so that way we can make sure you’ve got the upto-date financials and can make those better decisions so let’s take a look at those financial reports the first report that I want to show is the profit and loss report now the profit and loss report is a very useful report at the moment I understand you don’t have this type of ins currently but to provide some context this report is designed to give you a quick snapshot of how much what what your income is where your expenses are going and what you’ve got left over which is your net profit or loss but with a few minor tweaks we can get some really great insight about your business one example is the percentage of trading income so this will help you to identify where your costs are high relative to your income for example I can see that for every dollar in sales about 177% is on motive will be on Motor Vehicle expenses so this is the type of insight you may not have had before um you also have I know that there are multiple stores as well that are being operated so if you want to see which stores are profitable then that’s very easy you’ll actually be able to get that side by side comparison simply listed here so again you’ll be able to see which of your stores are profitable and which are not now one other report that I thought I would show is going to be the age receivables report um again understand that quite a bit of time is being spent chasing those unpaid invoices and this report is helpful because it’s going to show what is owed to us and who owes us money a very helpful report but with a few minor changes we can get some really great Extra Value in this case I’ll add the phone the primary person the mobile column email and I’ll click update so at a glance we’ll have all the contact details that we need to follow up on those unpaid invoices we know who to contact in the business we know the person name we know their phone they know their email everything that we need to get paid faster is listed right here and on top of that is going to save us time around searching through the um your contact details trying to figure this information out so that’s it in the space of I would say about 12 minutes performed a very highly effective demo which continually refers back to the value that the client will receive now this particular demo is something that I’ve done many times before um it usually gets a lot of high ratings um so I thought I’d just share a few other tips for you when performing a demo again entirely up to you if you want to use these or not but these are just some ideas to see if it works for you the first is to consider the relationship between you and your client what is your relationship with them again you know don’t make it a sales pitch um but consider the relationship I would also suggest to use the demo company every single user on zero has access to their own demo company it’s prepopulated with fictitious data so use it where possible also customize the demo to suit your client needs ideally show them features that solve their pain points and features that they will use for example if they’re not someone that is financially Savvy or doesn’t do a lot of complex reporting then don’t show them complex reporting don’t show them things that will make us as accountants excited just show them something basic like a profit loss how it gives them insight and maybe that percentage of trading income if they don’t use multicurrency then we don’t need to show that either um value value value cons continually highlight the value that they receive and how it makes things better for them use analogies like I did which was as an example by using zero and chasing those unpaid inv and having that invoices o to to me graph I don’t need to spend my morning searching through different folders on my computer trying to figure it out those are tasks which they are already doing so the so continually highlighting the value will really resonate with them also workflows work well in this demo we went from the dashboard through to invoicing through to bills to bank wck and then through to reporting a workflow works really well because it shows how they will use it daytoday and in a way that makes sense so that way it doesn’t feel disjointed and finally it’s not a sales pitch ultimately we’re focused on whether or not zero will be beneficial to our client we don’t want to force something on them which they are not comfortable with we just want to highlight the benefits so what I’ve actually usually found is that off the back of a highly effective demo quite often you will see a client’s face light up in excitement sometimes though they may have some questions and also some objections which is again perfectly fine as well and perfectly reasonable if they’re making a switch on their systems but I do like to keep again my sessions interactive and I like to customize it to my audience I will cover off a few objections and ideas that I sometimes speak to accountants about but if you have anything that you want to know about then can you let me know in the Q&A box what type of objections do you sometimes get let me know in that Q&A box and I’ll try my best to um answer those questions along the way so um not making enough money at the moment yes um employee onboarding unstable internet connections absolutely price can’t justify the cost the competition yes um clients who don’t trust the cloud that’s that that’s one as well so um Gracie if you could just leave these questions open I’ll try to answer these along the way I have got three qu object qu three objections listed and I’ll talk through these ones first and hopefully this should answer some of them the first is competition here are a few ideas and suggestions in the space the first is to avoid feature battles don’t start comparing one product to another instead focus on the value that zero can bring and the value that you can bring as an advisor when you’re both working on the cloud quite often this particular objection can be addressed through the demo where they actually see the value that they receive if you start comparing features one product versus another it ends up in a spiral of taking away the underlying message you want to push which is the value um focus on the time savings as well and how doing things in the cloud is much more simple this one pops up it’s popped up a few times in the Q&A box which is cost individual circumstances ultimately depend how you address this so many of our accounting and bookkeeping Partners have adopted value based pricing and flat fee pricing that lets them bundle the cost of zero into their service that way the client has one bill to manage which includes their zero software and the cost isn’t an issue but if you are passing the cost directly to your clients then address this by going back to the value they receive as consumers when we look at a product or service there are probably a couple of questions we ask ourselves the first is can we afford it some clients won’t be able to afford it and that’s perfectly fine but if we can afford it then we ask is the value I receive worth the money we spend so if you can show the value that they receive very clearly through the conversations and through the demo then cost often doesn’t become an issue if you can wow them show them those amazing features and show them how it will make a difference in their life then you’ll sometimes be surprised at how um how cost doesn’t always become an issue show how it resolves those pain points and communicate the value that not just zero but you as an advisor can bring to the table when you’re both working on the cloud um again focus on the time savings as well they’re probably spending a lot of time and costs on manual tasks so if you focus on how zero can free up that time by automating those manual tasks then that in itself is actually a lot of value um another area which is sometimes overlooked is making sure that they’re on the right plan that will deliver them the best return on the right best return on investment sometimes clients aren’t put on the right plan which will deliver them the most value um and the third one is difficulty so I’m on my current system I don’t know how to use the new system so just reassure the clients tell them that you’ll be with them every single step of the way that you’ll support them through their Journey if you’re doing this with multiple clients at at once you know a really good way to support them is just even through Zoom webinars at zero we use zoom webinars to scale our training across our accounting partners and our small business clients in bulk so maybe this is even an opportunity for you but definitely reassure your clients they highly value you as their trusted advisor showcase to them the zero resources available as well show them zero show them the courses that are available show them how they can get help and support if needed and again just highlight the value and benefits that they receive if you can showcase the value and the benefits and how it fixes their pain points through those conversations through the demo then already they’re probably quite often more likely wanting to switch anyway um fearful of change again you know that’s a great one C I’m just looking at the Q&A box some customers are scared of change completely understandable again just highlight the the benefit um one is hubdoc is difficult to use can’t sort out documents easily um again for that one reassure them support them um show them the resources that are available um clients who don’t trust the cloud so if a client doesn’t trust the cloud so thank you Courtney for your question um there are a few suggestions here um one you could say is that zero for example is um adheres to the strongest security standards we are isoc compliant some of the biggest banks in the regions trust zero they connect with us zero is a publicly listed company on the Australian Securities Exchange um and of course we use cloud daytoday anyway in our day-to-day lives so internet banking um Facebook all of these tools that they’re probably using anyway is already on the cloud um we have covered quite a lot there are just a few other question things I’d want to tick off so just to recap what we discussed we spoke about positioning the cloud and your service in here there were three key considerations to be aware of the first was to to know your audience to know the benefits that can bring those audience because again the benefits for us an advisor may not necessarily resonate with a a business owner and the third was being able to communicate those benef benefits really effectively we also took a look at how to demo zero like a pro maybe give this a try in yourself um just a couple of tips in this space include avoiding complex jargon continually referring back to the benefit and also um using a workflow is a really good idea as well demos are also great for training clients and bulk across zoom and I do I do know that some partners actually use demos as a way to um get new clients through the door and also we took a look at some suggestions to um handle some client questions um don’t forget about the benefits that you receive as part of the zero partner program as well um promote your practice on the Zero advisor directory this is available to bronze partners and above it’s a great way to grow your client base help you get new clients and again it’s available to bronze partners and above um continue to upskill yourself with our free education content and I would also suggest if you’re wanting to grow your firm spend take the time to connect your account manager they’re dedicated to helping your practice grow and Thrive so speak to them about what other partner program benefits are available now if you’re not on the partner program already but you’re an accountant or bookkeeper in in in practice then I will also post a link to the partner program in the chat box so I’ve posted it now before we wrap up with some final thoughts let me level you with this quote no plan and no action will lead to no results what steps can you take to help clients really realiz the benefit of the cloud is it about bringing in some tips around product positioning is it trying out a product demo or is it taking note of some of those key ideas around objection and question handling that we discussed in this webinar that actually does bring us to the end of the webinar I’m going to take a look to see if there are any unanswered questions but if you do have any more question question please continue to put them in that Q&A box and um I’ll do my best to answer them so one of them is how to call to action to ask client to join so Unice it sounds like how should I begin the conversation um around a move to the cloud um there is no right or wrong way ultimately it depends on your relationship with your client but sales pictures depending on your client can sometimes seem impersonable so maybe one idea that I have off the top of my head is maybe even just set up a meeting to discuss their their business goals their pain points um and from there you can see whether or not a switch to the cloud might be helpful for them um and thank you Rowena that was really really nice cont uh comment we absolutely love to hear that you thought this webinar was engaged in um clients who have unstable internet connections that was a that looks like that was uh what’s the word an objection oh that’s that’s definitely a tough objection I guess for unstable internet connection it will be very much around you know do they is the value that they receive worth that unstable internet connection um or maybe if they’ve got let’s say for example um hotspot or something like that will which will allow them to continue to use zero there’s all sorts of of Technology available out there including portable Wi-Fi connections portable internet connections that customers can use to try and get better internet connections um but I think everything has been out answered thank you all for listening to that I really really appreciate it again register for more webinars and we’ll see you at another one in the future and please let us know what you thought in that um feedback survey as well but again thank you everyone for listening and we look forward to seeing you at another webinar in the future take care everyone bye

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • QuickBooks Online: Mastering Your Chart of Accounts

    QuickBooks Online: Mastering Your Chart of Accounts

    This source provides a detailed guide to understanding and customizing the Chart of Accounts within QuickBooks Online. It emphasizes the importance of the Chart of Accounts as the accounting system’s core, where transactions are categorized for financial reports. The guide explains how to access, sort, and modify the Chart of Accounts to fit specific business needs, including importing custom charts and creating new accounts. It also addresses common challenges such as managing beginning balances, dealing with uncategorized transactions, and merging redundant accounts, offering solutions for cleaning up and organizing the Chart of Accounts for effective financial reporting. Finally, it walks the user through adding or deactivating accounts and turning account numbers on and off.

    QuickBooks Online: Chart of Accounts Mastery

    Quiz:

    1. What is the chart of accounts and why is it important? The chart of accounts is a list of categories used to organize all transactions in QuickBooks Online. It is the heart of the accounting system, providing the framework for categorizing income and expenses and creating financial reports.
    2. How do you access the chart of accounts in QuickBooks Online? To access the chart of accounts, click on the gear menu on the top right of the screen, then click on “Chart of Accounts.”
    3. What are the two principal financial statements that utilize the chart of accounts? The two principal financial statements that utilize the chart of accounts are the Balance Sheet and the Profit & Loss (Income Statement).
    4. Explain the difference between balance sheet accounts and profit and loss accounts. Balance sheet accounts (assets, liabilities, and equity) have a running balance and reflect a company’s financial position at a specific point in time. Profit and loss accounts (income, cost of goods sold, expenses) track financial performance over a period of time.
    5. What is “opening balance equity” and how is it affected when you enter beginning balances for asset and liability accounts? Opening balance equity tracks the accumulated value of a business before accounting in QuickBooks. Adding an asset increases opening balance equity, while adding a liability decreases it.
    6. Why is it important to reconcile credit card transactions in QuickBooks Online? Reconciling credit card transactions ensures that the beginning balance is accurate and allows users to catch up on any transactions from the previous year, ensuring accurate financial reporting.
    7. How do you create sub-accounts, and why might you use them? To create a sub-account, when creating a new account, select the “Is sub-account” option and choose the parent account. Sub-accounts are used to further categorize and organize accounts, providing more detailed financial information.
    8. How do products and services connect to the chart of accounts? Products and services tie, or map, to accounts in the chart of accounts. When creating transactions with those items, QuickBooks knows where to categorize them.
    9. What is the purpose of the “uncategorized expense” account, and why can’t it be deleted? The “uncategorized expense” account serves as a default category for transactions that QuickBooks cannot automatically classify. This account cannot be deleted because QuickBooks uses it to fulfill a specific mechanic.
    10. How do account numbers affect the order in which accounts are displayed in QuickBooks Online, and how do you enable account numbers? Enabling account numbers in QuickBooks Online orders the chart of accounts numerically instead of alphabetically. To enable account numbers, go to “Account and Settings,” then “Advanced,” and turn on “Enable account numbers.”

    Essay Questions:

    1. Discuss the importance of a well-organized chart of accounts in QuickBooks Online for effective financial management and decision-making. Provide specific examples of how different account classifications (assets, liabilities, equity, income, expenses) contribute to a comprehensive understanding of a business’s financial health.
    2. Explain the process of setting up a chart of accounts for a new business in QuickBooks Online. What factors should be considered when creating new accounts and assigning account types and detail types?
    3. Describe the best practices for maintaining and updating a chart of accounts over time. How should businesses handle account reconciliations, and what steps should be taken to ensure data accuracy and consistency?
    4. Compare and contrast the balance sheet and profit and loss statement, emphasizing the role of the chart of accounts in generating these financial reports. How do different account categories (e.g., assets, liabilities, income, expenses) contribute to the information presented in each statement?
    5. Analyze the impact of chart of account design on financial reporting and compliance. How can a well-structured chart of accounts facilitate the preparation of accurate tax returns and other regulatory filings?

    Glossary of Key Terms:

    • Account Type: A classification of accounts in the chart of accounts (e.g., Asset, Liability, Equity, Income, Expense).
    • Balance Sheet: A financial statement that reports a company’s assets, liabilities, and equity at a specific point in time.
    • Chart of Accounts: A list of all the accounts used to record transactions in the general ledger of a business.
    • Cost of Goods Sold (COGS): Direct costs attributable to the production of the goods sold by a company.
    • Credit Card: A liability account used to track balances owed on credit cards.
    • Detail Type: A further categorization of accounts within an account type, providing more specific classifications.
    • Equity: The owner’s stake in the business, representing the residual value of assets after deducting liabilities.
    • Expenses: Costs incurred in the normal course of business to generate revenue.
    • Fixed Assets: Long-term tangible assets used in a company’s operations (e.g., vehicles, equipment).
    • Income: Revenue generated from the sale of goods or services.
    • Liabilities: Obligations or debts owed by a business to external parties.
    • Long-Term Liabilities: Obligations due more than one year in the future.
    • Opening Balance Equity: An equity account used to record the initial value of a business when setting up QuickBooks Online.
    • Profit and Loss (P&L) Statement: A financial statement that reports a company’s revenues, expenses, and net income over a period of time. Also known as an Income Statement.
    • Retained Earnings: The accumulated profits of a company that have been retained for reinvestment or other purposes.
    • Sub-Account: An account that is a child of another account. Used to provide more detailed information about transactions recorded to the parent account.
    • Transaction: Any activity that affects the financial position of a business (e.g., sales, purchases, payments).
    • Uncategorized Expense: A default expense account used for transactions that cannot be automatically categorized.

    QuickBooks Online: Mastering Your Chart of Accounts

    Okay, here’s a detailed briefing document summarizing the key themes and ideas from the provided QuickBooks Online Chart of Accounts source.

    Briefing Document: QuickBooks Online – Mastering Your Chart of Accounts

    I. Overview:

    This source is a video transcript focusing on the Chart of Accounts within QuickBooks Online (QBO). The speaker emphasizes that understanding and properly utilizing the Chart of Accounts is crucial for effective accounting within QBO. It is described as “the heart of the accounting system.” The tutorial covers accessing the Chart of Accounts, understanding its structure, creating new accounts, importing a chart of accounts, and linking accounts to products and services. It also touches on how the chart of accounts impacts financial reports like the Profit & Loss statement and the Balance Sheet.

    II. Key Themes and Ideas:

    • Importance of the Chart of Accounts: The video stresses the critical role of the Chart of Accounts in organizing financial data within QuickBooks Online. The speaker states, “the most common mistake that QuickBooks users make is they don’t get very much acquainted with the chart of accounts and then they really don’t know how to categorize transactions.” Proper categorization ensures accurate financial reporting.
    • Structure and Organization:The Chart of Accounts is a list of categories used to organize all transactions.
    • It can be sorted by name, account type, detail type, or balance.
    • Sorting by account type is logical, as it aligns with the order of accounts on financial statements (Balance Sheet and Profit & Loss).
    • The Balance Sheet accounts (assets, liabilities, and equity) display a running balance in the “QuickBooks Balance” column.
    • Profit & Loss accounts (income, cost of goods sold, expenses) do not show a running balance.
    • Accessing and Navigating the Chart of Accounts: The speaker explains how to access the Chart of Accounts by clicking on the gear menu and then “Chart of Accounts.” Once inside, users can sort the accounts by various criteria.
    • Account Types and Financial Statements:Balance Sheet Accounts: Bank accounts, accounts receivable, assets, liabilities, and equity.
    • “All those accounts from Equity to the top, they belong to your balance sheet. Now the other way to quickly tell that the account belongs to the balance sheet is there’s going to be a column called quick QuickBooks balance and there’s always going to be a running balance in the QuickBooks balance column”
    • Profit & Loss Accounts: Income, Cost of Goods Sold, Expenses, Other Income, and Other Expenses.
    • “everything from income cost to good so expenses other income other expenses those are going to belong in your profit and loss account”
    • Creating New Accounts:The video details the steps to create new accounts, emphasizing the importance of selecting the correct account type and detail type.
    • A key aspect is setting the “When do you want to start tracking your finances in QuickBooks from this account?” date. This determines when the opening balance will be booked (the prior day).
    • Examples are provided for creating bank accounts, fixed assets (vehicles), loans, and credit card accounts.
    • Opening Balances:Entering opening balances is essential when starting to use QuickBooks Online.
    • For asset accounts, the opening balance increases the “Opening Balance Equity” account.
    • For liability accounts, the opening balance decreases the “Opening Balance Equity” account.
    • The Opening Balance Equity reflects the net worth of the business at the start of tracking finances in QBO.
    • Subaccounts: Subaccounts allow for more detailed tracking within a general account category. The example used is creating individual vehicle listings as subaccounts under a main “Vehicles” fixed asset account.
    • Credit Card Account Considerations: The video highlights the nuances of setting up credit card accounts, especially when the statement closing date doesn’t align with the month-end. Two options are presented: enter transactions for the overlapping period or combine balances as of the desired start date. It also covers situations where multiple cards exist under one master account, requiring the creation of sub-accounts for each card user.
    • Importing Chart of Accounts: Users can import a Chart of Accounts from an Excel or CSV file. This is especially helpful for accountants or users with pre-existing Chart of Account templates.
    • The speaker says that, “the way I import the chart of accounts into QuickBooks is I export this into Excel and then I go uh in here into… the chart of accounts click on the drop down menu and then click on import”
    • Linking Products and Services to Accounts: The video explains how to link products and services to specific income or expense accounts. This connection is crucial for automatically categorizing transactions created from invoices or sales receipts. The speaker shows how to navigate to the product and service list, create a service and assign its income account.
    • Cleaning Up and Customizing the Chart of Accounts:Inactive (deleted) accounts can be hidden or grouped under a general category (e.g., “Old/Deleted Accounts”) to declutter financial reports.
    • Redundant accounts can be merged to simplify the Chart of Accounts.
    • Account numbers can be enabled to provide a more organized view (accountants prefer this).
    • The ultimate goal is to have as few accounts as possible that provide the most meaningful information for your business needs.
    • Managing Profit and Loss Accounts:The video shows how to create custom income and expense accounts.
    • It also shows how to make default accounts inactive if they are not needed
    • “uncategorized expense is one that you cannot change QuickBooks gives you that you have to leave that in there because when you connect your Banks and QuickBooks just so know where to categorize stuff that’s where it’s going to put it it’s going to put it under uncategorized expense”
    • Transition to Retained Earnings
    • The speaker recommends that you perform a journal entry to move the opening balance equity to retained earnings.
    • “in the accounting World opening balance Equity is actually referred to as retain earnings retain earnings so what I’m going to do is I’m going to take this number…I’m going to do a journal entry to move it out of op balance equity and into rain earnings”
    • Report CustomizationThe speaker shows how you can make the Profit and Loss report show all rows and active columns to see all of the accounts.

    III. Important Considerations:

    • QuickBooks Online Version Differences: The video notes that the appearance of certain screens, particularly the “New Account” pop-up, may vary depending on the QBO version. Functionality, however, remains consistent.
    • Accountant Assistance: For complex tasks like entering accumulated depreciation for fixed assets or handling journal entries, the video suggests seeking help from an accountant.
    • Dynamic Nature of QuickBooks: The video acknowledges that QuickBooks is constantly evolving, particularly with changes to report interfaces.

    IV. Target Audience:

    This video is primarily aimed at beginners and intermediate QuickBooks Online users who want to gain a deeper understanding of the Chart of Accounts and its impact on their financial reporting. It is also useful for accountants who may want to import their own charts of accounts.

    QuickBooks Online: Mastering Your Chart of Accounts

    FAQ on Mastering Your QuickBooks Online Chart of Accounts

    1. What is the Chart of Accounts in QuickBooks Online and why is it important?

    The Chart of Accounts (COA) is the backbone of your accounting system in QuickBooks Online. It’s a comprehensive list of categories used to organize all your financial transactions. By categorizing income and expenses appropriately, you can generate accurate and meaningful financial reports, such as the Profit and Loss statement and the Balance Sheet. Familiarity with the COA is crucial for proper categorization of transactions as they come in from your bank and credit card feeds, leading to better financial insights.

    2. How do I access and navigate the Chart of Accounts in QuickBooks Online?

    To access the Chart of Accounts, click on the gear icon in the top right corner of QuickBooks Online and select “Chart of Accounts.” You can sort the list by name, account type, detail type, or balance by clicking on the respective column headers. Sorting by account type is generally recommended as it displays accounts in the order they appear on financial statements (Balance Sheet accounts first, followed by Profit & Loss accounts).

    3. What are the main sections of the Chart of Accounts and what type of accounts do they hold?

    The Chart of Accounts is primarily divided into two main sections, corresponding to the two primary financial statements:

    • Balance Sheet Accounts: These accounts show the financial position of your business at a specific point in time. They include:
    • Assets (e.g., Bank accounts, Accounts Receivable, Fixed Assets)
    • Liabilities (e.g., Accounts Payable, Credit Cards, Loans)
    • Equity (e.g., Owner’s Equity, Retained Earnings)
    • Profit & Loss (Income Statement) Accounts: These accounts track your business’s financial performance over a period of time. They include:
    • Income (e.g., Sales, Service Income)
    • Cost of Goods Sold (direct costs associated with producing goods or services)
    • Expenses (e.g., Rent, Utilities, Salaries)
    • Other Income & Expenses (e.g., Interest Income, Loss on Sale of Assets)

    4. How do I add a new account to my Chart of Accounts and what are the key considerations when doing so?

    To add a new account, click the “New” button in the upper right corner of the Chart of Accounts. A window or drawer will appear (the interface may differ slightly depending on your QuickBooks Online version). Choose the appropriate account type (e.g., Bank, Fixed Asset, Income, Expense), then select a detailed type that best describes the account’s nature. Give the account a clear and descriptive name (including the last four digits) to identify the account easily. Finally, if applicable, enter a starting/opening balance and the date for tracking transactions from the account.

    5. How do I enter opening balances for existing accounts and why is it important to do this correctly?

    When setting up QuickBooks Online, it’s crucial to enter accurate opening balances for all existing accounts. This involves selecting the date that your tracking finances will begin with, such as Jan 1st 2024, and entering balances as of the last date (December 31st, 2023 in this example). For bank accounts, use the ending balance from the December bank statement. For fixed assets, it could be original purchase price minus depreciation. For loans, use the actual outstanding loan balance. Credit cards can be tricky as the closing date may not be the end of the month, so use the ending balance from the closest statement and account for any in-between transactions to align with your start date. Inaccurate opening balances can distort your financial reports from the outset.

    6. How do Sub-Accounts work in QuickBooks and how should I name them?

    Sub-accounts allow you to create a hierarchical structure within your Chart of Accounts for more detailed tracking. For instance, you might have a main “Vehicle” fixed asset account and then sub-accounts for individual vehicles (“Ford F-150 White”). Another example is having a Chase rewards plus INC credit card main account, with sub accounts that name all the various users. To create a sub-account, check the “Is sub-account” box when creating or editing an account and select the parent account from the drop-down menu. You need to be very careful to select an account from the same category. Sub-accounts provide greater granularity while keeping your main financial statements organized.

    7. How do Products and Services relate to the Chart of Accounts and how do I connect them?

    Products and Services are the individual items you sell or buy in your business. Each product or service is linked to a specific income or expense account in your Chart of Accounts. This connection determines how sales and purchases are categorized on your financial statements.

    To link a product or service to an account, go to the “Products and Services” list (accessible via the gear icon) and edit or create an item. In the item details, select the appropriate “Income account” or “Expense account” from the drop-down menu. This mapping is essential for accurate revenue and expense tracking.

    8. How do I customize my Chart of Accounts for better reporting, and what steps should I take when doing it?

    Customizing your chart of accounts for better reporting is key.

    1. First, consider all the business transactions and think of the best names to give these accounts.
    2. Second, import your own list of accounts, using a pre-made template on Excel.
    3. Third, customize your income and expense accounts. Do you want to eliminate the default expense accounts that come with QuickBooks Online?
    4. When setting up your Chart of Accounts, you may need to merge some accounts that are redundant.
    5. Finally, if you are comfortable with account numbers, you can enable it so that the Chart of Accounts can be more clearly viewed.

    QuickBooks Online Chart of Accounts: Setup and Management

    QuickBooks Online and its chart of accounts are discussed in the source.

    Here’s a breakdown:

    • Chart of Accounts: The chart of accounts in QuickBooks Online is the core of the accounting system. It’s a list of categories for organizing transactions to be viewed in financial reports like profit and loss statements or balance sheets.
    • Accessing the Chart of Accounts: To access it, the gear menu is clicked, followed by ‘chart of accounts’.
    • Sorting: The chart of accounts can be sorted by name, account type, detail type, or balance. Sorting by account type displays accounts in the order they appear on financial statements.
    • Balance Sheet Accounts: These accounts include bank accounts, accounts receivable, current assets, other assets, fixed assets, liability accounts (accounts payable, credit cards), and equity. Balance sheet accounts have a running balance in the QuickBooks balance column.
    • Profit and Loss Accounts: These include income, cost of goods sold, expenses, other income, and other expenses.
    • Account Limits: Some QuickBooks versions limit the number of accounts (e.g., Simple Start, Essentials, and Plus versions are limited to 250 accounts).
    • Creating a New Account: When creating a new account, the user selects the account type and detail type. They also specify when to start tracking finances in QuickBooks.
    • Beginning Balances: When a new asset account is created, the opening balance equity increases. When a liability account is created, the opening balance equity decreases.
    • Cash Account: QuickBooks creates a cash account for petty cash, which can be renamed.
    • Fixed Assets: When adding a fixed asset like a truck, the original purchase price (minus depreciation) is entered.
    • Long-Term Liabilities: Loans with over a year left for repayment are considered long-term liabilities.
    • Credit Cards: Credit card accounts can be a bit tricky because their closing dates might not align with month ends. One option is to enter the combined balance as of the statement’s closing date, including any transactions up to the end of the year.
    • Sub-accounts: For credit cards with multiple users or extensions, the parent account carries the beginning balance, and sub-accounts are created for each physical card without entering individual beginning balances.
    • Balance Sheet Review: After entering all the beginning balances, a balance sheet can be run for the last fiscal year to verify the balances.
    • Journal Entry for Retained Earnings: The source notes that opening balance equity is a QuickBooks concept, and in accounting, it’s referred to as retained earnings. A journal entry can be made to move the balance from opening balance equity to retained earnings.
    • Importing Chart of Accounts: Users can import their own chart of accounts from Excel.
    • Customizing Income Accounts: Users can create their own income categories (e.g., on-site services, remote services) and make existing ones inactive.
    • Products and Services: Products and services are linked to specific income accounts.
    • Account Numbers: Account numbers can be enabled to order the chart of accounts numerically.
    • Merging Accounts: Redundant accounts can be merged.
    • Profit and Loss Report Customization: QuickBooks Online allows you to customize your profit and loss report to show accounts even if they don’t have any activity.

    QuickBooks Online: Chart of Accounts Management and Customization

    The chart of accounts in QuickBooks Online is a vital part of the accounting system. It serves as a list of categories that are used to organize all transactions. These categories are essential for generating financial reports such as profit and loss statements and balance sheets.

    Accessing and Sorting To access the chart of accounts, one must click on the gear menu and then select “chart of accounts”. Once accessed, the chart of accounts can be sorted by:

    • Name
    • Account type
    • Detail type
    • Balance

    The most logical way to sort the chart of accounts is by account type, as this will display the accounts in the order in which they appear on the financial statements.

    Types of Accounts The chart of accounts includes both balance sheet and profit and loss accounts.

    • Balance Sheet Accounts These accounts comprise assets, liabilities, and equity.
    • Assets Bank accounts, accounts receivable, current assets, other assets, and fixed assets.
    • Liabilities Accounts payable and credit card accounts.
    • Balance sheet accounts will always have a running balance in the “QuickBooks balance” column.
    • Profit and Loss Accounts These accounts include income, cost of goods sold, expenses, other income, and other expenses.

    Account Limits Different versions of QuickBooks Online have different account limits. For example, Simple Start, Essentials, and Plus versions are limited to 250 accounts.

    Creating New Accounts When creating a new account, users must select the account type and detail type. They also need to specify when they want to start tracking finances in QuickBooks. When creating a new asset account, the opening balance equity increases, while creating a liability account decreases the opening balance equity.

    Fixed Assets and Long-Term Liabilities When adding fixed assets, such as a truck, it’s important to enter the original purchase price minus any depreciation. Long-term liabilities are loans that have more than a year left to pay.

    Credit Card Accounts Credit card accounts can be tricky because their closing dates may not align with the end of the month. In such cases, one option is to enter the combined balance as of the statement’s closing date, including all transactions up to the end of the year. For credit cards that have multiple users or extensions, create sub-accounts for each physical card, while the parent account carries the beginning balance.

    Balance Sheet Review and Retained Earnings After entering all beginning balances, run a balance sheet for the last fiscal year to verify the balances. The source specifies that in accounting, opening balance equity is referred to as retained earnings, and a journal entry can be created to move the balance from opening balance equity to retained earnings.

    Importing and Customizing Users have the option to import a chart of accounts from Excel and customize income accounts by creating their own categories and deactivating the existing ones. Products and services are linked to specific income accounts.

    Account Numbers and Merging Accounts Account numbers can be enabled to numerically order the chart of accounts. Additionally, redundant accounts can be merged.

    Profit and Loss Report Customization QuickBooks Online enables customization of the profit and loss report to show accounts, even when there is no activity.

    QuickBooks Online: Understanding Account Types and Chart of Accounts

    The sources discuss ‘account type’ in the context of QuickBooks Online and its chart of accounts. The chart of accounts is a list of categories that are used to organize transactions and is essential for generating financial reports.

    Here’s a breakdown of account types based on the sources:

    • Sorting by Account Type: The most logical way to sort the chart of accounts is by account type, as this displays the accounts in the order in which they appear on the financial statements.
    • Balance Sheet Accounts: These accounts comprise assets, liabilities, and equity.
    • Assets: Bank accounts, accounts receivable, current assets, other assets, and fixed assets.
    • Liabilities: Accounts payable and credit card accounts.
    • Balance sheet accounts will always have a running balance in the “QuickBooks balance” column.
    • Profit and Loss Accounts: These accounts include income, cost of goods sold, expenses, other income, and other expenses.
    • Creating a New Account: When creating a new account, users must select the account type and detail type.
    • When creating a new asset account, the opening balance equity increases, while creating a liability account decreases the opening balance equity.
    • Fixed Assets: When adding fixed assets, such as a truck, it’s important to enter the original purchase price minus any depreciation.
    • Long-Term Liabilities: Loans that have over a year left to pay are considered long-term liabilities.
    • Customizing Income Accounts: Users can create their own income categories and deactivate the existing ones. Products and services are linked to specific income accounts.
    • The source notes that income accounts typically start with the number 4, cost of goods sold accounts start with 5, and expense accounts start with 6 or 7. Asset accounts start with 1, liabilities with 2, and equity accounts with 3.

    QuickBooks Online: Understanding the Balance Sheet and Chart of Accounts

    The balance sheet is a key financial statement in QuickBooks Online, and the sources provide extensive information on how it relates to the chart of accounts. The balance sheet contains asset, liability, and equity accounts.

    Here’s a detailed breakdown:

    • Balance Sheet Accounts: The principal financial statement is called the balance sheet.
    • Types of Balance Sheet Accounts:
    • Assets: These include bank accounts, accounts receivable, current assets, other assets, and fixed assets.
    • Liabilities: These include accounts payable, credit card accounts, other current liabilities, and long-term liabilities.
    • Equity: This is the equity section.
    • Identifying Balance Sheet Accounts: A quick way to identify a balance sheet account is the presence of a “QuickBooks balance” column, which displays a running balance. The last account with a running balance is typically an equity account.
    • Setting Up Initial Balances:
    • A typical first step is to create accounts with a running balance.
    • When a new asset account is created, the opening balance equity increases.
    • When a liability account is created, the opening balance equity decreases.
    • Fixed Assets: When adding a fixed asset like a truck, the original purchase price (minus depreciation) is entered.
    • Long-Term Liabilities: Loans with over a year left for repayment are considered long-term liabilities.
    • Credit Cards: Credit card accounts can be a bit tricky because their closing dates might not align with month ends. One option is to enter the combined balance as of the statement’s closing date, including any transactions up to the end of the year.
    • Sub-accounts: For credit cards with multiple users or extensions, the parent account carries the beginning balance, and sub-accounts are created for each physical card without entering individual beginning balances.
    • Reviewing the Balance Sheet: After entering all the beginning balances, a balance sheet can be run for the last fiscal year to verify the balances.
    • Opening Balance Equity vs. Retained Earnings:
    • The source specifies that in accounting, opening balance equity is referred to as retained earnings, and a journal entry can be created to move the balance from opening balance equity to retained earnings.
    • The source notes that journal entries can be tricky and it may be best to consult with an accountant.
    • Account Numbers:
    • Account numbers can be enabled to order the chart of accounts numerically.
    • Asset accounts typically start with the number 1, liabilities with 2, and equity accounts with 3.

    QuickBooks Online: Profit & Loss Statement Insights

    The Profit & Loss (P&L) statement is a crucial financial report in QuickBooks Online, and the sources offer detailed insights into its composition and customization. The P&L statement, also called an income statement, relies on information from the chart of accounts.

    Here’s a comprehensive overview:

    • Profit and Loss Accounts: These include income, cost of goods sold, expenses, other income, and other expenses.
    • Location in Chart of Accounts: When sorting the chart of accounts by account type, the P&L accounts are found under the equity section, beginning with income. They appear after the balance sheet accounts.
    • Customizing Income Accounts:
    • Users can create their own income categories (e.g., on-site services, remote services).
    • Existing income accounts can be made inactive.
    • The source mentions that it is not possible to delete certain accounts, such as ‘billable expense income’ and ‘uncategorized income’.
    • Products and Services: These are linked to specific income accounts, ensuring proper categorization of income on the P&L statement.
    • Showing All Accounts: QuickBooks Online allows you to customize your profit and loss report to show accounts even if they don’t have any activity.
    • Deleting and Nesting Accounts: The source explains how to nest old or deleted accounts under a general category to clean up the P&L.
    • Account Numbers:
    • Account numbers can be enabled to order the chart of accounts numerically.
    • The source notes that income accounts typically start with the number 4, cost of goods sold accounts start with 5, and expense accounts start with 6 or 7.
    • Merging Accounts: Redundant accounts can be merged. For example, Heating and Cooling can be merged with Electricity.
    • Miscellaneous Accounts: The source suggests having miscellaneous accounts under general categories such as ‘miscellaneous travel’.
    • Importance of Organization: The purpose of working on the chart of accounts is to organize categories of income and expenses in a way that is useful and meaningful for financial reports. The goal is to have as few accounts as possible while still providing rich information.
    • Default Chart of Accounts: The source notes that the default chart of accounts in QuickBooks may vary, but gives the user control to customize it.
    • Utilities Example: The source shows an example of renaming ‘Water and Sewer’ to ‘Water Sewer and Disposal fees’.
    • Example of Travel Expenses: The source explains how to set up a ‘miscellaneous travel’ account.
    QuickBooks Online: Chart of Accounts (1 hour full tutorial)

    The Original Text

    in this video we’re going to talk about the chart of accounts in QuickBooks Online the chart of accounts is the heart of the accounting system the most common mistake that QuickBooks users make is they don’t get very much acquainted with the chart of accounts and then they really don’t know how to categorize transactions when the income and the expenses come in from the bank the chart of accounts is the list of categories that you use to organize all your transactions into so essentially you can see them in financial reports such as your profit and loss or the balance sheet to access your chart of accounts you click on the gear menu on the top right and then you click on chart of accounts now once you look at the chart of accounts you can actually sort these by name by clicking on the header of the grid by account type by detail type by balance and so forth the most logical way to sort your chart of accounts is going to be by account type because when you sort by account type what you’re going to see is all your accounts based based on the order in which they fall into in your financial statements for example the principal financial statement is called the balance sheet the very first account you see on the top is going to be your bank account naturally that is going to be the very first account or group of accounts that you see they’re going to be under bank then you’re going to see things like accounts receivable current assets other assets fixed assets and then you’re going to go see your liability accounts they still on the balance sheet you’re going to see accounts like accounts pay pble credit card accounts which we haven’t created one yet other current liabilities long-term liabilities and Equity so all of those that I mentioned just now from bank which is an asset all the way down to equity which is in the equity section once you see this break between equity and income all those accounts from Equity to the top they belong to your balance sheet now the other way to quickly tell that the account belongs to the balance sheet is there’s going to be a column called quick QuickBooks balance and there’s always going to be a running balance in the QuickBooks balance column and the very last account when they’re sorted by account type notice that the very last account that has a running balance which happens to be zero it’s an equity account and then you’re going to see income right under it so all the accounts under Equity starting from income and then going to cost to good sold which are your direct expenses down to expenses which are your operating expenses and then all the way in the bottom to your what they call the extraordinary accounts let me go to the next page which is going to be your other income and other expenses everything from income cost to good so expenses other income other expenses those are going to belong in your profit and loss account okay so that gives you an idea more or less you know where these two type of accounts fit now uh the the chart of accounts can actually have a limitation of so many accounts you can fit per page right now it’s set up for 175 but you can go all the way to 300 per page some people could have more than 300 accounts that is possible most versions of QuickBooks Simple Start Essentials and plus are limited to 250 accounts anyway so only the advanced version of QuickBooks Online can have more than 250 this particular chart of accounts that you’re seeing has 135 accounts so I’m actually using I’m not using I’m not using the advanc Edition so I can add a whole bunch of accounts get all the way to 250 now for the purposes of this video and this is the point I want to make is that if I go to reports and I go into profit and loss for example and I change the date to all dates and click on run report notice that none of the accounts show because there’s absolutely zero activity on this account for this example I created a brand new account from scratch so we can kind of see what happens as we work the chart of accounts I’m going to go to reports and do the same thing with the balance sheet and I could do uh all dates or pick a particular date range let’s do all dates for example click on run report and then we get the exact same behavior so again just proving the point this is a blank account brand new account let me go back into the chart of accounts and let’s talk about what are the typical first steps that we do now now that we understand exactly what it is what are the typical first steps that we do is we’re going to create all the accounts that we know from fact have a running balance so let’s start with that I’m going to click on the new button on the top right click on new now I do want to pause for a second this little square popup that you see on your screen it’s one of the versions of QuickBooks Online show this depending on what version of QuickBooks Online you have and you really can’t tell by looking at it this is just based on how QuickBooks creates the accounts you may not see this Square little popup you might see a drawer that comes from the right hand side all all the functionality is identical all the buttons you can press are identical they’re just organized differently and I’m I’m a little bit later on I’m going to switch to a different company file that actually has the drawer so you can see the two differences so I do want to show you with both so don’t freak out if your screen doesn’t look like this it’s fine uh we’re actually going to cover what the other screen looks like just kind of follow along uh the concepts themselves cuz those are going to be really important then I’m going to choose the type of account I’m going to create let’s say I’m going to enter all my bank accounts that already have a running balance uh from the first date that I’m going to decide to enter transactions inside QuickBooks so for examples of this video Let’s just assume that I created a brand new account and I’m going to start January 1st 2024 that means that I need to enter all the balances that come all the way up to December 31st 2023 because I want to start January 1st 2024 and this is the time that I want to go back into QuickBooks and enter transactions so I can have a clean tax year for example so I’m going to enter all my bank accounts so I’m going to select Bank as the account type under the detail type I’m going to pick the best possible option here so let’s say it’s a checking account and then under name I’m going to put the name of the bank so let’s say this is in Chase and then I’m going to put whether it’s checking or savings as the name and and this is the part where you can actually put whatever you want really but this is how I like to do it and then put the last four digits of the account typically the easiest way to identify now some people create an operating account a payroll account a tax savings account okay that’s fine you can do that and you can add that naming to your naming convention if you want to the description is optional you actually do not need to create that and then sub we’ll discuss that later on as we start creating more accounts now down here at the bottom it says when do you want to start tracking your finances in QuickBooks from this account and as I mentioned earlier we’re going to start January 1st 2024 so we’re going to click on other and then we’re going to specifically select January 1st 2024 essentially letting them know that if I do enter an opening balance that opening balance is supposed to be booked 1231 2023 so then I’m going to go get my actual physical bank statement for December for that bank account and I’m going to read the ending balance from the bank statement and I’m going to enter it in there so let’s say that’s 45,0 89062 so I read that from the bank statement for the ending balance of 1231 2023 and then I click on Save and close you will notice that immediately you’re going to see the account gets created in here the QuickBooks balance is going to be that beginning balance and then I’m going to scroll down so you can see something because there’s going to be a balancing entry under opening under opening balance Equity you’re going to see a balance entry for the same exact dollar amount every time I add an asset like a bank account that opening balance Equity is going to go up every time I add a liability such as a credit card or a loan that opening balance is going to go down essentially keeping track of what’s the accumulated value of your business uh before we start doing Accounting in QuickBooks okay so we did that bank account let’s do another bank account I’m going to click on new and let’s say I have another bank account but this time around it’s a savings account and it happens to be let’s say in Bank of America so I’ll put boa savings and then the last four digits of the account then I’m going to choose the same thing I’m going to choose the beginning date that I’m going to start entering transactions in this account then I’m going to go read the the ending balance from the statement let’s say it’s 100,000 flat I’ll put that in there and click save and close so now I have my two beginning balance uh balances in there and let’s say that’s it let’s say I don’t have any more bank accounts notice QuickBooks uh created a cash account which is going to be your petty cash account is where you’re going to be holding cash maybe your customers pay you in cash maybe you took out an ATM cash withdrawal you’re keeping cash in there and you might make some expenditures keep track of those receipts and that’s going to be sort of the account that’s not going to be tracked through the bank but you’re going to be tracking manually now you can change the name of that account if you don’t like it to be called Cash you want to call it petty cash for example we click on edit and then we change the name from cash to petty cash and then it’s up to you and and now one thing I want to make uh clear is there are some businesses where they have multiple employees holding petty cash and then what you can do is you can do is let me just edit that and put petty cash you can put petty cash John petty cash Hector petty cash uh Mary or whatever and then you can keep track of everybody’s petty cash uh per se so depending on how you manage cash some businesses will never touch cash so you don’t you not even need that and if you don’t want that account you can just click on the drop- down menu and click on make inactive or delete now one thing is QuickBooks Online does create a chart of accounts for you and um as we mentioned earlier it was 135 accounts you can delete them you don’t have to use them but that’s what QuickBooks suggests that you use so we entered both of our bank accounts let’s now go down here and create let’s say a fix asset so let’s say that I want to put a truck that I have in here now QuickBooks has a general category for vehicle so maybe I want to use that and then make a sub account so that’s the example we’re going to show you so I’m going to click on new on the top right under account type I’m going to go down to fix asset that’s where a truck would go under detail type I’m going to go look for the best possible uh category that matches what I’m trying to do so vehicles and then I’m going to call this by what it is so let’s say this is a Ford F150 I typically like to put the acquisition date on it it just it’s easier for me to do that so I put that in parenthesis so let’s say I bought it um May 19th 2019 or something like that okay so that helps me understand especially if I have multiple F-150s it will help me understand which one specifically it is you can go further and say that this is a 2019 model and then let’s say you have multiple 2019 models you can do white black whatever so you get to you get to pick whatever name makes the most sense for you and you put that in the balance sheet then it says here when do you want to start tracking finances from this account we’re going to do the same thing we’re going to do January 1st and then this gets a little trickier you have to go find your um well at some point when we do the actual loan we’re going to have to go find what the actual loan balance is but this is the fixed asset so this one’s actually not going to be the loan balance this is going to be the original uh purchase price of the vehicle minus depreciation now that gets tricky you might need an accountant to help you with that for the time being just to keep it simple we’re going to put the original purchase price of the vehicle now you can always look in your balance sheet in your tax return return if you had an accountant or CPA do your tax return your balance sheet will probably keep track of all your fixed assets all the depreciations and then maybe have a separate fixed asset schedule that tells you exactly how much um they booked the value of the vehicle and they accumulated depreciation so that could help you figure out what number goes in here for the time being I’m going to keep it simple let’s say I spent 4,608 on this car out the door with taxes and everything and then I’m going to click on Save and close so this is the original value of the vehicle again you might need um an accountant to kind of help you make sure you you enter the the cumulated depreciation and all that stuff now in this particular case I created the vehicle I’m going to go back and edit it and I didn’t make it a sub account of the existing vehicle category I think I want to do that so I’m going to click this little check box that says is sub account click on the drop down and then go look for uh the vehicle category that way it’s all organized sort of in the same grouping then I click on Save and close and then you will notice as I scroll down now I see a category for vehicles and then I see the fort in there if I happen to have another vehicle I can uh create that vehicle by clicking on new doing the exact same process so going down here to fix assets detail type let’s go down to uh Vehicles let’s say this one is a Ford F-150 as well but this is from uh 2021 and then this one happens to be black and let’s say I purchase that on 1231 2020 or something like that and the same thing I’m going to pick the original purchase price of this vehicle I’m going to be consistent with the beginning date of all the transactions that we’re going to be managing and the let’s say the purchase price was 51985 click save and close so now as I go down and scroll down to my vehicles and it looks like I also forgot to put this one as a sub account so let’s go back in that and click on edit so I didn’t do it on purpose that was my mistake but it’s good because you get to see um the actual um the actual process here so I’m going to put it under Vehicles which is a fixed asset you got be very careful because it happens to be a vehicle expense account as well and that’s not where I want to put it um even if I try to put it there and click save and close um it’s not going to let me because I can’t put a sub account of a fixed asset into an expense so it needs to be the exact same category for that to work anyway so let’s put here a vehicle and let’s do the fix assets and then click save and close perfect so now as I scroll down now I see that I have my vehicles category and both of my fixed Assets in there with the right uh labeling with the right $2 amounts and then the sum of the two right under that parent account if I scroll down and I look at my opening balance Equity notice that all the accounts that I put that are assets are increasing my open opening balance Equity account let’s start creating some loans so I’m going to go to new then I’m going to click on the drop down menu and then I’m going to click on long-term liabilities so typically any loans that you have more than a year left to pay it’s going to be a long-term liability and then we can call this a node payable or other long-term liability it doesn’t matter and then we’re going to put here Ford let so here Ford truck loan and if I happen to have two of them I probably want to identify and by the white truck or the black truck or whatever it is so we’ll put here white and then I’m going to come here same thing I’m going to select the beginning of the year which is the date that we decided to start our finances and then I’m going to get this is actually the loan balance not the original purchase price I’m going to get the actual statement from uh Ford Finance or whatever I’m going to get the statement from the bank and that shows the ending balance as of 1231 2023 and let’s say for this one was 23,00 965 I click on Save and close and then I’ll do the same thing for the other one so I’m going to go to new go drop down menu get the other physical statement long-term liability and let’s say this is the Ford uh truck loan black and then other same thing one one 2024 and then this one happens to be let’s say 38,7 seven eight something like that save and close so I enter two loans which are two liabilities as I scroll down scroll all the way down now I see my two long-term liabilities they show up in here now you should notice that my opening balance Equity is going down and this is the exercise you have to do you have to create all your assets all your liabilities and all of these balances essentially are going to give you the opening balance equity which is the net worth of your business at that moment in time where you started tracking your finances in in this case January 1st 2024 okay let’s do one more exercise which is our credit cards so I’m going to go to new then I’m going to get all my credit card statements I’m going to click here credit card and this one’s a little bit tricky because credit cards don’t have clean months like bank accounts or loans a credit card can end in the 15th of the month it can end in the 20th of the month so that gets a little bit tricky so essentially what I want to do is I’m let me start with let’s say my AMX uh platinum or something like that and uh we’ll put the last four digits of the account what I want to do is I want to do the same thing here I want to do 11 11 2024 but because I don’t have I can’t get that information from the statement I have to get the closing uh date as of the last time that the the bank that the um credit card closed let’s say this happens to be December 15th I can put the number there but then when I go reconcile it’s not going to work because I will be missing the transactions from December 15th to the end of the year so for this to actually work you have to put in here combined the balance as of December 15 and all the transactions in between plus and minuses you have to add them together in order to have a correct beginning balance and then you can then start creating your transactions January 1st 2024 and you should have a clean beginning balance that’s one option the other option that we have and this could happen like this is I want to make put here the ending date of the last statement and then essentially you will be also entering some 2023 transactions in QuickBooks in order to catch up and then when you go reconcile let’s say January 15th 2024 you have a full month worth of statements and your QuickBook is going to have some transactions for 2023 so that that’s that’s the part that just gets a little bit tricky so that could be the other option that you have so let’s say this is 12489 60 and again we’re picking the method where we’re taking the actual physical statement let’s say they close on the 15th we’re taking the ending balance on the 14th and then when we go enter transactions we’re going to choose to catch up those uh last transactions of the year and again who cares we got we put some things for 2023 here even though it wasn’t our intention but we were able to do a clean reconciliation now very very important I do have a video that talks about reconciling entirely different process we won’t cover that in here but um keep that in mind because this is like the precursor of doing any sort of bank reconciliations so we’ll do save and close and there there is my um American Express uh credit card now let’s do one more credit card and this is going to be a situation where I have a credit card and then multiple sub accounts under it or multiple physical cards and that’s a really important one because when you connect your Banks your Banks usually download the activities in each of the sub account accounts but then when you reconcile you reconcile the master parent account so I’ll give you an example so let’s say I have a Chase account with three extensions or three users so I’m going to start by creating the let’s say the parent account so let’s do Chase Visa uh uh let’s call it rewards rewards INC plus or whatever it happens to be and that’s going to be our parent account that’s going to be like our overall account that we’re going to reconcile I’m going to do the same thing do the beginning balance let’s say picked up the statement and that closes on the 20th so we’re going to go back to December 20th and then we’re going to put the physical balance on December 20th and we’re going to choose the technique where we’re going to enter the the catchup transactions for December so we can have a clean reconciliation month so let’s say this happens to be 15, 44630 and then we’ll do save and close so that’s my parent account that’s my that’s my master account and the beginning balance is going to be carried there but I have to create each of the extensions each each of the physical cards that are under this one statement and put in them as a sub account so I’m going to come in here and select credit card and then I’m going to choose let’s say uh Hector and then put the last four digits that’s my card that’s Hector’s card and I’ll put that as a sub account in this case of Chase um Chase Visa Inc rewards and then for this one I don’t enter a beginning balance because the beginning balance is being carried by the parent account by the master account then I’m going to click on Save and new cuz I’m going to keep creating accounts and let’s say the next one I gu said new didn’t work let’s do new one more time let’s go and select credit card again and we’re going to put here let’s say Maria this is the the user Make data sub account of the same uh Chase plus Chase rewards plus and then click on Save and new let’s scroll down over here and see that we have my Chase Inc and my two sub users let’s do one more we’ll pick here credit card now hopefully you’re getting the hang of this at this point let’s do Carlos and put the last four digits make it a sub account and look for that account the rewards Plus Inc and then no beginning balance again because the master parent account has that save and close and let’s say that’s it let’s say that’s all of our assets and our liabilities and we use the chart of accounts to build that up so now let’s go to report and let’s go into balance sheet and then we’re going to select last fiscal year that way um last fiscal year we know exactly what our balance sheet looked like before I start using QuickBooks notice that I’m going to have let’s click on run report actually let’s do the beginning of the year that might be easier uh because sometimes those um those uh transactions get created at the beginning of the year so let’s do the beginning of the year and then click on run report and then we’re going to get to see all of our balances so here’s the opening balances that we created here’s the opening balances of the vehicles then we went down here and we we have both of our vehicle loans we have our our two credit cards the AMX and the chase reward and then we have our opening balance Equity that adds up in there now a couple things I’m going to click on all of these uh amounts and this is sort of like General General cleanup sometimes these beginning balances get created at the very beginning of the year year January 1st I actually prefer to bring those back and make sure that those beginning balances are booked on the last of the year because I don’t want this year to show like a weird transaction that technically doesn’t belong to this year because that comes from previous here so you can click on all of these balances in your reports you can click on them and inspect each and every one of them and notice that I’m manually making sure that they they hit 1231 2023 which is what QuickBooks should it done in the first place is just you know sometimes it doesn’t work exact way you want it so I’m just going into each one of these transactions that were created and I’m changing the dates to 12:31 2023 to have a real clean uh balance sheet and then when I run it for last year it actually shows up full uh and complete um different than you know how it was showing when I first tried to do the the last year balance sheet and let’s say these credit cards to I think these should be good because both of these were entered I believe uh prior because those those we had a different date for if you remember those actually were done in December so that’s so the credit cards kind of obey that rule that concept that I mentioned but the other ones didn’t so let’s go back and run the balance sheet account one more time and let’s just make sure that all of the accounts that we uh that we want them to show up for last year uh because this is all the last year exercise they’re showing in there so let’s see both I got both my bank accounts great I have my loan accounts I mean my my assets I have my credit cards I’m still missing the loans so let’s if I push this date up one more day the loans should show up in here and the same thing I want to click on that going into each one and bring that back so that’s going to be um you know the most clean as possible when we run a a previous year and there’s a very specific reason for this which we’re about to show you and this is more of a sort of nerdy technical accounting reason why we do this but let’s just run the balance sheet one more time let’s make sure we got a clean balance sheet here so let’s do last fiscal year last year okay there we go so there’s my two Banks there’s my two assets perfect there’s my two loans and my two credit cards beautiful okay so now what I want to do and again this is a technical thing this is I do want to cover in this video Even though this video is more for beginners I don’t not do not like opening balance Equity accounts it’s actually there’s actually no such thing as an opening balance Equity account that’s a that’s a madeup concept from QuickBooks okay in the accounting World opening balance Equity is actually referred to as retain earnings retain earnings so what I’m going to do is I’m going to take this number 152 80473 let me go back here for a second I’m going to take that number and then I’m going to do a journal entry to move it out of op balance equity and into rain earnings and I do want to mention something again this is the type of stuff that you might need an accountant for journal entries are are tricky I’m going to go into journal entry and then I’m going to date this journal entry 12:31 2023 and then I’m going to move it from opening balance Equity so I’m going to move it out of equity which means I have to debit it and again this is this this is the accounting stuff this is where you might need an accountant to help you with this stuff and one of the interesting things about this is Most accountants never never want to touch the retent earnings account actually I agree with that you should never be touching retent earnings account this is a very special situation where I’m taking last year’s this is not a current clean year that I’m doing I’m taking last year’s entry for the beginning balances and I’m cleaning it up into ritten earnings because that’s where it’s supposed to go again this part super nerdy maybe something you could have you could have skipped but this is like for me the way I want to see a balance sheet um I want it to be right so that’s kind of where it belongs and and to take it one step further it might not all be retain earnings it could be retain earnings it could be beginning Capital it could be additional painting Capital there’s all sorts of layers of complexity in the equity account where you track not just the value of your business but the accumulated assets sorry the accumulated Capital that each of the business owners have uh in that business so gets a little bit trickier I won’t cover that but you do need to maybe discuss that with your accountant to make sure your QuickBooks is fully cleaned up okay so that’s the balance sheet and that’s the exercise of working uh balance sheet accounts uh in the chart of accounts let’s go back to the chart of accounts and let’s talk about profit and loss accounts whole another ball game so we’re going to scroll down here we’re going to see the very last account that has a running balance so right here the very last account had a running balance that’s my last Equity account then we’re going to start looking at our profit and loss accounts which starts with the income account there’s a whole bunch of accounts in here the default chart of accounts in QuickBooks created billable expense income refunds to customers sales sales or product income services on categorized income now I have my own spreadsheet that I put together this is for sanity check as an accountant helping hundreds of people a year get on QuickBooks Online sometimes I like using my own chart of accounts so I have my own profit and loss accounts that I keep track of this is because I prefer it this way and in my same spreadsheet I have a list of my um the default accounts that QuickBooks Online has that way I can compare you know the original one that QuickBooks has versus the one that um that that that I like to use now quick side note uh for all my clients that want me to set up the chart of accounts or not just me but my team I I typically give them my chart of accounts and I load my chart of accounts in there it gets a little tricky when it’s an existing account because there’s ex existing transactions that need to be reclassified but generally when it’s a brand new account I like using my own chart of accounts for my own reasons and the way I you the way I import the chart of accounts into QuickBooks is I export this into Excel and then I go uh in here into um into the chart of accounts click on the drop down menu and then click on import and then once I click on import I click on browse and I go choose the chart of accounts file that I have in Excel in my computer somewhere and then once I’m just going to pick this thing this thing although even though that’s not that’s not it I’m going to load an Excel file or a CSV file click on next and then I’m going to map you know what’s what in here and then once I map it I go to the next screen and then the next screen tells me okay this is all the accounts that you want to import and this is uh all of the account types that each one is supposed to be and then you start loading that in there again that this wasn’t an actual chart of accounts file but that’s how you import your own chart of accounts so if you happen to have a super clean beautiful chart of accounts in Excel that you like to use maybe something your accountant gave you maybe you yourself are an accountant and you have your own template that’s how you would import it you would go into the little drop- down menu next to New and click on import okay um and again I’m going to scroll down here and then I’m going to go through the income accounts and let’s say I don’t like this configuration let’s say I have one account that is for on-site services and one account that is for let’s say remote services so I don’t like any of the income accounts that QuickBooks has here for me so I can create my own categories my own income category so I’m going to go to new I’m going to select the account type I’m going to scroll down to income and then I’m going to pick on detail type the one that makes the most sense so let’s say service fee income and then I’m going to put in here onsite Services okay and then I’ll try to do save a new see if it works this time sometimes it does sometimes it doesn’t let’s go to new again and let’s click on the drop down menu go to go to income and then I’m going to select let’s say um service fee income again and then this will be remote services so let’s say all of my sales of whatever I do service Consulting whatever it is essentially all of the income is going to fall into just those two categories the uh let’s just scroll down here it’s going to be the remote services or the on-site Services now I can start getting rid of the accounts that QuickBooks give me so for example this one called refund to customers I can click on the drop down menu and click on make inactive makeing active and I get rid of that it doesn’t disappear but it gets removed from there some accounts might give you issues like this sales sales of product income and Services I selected three then I’m going to scroll all the way in the top I’m going to click on batch actions and click on make inactive so I’m going to try to make three inactive but anytime there’s an error that it says look I can’t make that inactive because I have it tied to a product or service then you won’t be able to so I’ll do the ones that it can and then I’m going to scroll down one more time see what it allowed me to get rid of and what what it kept so it looks like it got rid of the other two but it kept this one here called uh services and then there’s this one called uncategorized income essentially there’s three accounts you can get rid of one is billable expense income if you get rid of that it comes back 10 times so don’t get rid of it bailable expense income leave it there and on carag income same thing if you delete it it keeps coming back so you have to keep that in there is because those two accounts are used by QuickBooks to fulfill a very specific mechanic so you might need to keep keep those in there um so just leave them in there for now but I do want to address this one services that I couldn’t get rid of let me just try to do it one more time so you can see it says I can’t use it because a product or service is tied to that so we’re going to take a quick break out of chart of accounts and then we’re going to talk about product Services because this is an important link between your items and your accounts so I’m going to click on the gear menu on the top right I’m going to go to an entirely different section of QuickBooks which is products and services and these are going to be the individual items the things that you sell or or buy that you’re going to track in transactions and those tie or map to accounts so then when you create um transactions with those items QuickBooks knows where to C cize them so let let’s let’s add a new item here I’m going to this is I’m in the products and services list and if I go to new I’m going to add a new item I’m going to call it a service and I’m going to say remote one hour support or something like that and just keep in mind this is the name of the item the product or service that’s going to show up in the transaction like an invoice okay then in here I can put uh remote services for it support with with level one Tech or something like that now under income account this is where you pick which category this item ties into and notice this is tied to Services which is why I couldn’t delete the service account but I’m going to tie this one called remote 1 hour support I’m going to uh tie it to my remote account actually let me ex out of that and click on refresh one more time because I just changed the accounts and sometimes it takes a few minutes for the accounts and the items to to sync together so let’s go back into add item let’s do service let’s do remote 1our support okay I want to copy and paste this into the description click on the drop- down menu and now I should see my remote Services there we go and let’s say this is what we charge uh $75 an hour for and if I plan to pay a vendor pay a contractor or something like that for the service and use the same item then I want to put um the same thing here and put uh paid to contractor or something like that and then on their expense account I’m going to put the expense account you know maybe something on cost to good sold called subcontractor expenses or something like that if I’m going to pay a third party to fulfill the work that I’m selling to these clients and let’s say I pay the third party $40 an hour or something like that it’s $40 an hour and if I have a very specific unique person that I always use you can put that on their prefer vendor but you don’t have now this option to make a product or service um eligible for a purchase transaction that’s an option if I uncheck that that goes away completely okay so let’s click save and close and there we go here’s my remote 1H hour support now this item call services I’m going to edit it and I’m going to tie it to let’s say the remote Services click on Save and close and then this one called hours and QuickBook cre is for you I’m going to tie it to remote Services click save and close and the way I the reason I did that is because I don’t want any items to tie to the generic Services account so I can actually delete it so I’m going to click on the gear menu and and check uh income account here so it shows up in the list that way I’m just double-checking that nothing is tying to that generic Services account then I’m going to go back into my gear menu I’m going to go back into chart of accounts and then I’m going to see if it lets me delete that Services income account that I don’t want so let me scroll down here and go into services and let’s now cross our fingers click on making active click on making active and beautiful it let me do it so that’s the tie between the your your uh your items and your accounts let’s see if I can delete this uh sale of product income does it let me do that one too yes it does so now I’m down to the two accounts I actually want on-site services and remote services and the two that I can delete which is uncategorized and building expense income so that’s how you create accounts your own categories that’s how you tie them to products and services and as you go down you can check all your cost to good sold or your expenses and check which is the account which is the account or list of categories that you want when you actually um look at your reports okay so let’s take a look at that let’s go into reports and let’s go into profit and loss and let’s see what a profit and loss looks like with the chart of accounts the way we have it now there are no transactions in my profit and loss because this is a blank file so when I run the profit and loss it just shows a bunch of zeros now one of the cool things about reports in QuickBooks Online is there’s an option for you to show Force show accounts even if they don’t have activity so if I click here it says all and then click on run reports all the accounts in my chart of accounts should show but there’s one caveat for this to show there has to be at least one transaction so at least one transaction in there all the accounts will show so we’re going to create one transaction to make make it very simple here so I’m going to go into new I’m going to go into invoice I’m going to just I’m going to create a customer here really quick let’s do ABC customer and I have an entirely different video that covers invoicing and accounts receivable make sure you check the description all the videos that I have walking you through how to use QuickBooks but essentially I’m going to use this item that we created called remote one 1 hour support let’s say that we’re charging for 3 hours of that and we’re going to date this uh this year at some point uh that way it shows up in the current report that we’re looking at and then we’re going to click on Save and close so as long as there’s one transaction this is a very interesting thing about QuickBooks as long as there’s one transaction in here I can click on where it says all rows and active columns click on all and then essentially my entire chart of accounts will show here now unfortunately it also shows all the deleted accounts which is kind of a bummer because it does kind of mock with the organization of my chart of accounts so you do want to be careful with creating and deleting creating and deleting because as time goes by if you want to use this option to show all you’re going to be showing or you’re going to be seeing this deleted account so that’s kind of a the the bummer now when you look at your profit and loss and we have an entirely different video that talks about manipulating reports and customizing reports but I’m just going to show you something really quick I’m going to click on customize here and then I’m going to click on accept zero amounts basically to show the zeros which makes the report a lot easier to see I’m going to click run report and I do want to warn you one more thing I’m recording this video in January of 2024 and uh QuickBooks is going through a transition period where these profit and loss reports are going to look different they’re going to have an entirely different screen and these settings are going to be all over the place you should have the same settings but it won’t be the same I this is not a reports video and I will be doing updating updated videos focused on just reports all I’m trying to do is illustrate how chart of accounts flow information into uh your your reports now let’s say for example that this one called refunds to customer sales deleted sales of product income and Services I want to kind of just Nest them under one General category that way it’s easier for me to just visually hide them per se if I want to so let’s go back into the chart of accounts okay and I’m going to open chart of accounts in a new tab that way I can keep my report reports here live and I can just click refresh when I’m ready to to look at it again then I’m going look at these accounts that were deleted so I’m going to click on the on the settings button on the right hand side of the screen and click on include inactive that way it’s going to show me the deleted accounts for some reason QuickBooks calls deleted and inactive accounts uses the same term we’re going to go down here and look at all my accounts that are deleted I’m going to create a general category let’s go to new here I’m going to make it income and then I I’ll pick any of these categories here it doesn’t matter and I’m going to call this old slash deleted accounts and I think you’re starting to kind of see read between the lines on what I’m going to do let’s hit save and close here and then I’m going to go look at all my accounts that we deleted okay let’s scroll down here until I find it there we go let’s go to this one go to uh make active so I’m making it active temporarily I’m going to edit it make it a sub account account of my deleted perfect then click save and close then I’m going to go to my next one this um refunds to customers click make active edit delete make that a sub account of my deleted group and by the way there’s nothing wrong with an account called refunds to customers you can have that um just because I’m deleting it doesn’t mean that you need to delete it it’s just I don’t want it right so you need to plan with your accounting team exactly how you want your accounts to look this is this is a very personal very specific to your company decision that you need to make so I’m going to click on edit and then make that a uh put that under the deleted group and then click save and close and now all my accounts have been deleted which I temporarily made inactive so I can edit them they’re showing up here so now I can just select all these here and then go into batch action make inactive and I should be able to make them all inactive in one shot uh with the exception of all the ones that have scroll down here the income so with the exception to the parent account the parent account that still needs to show up in there I can only make the sub accounts I can only make the sub accounts inactive um the the parent account is the one that has to stay active okay so when I go back into my profit and loss report and click on run report what’s going to happen is now all these deleted accounts get it under that one category and then I can just click on the little triangle and then collapse it and then it cleans up the p&l for me and essentially the purpose of doing this sort of work on your chart of accounts is when you go print your chart your your profit and loss report and you have two three four pages that you’re monitoring your numbers in you’re going to lose uh the ability to really analyze your numbers cuz you print a report in four pages is too much so you want to work as hard as possible to not have the goal is not to have as many accounts as possible it’s actually the opposite the account is to have as little accounts as possible that provide the most amount of Rich information you need for your business based on your needs okay so as we go down let’s start looking at all the other categories we have advertising and marketing Let’s see we like that uh let’s say I want to have something called traditional Media or something like that or radio ads I can do that I can come back into my chart of accounts go to new and then go down to expense and then I’m going to call this one radio ads click on the sub account drop down menu and then that’s under advertising and marketing and then click on Save and close I created one more category switch back to my profit and loss there’s three there now so if I refresh then I I expect to see now four in there now do you need to have all this no you don’t uh some people just have advertising marketing no sub accounts and they throw everything in there this is personal based on your business needs you need to figure out how many accounts you have under employee benefits you see there’s a whole bunch of sub accounts there pretty clear we’re going to collapse that under General business expense again a whole bunch of other categories let’s say we like that and we um and we we uh collapse that under insurance we have all these sub accounts that’s great beautiful and again your chart of accounts might look different keep that in mind this is the default chart of accounts that was created with the company I company file I created in QuickBooks yours could look completely different there’s actually no guarantee yours will look anything like this um and then there’s interest paid there’s legal and accounting services there’s meals with clients and travel meals there’s office expenses and then all these subcategories there is payroll expenses and in here you would have uh stuff like payroll tax and that sort of thing even though there’s a category for taxes paid and payroll taxes is in there let’s say I don’t want payroll taxes to be under taxes paid let’s say I want it under payroll taxes so I go back into my chart of accounts I can search the specific account here that way it’s there so taxes paid okay so very specific account called taxes paid I’m going to click on edit and this is my uh taxes paid account actually I’m looking for the one under it which is uh payroll taxes I believe sorry so let’s look at payroll payroll taxes this one right here payroll taxes we’re going to edit that one and then make that a sub account instead of taxes paid I want to move it and make it a sub account of payroll taxes again there’s not this is not the rule or it needs to be like that or not this is based on your needs let’s run the report let’s go down here one more time and then under payroll taxes now I have under payroll expenses I have both payroll taxes and wages so you’re going to go through all these and you’re going to figure out what what works uh best for you look there’s utilities there’s uncategorized expense this is another example uncategorized expense is one that you cannot um change QuickBooks gives you that you have to leave that in there because when you connect your Banks and QuickBooks just so know where to categorize stuff that’s where it’s going to put it it’s going to put it under uncategorized expense now sometimes uh you can merge accounts that are redundant so let’s say for example that you think electricity Heating and Cooling are redundant you can collapse them to or you can merge them and let’s say for example disposal weight fees and water and sewer we also want to do the same thing with that so let’s do one at a time let’s take this one called Heating and Cooling I’m just going to copy the name of it and uh actually let’s that we’re going to collapse that with merge that with electricity so I’m going to copy electricity actually I’m going to copy electricity then I’m going to go find heating and cooling in the chart of accounts let’s type here Heating and Cooling there we go we’re going to edit that and we’re going to change the name from Heating and Cooling to electricity see I had copied it so I pasted it then when I click on Save and close QuickBook says hey wait a second that account it’s already duplicated would you like to uh merge it then you’re going to say yes now these chart of account screens they they look different for different people sometimes you will have a drawer on the right hand side that will be um that will be showing the me the mechanics of creating these accounts and instead of having the popup they would look slightly different so I’m going to show you what that looks like just in case your screen looks one way or the other this is the tricky part of dealing with QuickBooks as sometimes you’ll have one screen that behaves a certain way versus others that behave a different way depending on your company file so I switched I switch modes so now you’re going to see the chart of accounts behave a little bit different when you create new accounts but anyway let’s go back into here and let’s say that this one um disposal weight fees this is the one that we’re going to condense into water Water and Sewer so I’m going to copy Water and Sewer I’m going to go back into my chart of accounts I’m going to look for uh the waste fees this PO on W fees I’m going to click on edit and see now it’s a drawer on the right hand side so depending on again your QuickBooks might have the popup you saw earlier or it might have the drawer on the right hand side and they work very similar and we’ll kind of show you how this one works so I’m going to change the account name in here to match Water and Sewer it’s under the category utility so it’s under the sub category of utilities and then now the the warning for the merge is actually the two lines highlighted in red letting you know this is going to be merged then when I click on Save now we get instead of a popup that says would you like to merge them you get the message right there inside the drawer and then you click on merch accounts and then click on Save and then does the same process so it kind of gives you an idea more or less how that how that functions let’s say I want to go back in here click on U run report go look at my accounts and you’re going to see the old um utilities you’re going to see the old accounts that were deleted again that’s only because we’re forcing them to show because we did the all options once you enter transactions anything that’s a zero gets automatically deleted but let’s say that this one called Water and Sewer I want to add water sewer and Disposal I want to rename it so I go back into the chart of accounts I’ll type water okay I’m going to click on the uh the edit menu and then now it’s using the drawer on the right hand side so we’re going to change the name from water sewer and Disposal fees notice that there’s always a description you can keep it you can delete it the descriptions really don’t do anything other than kind of help you guide you to the process let’s click on Save and that does that so I go back into my profit and loss I click on run and then scroll down to utilities and and there we go here’s all my C categories let’s go down to travel we have airfare hotels taxi and right chair vehicle rental uh let’s say I also want to have um a category for um for miscellaneous things that I buy at the airport for example so one called misscellaneous travel so I want to create a category under travel just called miscellaneous travel and that’s typically a good idea to have like I don’t like miscellaneous accounts but whe they’re under a general category I’m okay with that so I can have one called miscellaneous travel I can have miscellaneous utilities I can have miscellaneous uh payroll I can have miscellaneous office that sort of thing so there’s sort of a catch up opt a catch all option inside those uh parent categories so let’s go and create a miscellaneous travel so let’s come in here let’s go to new now we have the drawer on the right hand side we’re going to pick expenses remember income and expenses are profit and loss accounts under save account under this is where I’m going to go look for that travel account so see the mechanics a little bit different it doesn’t have the check box that says sub account off you just pick whether it’s going to be part of the general expenses or if you don’t pick the parent General expenses then automatically by picking anything that’s under that that would be like sub account off so let’s go down to travel and then we’re going to call this one miscellaneous travel okay yes I’m putting it uper case I know driv some people with OCD crazy so let’s go to save and let’s go back to our profit and loss and click on run report so we know exactly what’s happening and there’s my miscellaneous uh travel okay now one thing to keep in mind uh QuickBooks always orders accounts in alphabetical order unless you turn on account numbers if you turn on an account number then it’s going to use uh the account number for ordering which is my preferred method of looking at accounts so how would that work so let’s click on the gear menu let’s go to account and settings and let’s go down to Advanced and then here where it says chart of accounts we’re going to click on enable chart of accounts and show the numbers so we’re going to turn them on so they show up in the chart of accounts and we’re going to show them so QuickBooks can order it by account numbers then I’m going to click on Save and click on done and most beginner users don’t like account numbers they feel it confuses them accountants love it account number so we got to figure out somewhere in between you know I think it’s not that bad once you get used to using account numbers it’s pretty easy as I mentioned earlier I have my own chart of accounts that already has all the numbers so obviously for me it’s easy to deal with that because every single client I have uses the same account the same number uh combination so as I categorize stuff I start memorizing the numbers and I sometimes I just look at a expense and just say oh yeah that’s 529 or something like that so that’s the cool thing about uh knowing your chart of your your numbers in your chart of accounts that you can either recall it by name or recall Call It by number we’ll show you an example of that so let’s uh come back into our chart of accounts and I’m going to click on the on the gear menu actually let me click on refresh because I just turned on the account number so this screen now to needs to recognize that we have account numbers and I’m going to click on the gear menu and make sure that the number it’s enabled so now I see that my numbers are enabled and I’m going to start adding numbers to these accounts and this is kind of a long process I know but this is the problem with the default chart of accounts don’t give you good account number so for example I’m going to add account numbers only to the ones that I’m going to use so I’m going to use on-site services and remote Services those are the two that I want to use and everything else I leave it without a number you can do that so let’s go into the little pencil button here and then go down to each of the accounts that I want to edit the number for so let’s say that for my on-site Services I’m going to call this [Music] 4101 and the remote Services is 41 1 02 now rot of thumb income accounts start with four cost to good hold accounts start with five expense accounts start with six or seven and then other income and other other expense is going to be eight or nine and then assets let me just go up here assets like bank accounts those start with ones so let’s do 10101 10102 10103 uh liabilities start with two so for example let’s say I’m going to go down to my credit cards and do 2011 0 2011 01 2011 02 notice the the logic that I’m using for these okay I’m trying to keep them into like logical sequence so when they’re organized like that and then the equity accounts they’re all start with three so let’s do 30101 and then I go down here and just assign mul actually opening balance Equity I don’t use that one so I would never put a number on something that I don’t mean to use and that’s kind of like how I that’s kind of how I how I see uh Logic on these things um where if I know for a fact I don’t mean to use it then I don’t put an account number on it so I’m just going in here these are the equity accounts so I’m putting account numbers on those and again I’m not going to do it um in all of them I just kind of want to show you uh the process let’s go down down here to travel and let’s say travel is going to be 60700 and then my expenses are going to be 60701 I want miscellaneous to be the last one so let’s do 60702 skip miscellaneous 6073 vehicle rent or 6074 and then for miscellaneous I’ll do 6075 so notice I’m I’m overriding the numbers in here so I can see the accounts on a very specific order and that’s what I’m going to that’s what I’m trying to show you that the the the account numbers actually have rhyme reason and logic to them so let’s go back into my profit and loss account report click on reports notice that the two accounts I want to use have numbers the ones that I don’t don’t and that’s kind of how I differentiate between the chart of accounts I meant to put together and the chart of accounts that QuickBooks is forcing on me if I can delete the accounts go down and look at my travel notice that my travel is showing number one because none of the s have numbers so of course numbers take priorities and notice that QuickBooks is now respecting the number sequence all of these accounts need to have a a natural number sequence and especially if they’re going to be sub accounts make sure that you um make the parent account a z0 or something like that so you have plenty of numbers under that so you can have a logical sequence inside of those parent accounts and that’s essentially how you organize your chart of accounts and essentially what the chart of accounts means to you is the way you organize your categories of your income and expenses so you can put them in a financial reports in ways that it that it’s useful and meaningful to you so anyway this is that was the chart of accounts master class hopefully that was helpful hopefully you get acquainted with your own chart of accounts that way you um you have a really pleasant experience using QuickBooks I do sell my chart of accounts for advanced users for accountants that like my own chart of accounts I’ll put a link into that in the description somewhere if you want to buy the chart of accounts and import it into your QuickBooks online if you’re a brand new user if you’re not an accountant I don’t recommend you just import your accounts in your QuickBook file because it could become a mess so that’s really for the advanced type of users and if you need private one-on-one support um to help you set up your own chart of accounts I’ll put my contact information my firm’s contact information down in in the description and you can schedule and oneon-one with one of the consultants in my firm to walk you through this process and make this a real uh a personal experience that pertains to your business that way your chart of accounts doesn’t have anything you don’t need only the stuff you actually need and understand anyway hope this video was useful check out the description for all other videos that we recommend you watch after this as part of you know a multi-part series And subscribe to the channel to to be notified when I create another video just like this thank you and I’ll see you on the next one

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • ABBA in Switzerland (1979)

    ABBA in Switzerland (1979)

    ABBA in Switzerland (1979)
    THE SHOW a Tribute to ABBA

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Partition’s Legacy: A Conversation on India and Pakistan

    Partition’s Legacy: A Conversation on India and Pakistan

    The provided text is a transcript of a discussion about the partition of India and Pakistan. The conversation explores the complex history of communal violence and its lingering effects, examining the perspectives of both Hindus and Muslims. Participants debate the roles of key figures like Mahatma Gandhi and Muhammad Ali Jinnah, and discuss the ongoing challenges faced by religious minorities in both countries. The discussion touches upon various historical events and their impact on communal relations.

    Partition and Identity: A Study Guide

    Quiz

    Instructions: Answer each question in 2-3 complete sentences.

    1. According to the text, what are some of the speaker’s conflicting feelings regarding the partition of India?
    2. How does the speaker describe the role of various groups in the violence that followed partition?
    3. What is the speaker’s perspective on the legacy of Mahatma Gandhi?
    4. The speaker expresses concerns about the treatment of Muslims in different regions. What examples are given to illustrate these concerns?
    5. What does the speaker say about the idea of “terrorism” in relation to specific groups and historical events?
    6. How does the speaker address the claims that only one side suffered because of partition?
    7. The speaker mentions specific historical events, such as the Babri Masjid demolition. How does he connect this to the broader issues he discusses?
    8. How does the speaker describe the concept of “minority” populations in India and Pakistan post-partition?
    9. What is the speaker’s opinion on forced conversions and marriages of girls in the region?
    10. How does the speaker’s own experiences shape their perspective on being labeled “pro-Pakistan”?

    Answer Key

    1. The speaker expresses feeling torn between a desire for separation from what he sees as a foolish country and the pain caused by the partition. He also admits to a lingering jealousy towards those who initiated the partition while simultaneously acknowledging the weight of the responsibilities it created.
    2. The speaker suggests that various groups poured fuel on the fire, as per their own intentions, and instigated the violence. He does not absolve any group, and in fact says that, “all four” (of whatever parties) were involved in making things worse.
    3. The speaker is conflicted about Gandhi. He expresses some admiration but questions Gandhi’s approach and states that some are “Gandhi worshipers,” implying some may be blindly following him. He also brings up the alternative view of Nathuram Godse, who is very well known for assassinating Gandhi.
    4. The speaker highlights concerns about the status of Muslims in India and Bangladesh after partition, the attacks on Muslims following the Babri Masjid incident, and perceived discrimnation in Pakistani society. The speaker also mentions the loss of homes and property suffered by many Muslims.
    5. The speaker argues that the term “terrorist” is often applied inconsistently, pointing out that groups like the Tamil Tigers have also committed acts of violence, and says it is too easy to point fingers at Muslim and Hindu religious groups for violence. The speaker suggests that anyone who harms innocent people can be considered a terrorist, regardless of their group or affiliation.
    6. The speaker challenges the notion that only one side, specifically Hindus, suffered losses. He contends that both Hindus and Muslims suffered deeply during the partition, sharing accounts of both sides experiencing loss, violence, and displacement.
    7. The speaker connects the attack on the Babri Masjid to the treatment of Muslims and suggests that these events are a continuation of historical oppression. He expresses anger and concern that these attacks can happen with impunity.
    8. The speaker points out that the minority populations of Muslims in India have grown significantly since partition, while the minority populations of Hindus in Pakistan have declined drastically, raising questions about the unequal treatment of minority groups in both countries.
    9. The speaker is completely against it, calling it out as an abusive act, particularly in forced marriage situations. The speaker mentions the idea of being forced to convert.
    10. The speaker expresses frustration about being labeled “pro-Pakistan” despite his identity as someone who lived in India and never claimed allegience to Pakistan. He is critical of this easy categorization, which he feels stems from nothing more than his name.

    Essay Questions

    1. Analyze the speaker’s internal conflict and the complexities surrounding national identity in the context of the partition. What are the various competing forces that shape the speaker’s sense of self?
    2. Examine the speaker’s critique of historical narratives and the role of differing perspectives in shaping accounts of partition. How does the speaker challenge dominant viewpoints?
    3. Explore the speaker’s discussion of violence and terrorism, considering the diverse examples they present. How does the speaker attempt to challenge a simplistic understanding of these concepts?
    4. Discuss the speaker’s concern with the treatment of minority populations in the region. How do specific anecdotes and statistics contribute to an understanding of the issues the speaker raises?
    5. Using details from the source, evaluate the speaker’s viewpoint on the legacy of partition and its enduring impact. How does this viewpoint contribute to a better understanding of the complexity of the period?

    Glossary of Key Terms

    • Partition: The division of British India into the independent nations of India and Pakistan in 1947. This resulted in large-scale displacement and communal violence.
    • Communalism: The socio-political ideology that prioritizes the interests of one’s own religious or ethnic group over the interests of society as a whole, often leading to tensions and violence between different groups.
    • Mahatma Gandhi: A prominent leader of the Indian independence movement known for his philosophy of non-violent resistance. His legacy is complex and contested, with both fervent supporters and critics.
    • Nathuram Godse: A Hindu nationalist who assassinated Mahatma Gandhi in 1948. His actions are often seen as emblematic of the extreme violence that erupted in post-partition India.
    • Islamic Fundamentalism: A term referring to various movements emphasizing strict adherence to religious doctrines and often associated with political activism and violence.
    • Tamil Tigers: A separatist militant group that fought for an independent Tamil state in Sri Lanka. They were known for their use of suicide bombings and were designated a terrorist organization by many countries.
    • Babri Masjid: A mosque located in Ayodhya, India, that was demolished in 1992 by Hindu nationalists, leading to widespread communal violence. The event is a touchpoint in communal relations in India.
    • Article 370: A constitutional article that granted special autonomy to the state of Jammu and Kashmir. This article was revoked in 2019.
    • Lok Sabha: The lower house of the Indian Parliament.
    • Minority: A group of people that is distinct from the majority population in terms of ethnicity, religion, language, or other characteristics, and who often face discrimination or marginalization.
    • Mukti Bahini: A guerrilla organization in the former East Pakistan (now Bangladesh) that fought for independence from Pakistan during the 1971 war.

    Partition’s Legacy: A Critical Reassessment

    Okay, here’s a detailed briefing document summarizing the main themes and ideas from the provided text:

    Briefing Document: Analysis of “Pasted Text” Excerpts

    Overview:

    This document analyzes excerpts from a transcribed conversation, likely a debate or discussion, exploring complex historical and political issues related to the Partition of India, Hindu-Muslim relations, and the legacy of violence and discrimination in the region. The speaker expresses a range of personal perspectives, challenging dominant narratives, and raising uncomfortable truths about all sides involved. The tone is passionate, at times accusatory, and often attempts to counter what the speaker perceives as biased viewpoints. The speaker identifies strongly with the experience of Muslims but critiques Muslim behavior as well, showing an internal awareness of hypocrisy in the debate of fault and blame.

    Key Themes and Ideas:

    1. The Pain and Legacy of Partition:
    • Personal Anguish: The speaker expresses deep personal pain connected to the Partition, referring to it as a “fire that flared up in which humanity was destroyed.” They acknowledge the enduring pain of those who were displaced and lost loved ones, including the sentiment of not being able to “part ways” with their “foolish country.” The speaker states that they are “jealous of those who broke” the country, indicating a deep sense of betrayal and historical trauma.
    • Unresolved Trauma: There’s an insistence that the pain caused by Partition is still present, with the speaker declaring, “those who have reached there are still crying, those whose Nazari If she is crying then it is our responsibility to find and see the people who did this.” This highlights the need for accountability and acknowledgment of suffering.
    • Questioning the Necessity of Partition: The speaker challenges the fundamental logic of Partition, asking, “Was it necessary to divide the country, was there no other way for the benefit of the Muslims?” This suggests a critical perspective on the choices made by leaders and a belief that perhaps a more peaceful solution was possible.
    1. Critique of Leadership and Historical Narratives:
    • Disappointment with Gandhi: The speaker expresses confusion and some skepticism about Mahatma Gandhi’s methods, saying, “here I am confused when No doubt I was not convinced You must know that I am saying this with full justice, whoever went and tried to stop this attack, in fact as far as possible all the four have poured fuel on it as per their respective intentions.” They criticize the tendency to blindly worship Gandhi and acknowledge that some people despise Gandhi. This signals a rejection of simplistic hero-worship and a demand for nuanced analysis of history.
    • Criticism of All Sides: The speaker argues that all sides involved in the conflict, including political leaders and groups, “poured fuel” on the fire and are culpable for the violence. There’s a rejection of a single scapegoat narrative.
    • Challenging the Glorification of “Heroes”: The speaker questions the idea of terrorists being labeled heroes, stating, “We cannot give tests at places where innocent people are prosperous.” They argue that anyone who harms innocents, regardless of their background or stated cause, is wrong.
    1. Terrorism, Extremism, and International Influence:
    • Critique of Labeling: The speaker questions the automatic labeling of any group with a beard or association with Islam as terrorist groups, stating “You find it very easy to point your finger at bearded people or pandits. Because they celebrated their own Holi but Afzal celebrates it here because he did not celebrate Holi with beard Holi, he celebrated the city Holi, so these people are good”. They are critical of the tendency to blame entire groups of people for the actions of extremists.
    • Western Influence: The speaker implicates America and the west in funding and creating extremist groups stating “why are you making this film when America is fighting with Russia So he created this group, he did not create the group for that reason sir, he did not name it nor did he get it created, he created Theke Mujahideen.” This suggests that external powers have exploited regional tensions for their own gain.
    • Equating Extremist Violence: The speaker attempts to show that different extremist groups are not that different from each other, stating “you will be surprised that the maximum number of suicide attacks have been done in the Tamil Tiger group” which is not traditionally a Muslim group. This highlights a critique of bias based on religion or identity.
    1. The Plight of Minorities and Discrimination:
    • Muslim Suffering in India and Pakistan: The speaker argues that Muslims have not gained status or security in either India or Pakistan and were harassed in both countries. They declare, “Muslims neither got their status in India nor did they get their status in Bangladesh, Muslims got their status in these three places, Muslims were harassed there.”
    • Loss of Muslim Property in India: The speaker states that after partition Muslims were targeted in Delhi, losing a disproportionate amount of property: “85% of the property was theirs, today everything has been snatched away from them.”
    • Persecution of Hindus in Pakistan: The speaker highlights that while the Muslim population in India grew from 9% to 15%, the minority Hindu population in Pakistan drastically reduced from 22-25% to 3% after partition. The speaker questions why Hindus in Pakistan were driven out.
    1. Critique of Democracy and Majority Rule:
    • Questioning Democracy’s Fairness: The speaker expresses skepticism about democracy, claiming that if the majority ruled in India, then Article 370 wouldn’t have been removed and that Kashmiris would have been kept as prisoners. They suggest that democracy can be used to oppress minorities stating, “I wish to see the ir running, brother, they have kept you captive in your morning, neither the man in power nor the international community is talking about it nor are they doing anything.”
    • Fear of Oppression: The speaker fears that under a different rule in India there would be greater violence, stating, “if this was not in the taxi and if the majority had ruled, then by removing Article 370, people would have been kept as prisoners in Kashmir, people would have been sitting in the same way in your Lahore too.”
    • Critiquing Double Standards: The speaker points out inconsistencies in how terrorism is defined depending on who carries it out and says “when the British used weapons in the Spanish War, they were not terrorists, they were British” which shows a double standard.
    1. Personal Identity and Experiences:
    • Being Labeled a “Pakistani”: The speaker expresses the frustration of being labelled “Pakistani” in India solely because of their name, despite having no allegiance to Pakistan. They say, “I was considered Pakistan Nawaz because of my name, whereas I neither raised the flag of Pakistan nor did I make me Pakistan, nor did I become the Prime Minister of Pakistan, living in India I will be called Pakistan.”
    • Firsthand Witness to Violence: The speaker states, “I am a match for Lahore, since I am near Talab, you can talk to me, I have still seen you here in our place, those forced marriages, converting girls.” They position themselves as having personal knowledge of the issues.
    • Contradictions in Family History: The speaker raises the issue that Muslims have been accused of having multiple wives and says that their “grandfather had married twice and he had brothers, all four of them had married twice each, I asked questions in my family, they remained stuck in my family.” This points out hypocrisy in blaming a whole community for individual actions.

    Key Quotes:

    • “Jot Bhai Free, the fire that flared up in which humanity was destroyed, when I searched for it, I found Nation Free”
    • “If I had to tell my motherland that I want to part ways with my foolish country, then I know that the pain of the person who broke my country, I never go away and I am still jealous of those who broke it”
    • “I believe in Gandhi, I think Gandhi is a good man, at the same time there are some people who abuse Gandhi, so what do you say to this”
    • “We cannot give tests at places where innocent people are prosperous. Wherever someone harms innocent people, by any name or any organization, I don’t even talk about it.”
    • “Muslims neither got their status in India nor did they get their status in Bangladesh, Muslims got their status in these three places, Muslims were harassed there”
    • “It seems as if we got the partition done so that their ticket is also in Hindustan”
    • “My question is that the population there has grown. You say that despite the caste system, it seems that many people say that the Muslims have adopted only one mission.”

    Conclusion:

    These excerpts showcase a complex and critical perspective on the history of the Partition and its lasting consequences. The speaker challenges conventional narratives, calling out hypocrisy and seeking to expose the uncomfortable truths underlying this painful period in history. The passionate and sometimes contradictory nature of the statements indicates the speaker is grappling with a deep sense of injustice and a desire for reconciliation based on honesty and accountability. The speaker highlights the suffering and the lasting impact of these events while holding multiple identities and communities to task.

    Partition’s Legacy: A Critical Reflection

    FAQ

    • What is the speaker’s perspective on the Partition of India, and what lasting impact did it have?
    • The speaker expresses a deep sense of pain and confusion regarding the Partition of India, referring to it as a “fire that flared up” and destroyed humanity. They acknowledge the breaking of their country as an event that caused enduring pain and jealousy towards those involved in it, and a responsibility that is still felt. The speaker laments the displacement, loss, and the suffering of individuals who were affected. They question the necessity of dividing the country and whether there was an alternative for the benefit of Muslims. They highlight the continued suffering of those displaced by the Partition, particularly from 1947. They believe it was an action that caused more harm than good and divided a country unnecessarily.
    • The speaker mentions Mahatma Gandhi and Nathuram Godse. What are the contrasting views presented about Gandhi and his legacy?
    • The speaker expresses a complex and conflicted view on Mahatma Gandhi. They acknowledge that many worship Gandhi and agree with his viewpoints. However, they also highlight the perspective of those who oppose Gandhi, such as those who believe in Nathuram Godse, his assassin. They are also critical of Gandhi’s approach to conflict resolution, such as the idea that one should be hit first before others move, which they find confusing. The text also brings up the fact that many people disagree on his legacy and even see him in a negative light.
    • How does the speaker view the issue of terrorism and who they consider to be responsible for it?
    • The speaker argues against the common practice of exclusively associating terrorism with specific groups, like Muslims. They point out that the Tamil Tigers were responsible for the maximum number of suicide attacks. They highlight instances of attacks being carried out by other groups, and criticize the tendency to quickly blame Muslims or Hindus, while overlooking the larger and more complex issues behind these acts. They feel that the West and Russia are responsible for funding terror groups for their own ends, and that such groups are not representative of larger religions. They strongly believe anyone who hurts innocent people is a terrorist regardless of religion or background.
    • What is the speaker’s concern regarding the treatment of minorities in both India and Pakistan?
    • The speaker is highly concerned about the treatment of minorities in both India and Pakistan. They discuss how both Hindus and Muslims suffered immensely due to the Partition. They point out that although the population of Muslims has grown significantly in India since partition, the minority population of Hindus in Pakistan has dwindled. The speaker details that in Pakistan, they believe minorities are seen as less than others and that their basic rights are not respected. They also feel that the problems and violence experienced by minorities in both nations are often ignored.
    • What are some of the specific examples given by the speaker to show how Muslims and Hindus are treated in India?
    • The speaker refers to the example of the Babri Masjid attack to illustrate the precarious situations that Muslims often face. They talk about Muslims seeking shelter in Hindu neighborhoods, but also the financial support from Muslims to mosques, and how Hindus and Muslims supported each other in the aftermath of this event. They talk about how they feel Muslims are unfairly kept at a lower standard than Hindus in India, as if they had to prove their loyalty to the nation. The speaker also points to the rise in the Muslim population in India as evidence that they are not oppressed as a whole, highlighting the complex situation of the country. They refer to examples like Abdul Kalam being made President as proof of Muslim inclusion in India.
    • What is the speaker’s critique of democracy and its impact on minority groups?
    • The speaker voices skepticism about the fairness of democracy, particularly when it comes to the treatment of minorities. They suggest that in a majority-rule system, the needs of minority groups can be easily overlooked or suppressed, especially if they’re seen as a problem for the majority. They are also critical of democracy being used as a weapon to oppress people, like the revocation of Article 370 in Kashmir, in which case they believe those people have been kept as prisoners.
    • How does the speaker’s personal experience shape their views on the issues discussed?
    • The speaker’s personal experiences greatly shape their views, showing that they have lived in both India and Pakistan. They reference their familiarity with Lahore and how they believe the people there are similar to them. They feel that people in Pakistan have similar experiences with their families having dual marriages, for example, and that some Muslims in India are unfairly seen as loyal to Pakistan instead of India. They also highlight their lived experience of being labeled as “Pakistan Nawaz” simply because of their name, despite their deep ties to India. They talk about their family history and how it has been affected by the decisions of their elders and the Partition. Their intimate awareness of events on both sides of the border shows their deep ties to both cultures and people.
    • What is the speaker’s main argument for open, unbiased discussions about the issues faced by various communities?
    • The speaker repeatedly emphasizes the need for open, honest, and multi-faceted discussion, arguing against one-sided viewpoints and finger-pointing. They believe that attributing blame to only certain groups is simplistic and hides the deeper and more complex issues behind it. They suggest that all parties should look inward and address their own flaws and misdeeds before accusing others of their problems, and that there needs to be honest discussion and accountability to prevent future harms. The speaker calls for truth, self-reflection, and unbiased discussions to truly understand the history and to foster better relationships between communities.

    India’s Partition: Legacy of Violence and Division

    Okay, here is a timeline and cast of characters based on the provided text. It’s important to note that this text is a transcript of a conversation, likely an interview or debate, and is not a formal historical account. The timeline is thus constructed from the events and periods discussed, which sometimes overlap and are not always presented in chronological order within the text.

    Timeline of Main Events and Periods

    • Pre-1926: Discussion of a long history of oppression in a specific unnamed location (likely India).
    • 1926: Mentioned as a year of a significant event related to the oppression.
    • Pre-1947: The text discusses the growing tension between communities and the push for a unified India. There is a desire to wash away hatred.
    • 1947: Partition of India and Pakistan. The text discusses the violence and trauma associated with this event, the creation of new borders, and displacement of populations. There is also mention of a debate about whether or not the partition was necessary.
    • 1947-1948: Immediately after partition, there is discussion about property ownership, the movement of people and the loss of lives. There are mentions of groups who were forcibly moved or pushed out of their homes. The text states that in 1948 Hyderabad was annexed by India.
    • Post-Partition (General): The text discusses the ongoing issues of communal tension, violence, and the treatment of minorities in both India and Pakistan. There is a discussion about the demographic shift of religious minorities in India and Pakistan.
    • 1971: Discussion of the events of the 1971 war and the separation of Bangladesh (formerly East Pakistan) from Pakistan, specifically the atrocities suffered by people during the war.
    • 1992: The text mentions the attack on the Babri Masjid as an event where Muslims in Hindu neighborhoods sought refuge.
    • Modern Era: The text touches on the rise of Islamic fundamentalist terrorism globally, mentioning groups such as the Taliban, Al-Qaeda, and the Tamil Tigers. There is also discussion of contemporary events and leaders and their relation to these events. There is mention of an instance in modern day where a Hindu leader is calling for weapons for protection of Muslims.
    • Ongoing: There are continuous discussions about the fairness of democracy, international community, oppression, the treatment of minorities, and the overall nature of the conflict discussed. The text indicates these are still pertinent contemporary concerns.

    Cast of Characters

    • Mahatma Gandhi: A prominent leader of the Indian independence movement, advocating for non-violent resistance. The text mentions his philosophy of offering oneself as the first target to prevent violence, and has been discussed with skepticism. Some in the text are discussed as “Gandhi Worshippers.”
    • Nathuram Godse: Known for assassinating Mahatma Gandhi. His views are contrasted with those of Gandhi and his followers. The text mentions people who believe in Godse’s ideology.
    • Ganga Prasad: Referred to as a criminal, he is used to make a point about moral equivalence and how criminals are not confined to any one community.
    • Maulvi Yunus and Rabbani: Mentioned as examples of figures involved in groups that were armed and supported by external sources, particularly during the Soviet invasion of Afghanistan.
    • Sheikh Mujibur Rahman: Leader in East Pakistan (later Bangladesh) who was involved in the independence movement of Bangladesh.
    • General Nirad: Mentioned in the context of the atrocities committed during the 1971 war.
    • Mandal Sahab: A law minister in Pakistan post-partition, who is described as having had a painful experience during the partition and has written letters documenting it.
    • Abdul Kalam: Mentioned as an example of a Muslim who became President of India, used to make a point about Indian pluralism.
    • Mustafa: Mentioned as a name representing the large population of Muslims who lived in India.
    • Mastan Khan: Mentioned as a cloth merchant in a very large state who was affected by the military actions that led to it’s annexation.
    • “Our Sir”: A respected figure who gave a talk at an unknown time that is still considered relevant to present events.
    • Afzal: Person referred to as having celebrated the “city Holi” which is separate from the traditional celebration of the holiday.

    Important Notes:

    • Perspective: The text is a highly opinionated and subjective account of events. It represents one perspective, and it’s important to recognize that other viewpoints exist.
    • Ambiguity: Many details are missing, particularly specific places, dates, and names of groups or individuals. The context relies on a shared understanding of history, which may not be universal.
    • Focus on Partition: A major emphasis of the discussion is on the partition of India and its consequences. There is much discussion about blame and responsibility, focusing on the historical impact and modern-day consequences.
    • Communal Tension: A significant theme is the ongoing communal tension and violence between different religious groups, particularly Hindus and Muslims, with the text exploring the causes, effects, and possible solutions to the ongoing conflict.
    • Use of Analogies and Examples: The speaker frequently uses examples and analogies from historical and contemporary events to make points, sometimes drawing parallels between seemingly unrelated situations.

    Let me know if you have any more questions or need additional clarification.

    Partition’s Enduring Pain

    The sources discuss the pain of Partition from multiple perspectives, highlighting the violence, displacement, and lasting impact on individuals and communities.

    Experiences of Displacement and Loss:

    • Many people were forced to leave their homes and lost their properties during Partition [1, 2]. In Delhi, for example, 85% of the properties belonged to Hindus, and they were later snatched away [1]. Similarly, on both sides of the border, people were forced to flee and abandon their homes [2].
    • The text notes that those who reached Pakistan in 1947 were still crying, and those who are still crying should be seen [3]. The pain of Partition is a long-lasting wound that continues to affect generations [3].
    • The text also mentions the people in the East who were beaten and forced from their homes [4].

    Violence and Atrocities:

    • The sources reference looting, killings, and crimes that occurred during Partition [4]. There was “bloody destruction” [3] and oppression [5].
    • The text mentions a lot of atrocities committed in 1971 [6], and that people were crying for freedom [6].
    • The sources recount how the British used weapons and caused pain in the Spanish war, and says that anyone who causes pain or harm is a terrorist [7].
    • The text also points out that the British administration was responsible for a criminal system during Partition [4].

    Communal Hatred and Division:

    • The text says that both Hindus and Muslims suffered losses due to Partition [7]. It mentions that the country was divided and hatred was spread, leading to the killing of people [7].
    • The text argues that India was divided due to hatred and people who were living in neighborhoods, where there was no crime, were tagged as criminals [4].
    • The text highlights how easily people point fingers at bearded people or pandits [3], and the dangers of communalizing violence [7].
    • The text also mentions that some people believe that the mistakes of Muslims in India were allowed by their own people and that this is now acting as a trap [8].

    Ongoing Consequences:

    • The text states that the issues of partition continue to cause conflict [4], and that even now, people are divided amongst themselves [4].
    • The source mentions that Muslims did not get their status in India, nor in Bangladesh, and that Muslims were harassed in these three places [7].
    • It also notes that people still remember the forced marriages and conversions of girls [9].

    Multiple perspectives on the causes and effects of Partition are presented in the text:

    • Some believe that the partition was unnecessary and there could have been another way [4].
    • Others focus on the role of the British in dividing the country and leaving it for the people to keep arguing [4].
    • The text also highlights the different viewpoints about Gandhi and whether his approach was the right one [5].
    • The text mentions that some people believe that Muslims have only adopted one mission, to grow a state, and that the situation for Muslims in India is difficult [1].
    • The source notes that the population of Hindus in Pakistan has not decreased, and that there are many communities where the population is less [2].

    Overall, the sources emphasize the deep and lasting pain caused by the Partition of India, which included displacement, violence, communal hatred, and the ongoing consequences that are still being felt today [2, 6, 7].

    Hindu-Muslim Relations in India: A Legacy of Partition

    The sources provide a complex and multifaceted view of Hindu-Muslim relations, particularly in the context of the Partition of India, and its aftermath. Here’s a breakdown of the key aspects:

    Historical Tensions and the Partition:

    • The sources suggest that the Partition was a major turning point, exacerbating existing tensions [1, 2]. The division of the country led to immense suffering, with both Hindus and Muslims experiencing displacement, violence, and loss [1, 2].
    • The text mentions that hatred was spread against both communities [2], and that people were killed. It also says that the country was divided due to hatred, and people who were living in neighborhoods, where there was no crime, were tagged as criminals [3].
    • The text states that the British administration was responsible for a criminal system during Partition [3], and that they left the country so that the people could keep arguing [3].

    Differing Perspectives on Responsibility and Blame:

    • The sources reveal different viewpoints on who was responsible for the violence and the division [1, 4]. Some feel that Muslims were responsible for their own fate and the subsequent violence, while others point to the role of the British in creating the conditions for conflict [1, 2].
    • Some believe that the mistakes of Muslims in India were allowed by their own people, and that this is now acting as a trap [5]. There are also those who blame the political leadership at the time for not finding a better solution [3].
    • The sources describe differing views on the legacy of Mahatma Gandhi, with some viewing him as a positive force, and others criticizing his approach [1]. The text mentions Gandhi worshippers and those who believe in Nathuram Godse [1].

    Experiences of Muslims in India and Pakistan

    • The sources indicate that Muslims in both India and Pakistan have faced challenges. In India, some feel that Muslims have not achieved their full potential and that they have faced discrimination. They note that Muslims did not get their status in India, nor in Bangladesh, and that Muslims were harassed in these three places [2].
    • In Pakistan, the sources claim that minorities have been marginalized, with a significant decrease in their population after the partition. The source notes that the population of Hindus in Pakistan has not decreased, and that there are many communities where the population is less [6, 7].
    • The sources also note that Muslims are often viewed with suspicion and are easily targeted [4], with people pointing fingers at bearded people or pandits [4].

    Ongoing Issues and Concerns:

    • The text highlights that the issues stemming from the partition continue to cause conflict [2]. It also suggests that people are still divided amongst themselves, with continuing communal tensions [3].
    • The sources mention that there are concerns about the safety and security of Muslims, with examples of violence and displacement [8]. It is noted that even after the partition, people remember forced marriages and conversions of girls [7].
    • The text discusses the difficulties of navigating a diverse society, where people with different religious beliefs must coexist [2, 8].

    Points of Unity and Shared Experiences:

    • Despite the tensions, there are also calls for unity and understanding. The text emphasizes that the people should be ashamed that they are tagging their brothers as criminals and not looking out for them. [3].
    • It is suggested that Muslims and Hindus share common concerns about wages, housing, and security [8].
    • The sources also show that people from both communities have been affected by displacement and violence [7].

    In conclusion, the sources paint a complex picture of Hindu-Muslim relations characterized by historical grievances, ongoing tensions, and shared challenges. The legacy of Partition continues to impact the relationship between these communities, highlighting the need for reconciliation and understanding.

    Religious Violence in India: Partition and Beyond

    The sources discuss religious violence primarily in the context of the Partition of India and its aftermath, revealing a complex interplay of historical tensions, political actions, and communal hatred.

    Key Aspects of Religious Violence:

    • Partition as a Catalyst: The sources identify the Partition as a major event that triggered widespread religious violence [1, 2]. This violence included displacement, looting, killings, and general destruction, impacting both Hindus and Muslims [1-3].
    • The text notes that “bloody destruction” occurred and that people were oppressed [2, 4].
    • It suggests that the division of the country led to the killing of people and the spread of hatred [3].
    • People were forced to leave their homes and lost their properties during Partition, and many are still suffering the consequences [2-4].
    • Communal Hatred and Targeting: The sources highlight the role of communal hatred in fueling the violence.
    • The text says that both Hindus and Muslims suffered losses due to Partition, that the country was divided, and that hatred was spread [3].
    • It also notes how easily people point fingers at “bearded people” or “pandits” [4], indicating the dangers of communalizing violence.
    • The text states that people were tagged as criminals in their own neighborhoods [2], which indicates the spread of suspicion and distrust within communities.
    • Multiple Perspectives on Blame: The sources present diverse views regarding responsibility for the violence.
    • Some believe that Muslims were responsible for their own fate and the subsequent violence [1, 3].
    • Others blame the British for dividing the country and setting the stage for conflict [1, 2, 4].
    • Some suggest that the mistakes of Muslims in India were allowed by their own people [5].
    • Specific Instances and Examples: The sources mention specific instances of violence.
    • The text talks about the violence in 1971, where many people suffered atrocities [6].
    • The text refers to the Babri Masjid attack in 1992, and how Muslims in Hindu neighborhoods had to seek protection and make payments to survive [6].
    • The sources also recount how the British used weapons in the Spanish war and caused pain [3].
    • Ongoing Consequences and Concerns: The sources emphasize that the effects of religious violence persist.
    • The text says that the issues of partition continue to cause conflict, and that people remain divided amongst themselves [2, 3].
    • It also points out that Muslims did not get their status in India, nor in Bangladesh [3].
    • It suggests that forced marriages and conversions of girls are still remembered, highlighting a continuation of religiously motivated violence [7, 8].
    • Terrorism and Extremism: The text touches on the topic of terrorism and extremism, noting that they are not limited to any one group or religion [4].
    • It states that the maximum number of suicide attacks have been done in the Tamil Tiger group [4].
    • The text claims that the British formed groups like the Mujahideen, and that there was fighting and quarreling [4].
    • It also notes that some terrorist groups are labeled as criminals such as the Taliban and Al-Qaeda [1].
    • Displacement: The text discusses displacement of individuals and communities across different places and times, due to religiously motivated violence.
    • Hindus in Delhi were displaced from their properties [9].
    • In Pakistan, minorities faced displacement and population decrease [9].
    • Hindus in Haryana held a meeting to say give them weapons to use against Muslims [5].

    In conclusion, the sources portray religious violence as a complex issue with deep historical roots, exacerbated by political decisions and communal hatred. The violence is not limited to any one side or religion, and its impact continues to be felt in the present day. The sources emphasize the lasting pain and ongoing consequences of this violence and the need for reconciliation and understanding.

    India-Pakistan Relations: Partition’s Enduring Legacy

    The sources provide a detailed perspective on India-Pakistan relations, particularly in the context of the Partition and its lingering effects, highlighting a complex mix of historical grievances, ongoing conflicts, and some shared experiences.

    Historical Context and the Partition:

    • The Partition of India in 1947 is presented as a foundational event that significantly shaped the relationship between the two countries [1, 2]. The text indicates that the division led to widespread violence, displacement, and communal hatred, leaving lasting scars on both sides [1-3].
    • The sources suggest that the British are partly to blame for creating a system that led to conflict, and for leaving the two countries to argue with one another [1, 2].
    • The text argues that the country was divided due to hatred, and that people who were living in neighborhoods where there was no crime were tagged as criminals [2, 3].
    • The text mentions that the issues of partition continue to cause conflict and that people are still divided amongst themselves [2, 3].
    • The sources note that Muslims did not get their status in India nor in Bangladesh and were harassed in those three places, and that the issues stemming from partition are still creating conflict [3].

    Conflicting Perspectives and Accusations:

    • The sources reveal that there are differing perspectives on who was responsible for the violence and division. Some believe that Muslims were responsible for their own fate, while others point to the role of the British in creating the conditions for conflict [1-3].
    • The text mentions that some people believe that the mistakes of Muslims in India were allowed by their own people, and that this is now acting as a trap [3].
    • The sources also indicate that there are accusations and counter-accusations between the two countries. For example, the text claims that Hindus in Pakistan have not decreased in population, while also stating that minorities in Pakistan have been marginalized [3-5]. The text also describes the displacement of Hindus from their properties in Delhi [4].
    • The text also mentions that there are those who point fingers at bearded people or pandits, as a form of communal violence [3, 6].

    Ongoing Issues and Tensions:

    • The text points out that the legacy of Partition continues to fuel tensions and that the issues surrounding the division of the country have never been resolved [1-3].
    • The sources suggest that there are ongoing concerns about the treatment of minorities in both countries, with each side claiming that the other is persecuting its minority populations [3-5]. The text specifically mentions that the Muslim population in Pakistan has greatly decreased since partition [4].
    • The text also discusses the situation in Kashmir and how the removal of Article 370 led to people being kept as prisoners [2].
    • The sources reveal that the violence in 1971 is still remembered, and that there were atrocities committed at this time [7].

    Points of Convergence and Shared Experiences:

    • Despite the tensions, there are some instances of shared experiences. The text mentions that people in both India and Pakistan experienced displacement and violence [1-3].
    • The sources also suggest that the people in both countries have similar basic needs and concerns [7].
    • The text also provides examples of people from both countries who have achieved success in various fields, like Gopi Chand Narang and Gulzar [5].
    • The text suggests that leaders in both countries need to come together to address shared concerns and move forward [2, 6].

    Additional points:

    • The text mentions that the population of Muslims in India has grown significantly since partition [3, 4].
    • The sources indicate that both countries have faced internal conflicts, and that violence and oppression is not specific to one country [1, 2, 6].
    • The text notes that the British used weapons and caused pain in the Spanish war and that anyone who causes pain or harm is a terrorist [3].

    In conclusion, the sources paint a picture of India-Pakistan relations that are deeply affected by the historical trauma of Partition, characterized by ongoing tensions, mutual accusations, and a need for addressing long-standing grievances. Despite the conflicts, there are also suggestions of shared experiences and common concerns that could potentially pave the way for reconciliation and understanding. The sources emphasize the lasting pain and the complex nature of this relationship, which continues to be shaped by its past.

    Minority Rights in India and Pakistan

    The sources discuss minority rights primarily in the context of the treatment of religious minorities in India and Pakistan, revealing significant concerns and challenges related to their status and safety [1-8].

    Key aspects of minority rights discussed in the sources:

    • Discrimination and Marginalization: The sources indicate that religious minorities in both India and Pakistan face discrimination and marginalization [7].
    • In Pakistan, there are claims that minorities face significant problems, and the text notes that humanity does not start with them in Pakistan [7].
    • The text also notes that the minority population in Pakistan has greatly decreased since the partition [7].
    • In India, Muslims are mentioned as facing discrimination, with some suggesting that they have not achieved their full status. [4, 6]
    • There are claims that it is easy to point fingers at “bearded people or pandits”, which indicates the dangers of communalizing violence [2].
    • Population Changes and Displacement: The sources discuss the change in minority populations since the Partition [7].
    • In Pakistan, the minority population has significantly decreased since the partition [7].
    • The text states that at the time of Partition, the minority population in Pakistan was 22-25%, but now is less than 3% [7].
    • In India, the Muslim population has increased from 9% to 15% since partition [7].
    • The displacement of Hindus from their properties in Delhi is also mentioned [7].
    • The text also notes that many people were driven out of Pakistan during the partition [7].
    • The sources state that Hindus in Pakistan are leaving their homes [8].
    • The text notes that people were forced to leave their homes and lost their properties during partition and many are still suffering the consequences [1].
    • Violence and Security: The sources highlight instances of violence against minorities [4, 6, 8].
    • The text mentions the Babri Masjid attack in 1992, where Muslims had to seek protection and pay for their safety in Hindu neighborhoods [6].
    • The text mentions forced marriages and conversions of girls, which highlights the continuation of religiously motivated violence [8].
    • The sources also recount the violence in 1971 where many people suffered atrocities [2, 6].
    • The text also notes that there are ongoing concerns about the treatment of minorities in both countries [2, 4, 6, 7].
    • The sources also highlight that people in both countries have experienced displacement and violence [1-3, 8].
    • Political Representation and Status: The sources also discuss the political status of minorities [3-8].
    • There is a mention that Muslims in India have a significant population, and questions if their political power is aligned with their numbers [5].
    • The sources claim that the Muslims did not get their status in India nor in Bangladesh [4].
    • The sources mention that in Kashmir, people were kept as prisoners after the removal of Article 370 [3].
    • Shared Concerns: Despite the conflicts, the sources also suggest some shared concerns [3, 6].
    • The text states that people in both countries have similar basic needs, such as wages, housing and security [6].
    • The text mentions examples of people from both countries who have achieved success in various fields, like Gopi Chand Narang and Gulzar [8].
    • Complexities of Identity and Belonging: The sources note that in both India and Pakistan, minority groups can experience a struggle with their sense of belonging [2-8].
    • The text gives an example of an individual who was considered Pakistan Nawaz because of their name, even though they never raised a flag of Pakistan [9].
    • The text also notes that people were tagged as criminals in their own neighborhoods, which indicates a spread of distrust [3].
    • Historical Responsibility: The sources suggest different parties may have historical responsibility for the current situation, including the British [1-4, 6].
    • Some believe that the British are to blame for creating a system that led to conflict [3].
    • Some believe that Muslims were responsible for their own fate [2].
    • Others suggest that the mistakes of Muslims in India were allowed by their own people [1, 5].

    In conclusion, the sources depict a complex and challenging situation for minority rights in both India and Pakistan. There are clear instances of discrimination, displacement, and violence, along with a lack of security and equal status. The sources suggest that historical events and political decisions have contributed to these ongoing problems, and that these issues continue to affect minority groups in the present day.

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Al Riyadh Newspaper, March 9, 2025 Social, Economic, and International Affairs

    Al Riyadh Newspaper, March 9, 2025 Social, Economic, and International Affairs

    These articles from “20709.pdf” primarily cover Saudi Arabian news and perspectives, featuring the national philanthropy platform “Ehsan” and its significant charitable impact. Another key theme is the growing role and achievements of Saudi women across various sectors, highlighted by their increasing participation in the workforce and leadership positions. The publication also reports on regional and international political developments, including Saudi Arabia’s stance on global issues and its involvement in diplomatic efforts. Additionally, the sources discuss economic trends, cultural events, and social initiatives within the Kingdom.

    Study Guide: Analysis of News Articles (March 9, 2025)

    I. Quiz (Short Answer)

    1. What were the two primary focuses of the meeting held in Jeddah, and on which page of the source material can details about this meeting be found?
    2. According to the source, what is the total amount of donations received by the “Ihsan” platform for charitable work since its inception in 2021, and what principle reflects the community’s support for these donations?
    3. How does the Kingdom of Saudi Arabia support the increased participation of women in the workforce, referencing the goals of Vision 2030?
    4. What are the key features of the sound system implemented in the Grand Mosque in Mecca, and what is its purpose in serving worshippers?
    5. What was the main goal of the “Walk with Health” campaign, and how many steps did participants collectively record within the first five days of Ramadan?
    6. In what fields have Saudi women become a significant force, as highlighted in the article celebrating their empowerment and achievements? Provide at least two examples.
    7. What is the significance of the Princess Reema bint Bandar Al Saud’s role mentioned in the article regarding Saudi women’s achievements?
    8. What was the main topic discussed at the meeting of the Council of Foreign Ministers of the Organization of Islamic Cooperation held in Jeddah?
    9. According to the article on “The British Man Who Preserved America’s Legacy,” what was the unusual aspect of James Smithson’s will, and what institution eventually resulted from it?
    10. What is the central theme of the news piece titled “Brain Rot,” and what are some of the contributing factors and potential consequences mentioned?

    II. Quiz Answer Key

    1. The meeting held in Jeddah focused on discussing bilateral relations and the latest developments in the region, as well as the efforts being made regarding these developments. Details can be found on page 8.
    2. The total amount of donations received by the “Ihsan” platform since 2021 has exceeded 10 billion Riyals. This reflects the spirit of social cohesion demonstrated by individuals in the community.
    3. The Kingdom supports increased female participation through strategic goals within Vision 2030, focusing on guaranteeing women’s rights and empowerment in the labor market, education, and health sectors. This aligns with the fifth Sustainable Development Goal.
    4. The sound system in the Grand Mosque uses the latest digital audio technology and Dante Audio Network for clear, uninterrupted sound. Its purpose is to ensure equal sound distribution to all worshippers throughout the mosque, including the courtyards and different levels.
    5. The main goal of the “Walk with Health” campaign was to promote a healthy lifestyle during the month of Ramadan. Participants collectively recorded over two billion steps within the first five days.
    6. Saudi women have become a significant force in fields such as medicine, engineering, economics, administration, and technology. They have achieved unprecedented successes, becoming a source of pride for their nation.
    7. Princess Reema bint Bandar Al Saud is highlighted as a prominent female figure who has contributed to shaping a new image of Saudi women both locally and internationally, notably through her appointment as the first Saudi female ambassador to the United States.
    8. The main topic discussed at the OIC Council of Foreign Ministers meeting in Jeddah was the Israeli aggression against the Palestinian people and attempts to displace them from their land.
    9. The unusual aspect of James Smithson’s will was that he bequeathed his entire fortune to the United States government to establish an institution for the “increase and diffusion of knowledge among men,” despite having no direct connection to the country. This led to the founding of the Smithsonian Institution.
    10. The central theme of “Brain Rot” is the potential negative impact of excessive and uncontrolled social media use on cognitive functions. Contributing factors include inactivity and lack of physical movement, and potential consequences involve difficulties in decision-making, problem-solving, focus, and memory.

    III. Essay Format Questions

    1. Analyze the various initiatives and campaigns highlighted in the news articles that demonstrate the Kingdom of Saudi Arabia’s commitment to social welfare and development. Discuss the objectives and potential impact of these efforts.
    2. Discuss the significance of the increasing participation and empowerment of women in Saudi Arabia, as portrayed in the provided news sources. How does this align with the goals of Vision 2030, and what are some of the key areas where women are making notable contributions?
    3. Evaluate the role of international cooperation and diplomacy, as evidenced by the meeting of the OIC foreign ministers and other mentions of global engagement, in addressing regional and international issues discussed in the news articles.
    4. Critically examine the potential societal and individual impacts of the trends and issues highlighted in the articles, such as the growth of charitable giving through platforms like “Ihsan” and the concerns raised about excessive social media use in “Brain Rot.”
    5. Compare and contrast the different areas of development and change highlighted in the articles, such as social empowerment, technological advancements (e.g., the Grand Mosque’s sound system, digital currency initiatives), and economic activities (e.g., mergers and acquisitions), in shaping the Kingdom of Saudi Arabia and its global engagement.

    IV. Glossary of Key Terms

    • Ihsan: (From the Arabic text) A platform for charitable work in Saudi Arabia.
    • Vision 2030: The Kingdom of Saudi Arabia’s ambitious plan for economic and social reform.
    • WAS (وكالة الأنباء السعودية): (From the Arabic text, implied by datelines) Saudi Press Agency.
    • LEED (Leadership in Energy & Environmental Design): A globally recognized green building certification system.
    • OIC (Organization of Islamic Cooperation): An international organization founded in 1969 consisting of 57 member states, with a collective voice of the Muslim world.
    • Digital Audio: Technology that uses digital signals to transmit and process sound, often resulting in higher fidelity and less noise.
    • Dante Audio Network: A specific network protocol that allows for the transmission of high-quality audio over a digital network with low latency.
    • Brain Rot: (As used in the article) A colloquial term referring to the potential decline in cognitive abilities due to excessive and uncontrolled use of social media, often associated with inactivity.
    • Smithsonian Institution: A U.S. institution created by funds from James Smithson’s will, dedicated to the “increase and diffusion of knowledge.”
    • Fatwa: (Not explicitly in the text but relevant to religious contexts mentioned) A non-binding legal opinion or ruling issued by a mufti or religious scholar on a point of Islamic law.

    Saudi Arabia: Key Themes and Developments

    Based on the provided excerpts, here is a detailed briefing document reviewing the main themes and most important ideas or facts:

    Briefing Document

    Date: October 26, 2023 (Based on the issue date of some articles) Subject: Review of Key Themes and Information from Provided Sources

    This briefing document summarizes the main themes, important ideas, and key facts identified across the provided Arabic language news articles and excerpts. The sources cover a diverse range of topics, reflecting current events and ongoing initiatives in Saudi Arabia.

    I. Philanthropic Campaigns and Social Responsibility:

    • “Joud Regions Campaign” and “Ehsan” Platform: A significant theme revolves around organized philanthropy, exemplified by the “Joud Regions Campaign” and the national charitable work platform “Ehsan.”
    • The “Joud Regions Campaign” aims to integrate the objectives of charitable work with national goals under the leadership of King Salman and Crown Prince Mohammed bin Salman. It operates both domestically to benefit regions and internationally for relief efforts, driven by values of giving, quality, and generosity established by King Abdulaziz.
    • The “Ehsan” platform serves as a national portal for charitable donations and has garnered substantial support from the leadership.” (Translation: “The Custodian of the Two Holy Mosques, King Salman bin Abdulaziz Al Saud, and His Royal Highness Crown Prince Mohammed bin Salman bin Abdulaziz Al Saud – may God protect them – have presented two generous donations to the fifth edition of the National Campaign for Charitable Work, through the ‘Ehsan’ platform, amounting to 40 million Riyals from the Custodian of the Two Holy Mosques and 30 million Riyals from His Highness the Crown Prince.”)
    • The total donations through the “Ehsan” platform since its inception in 2021 have exceeded 10 billion Riyals.
    • The platform is characterized by high levels of governance, transparency, reliability in advanced technical handling of donations, and ease of use.
    • The campaign and platform are particularly active during Ramadan, a time when charitable giving is emphasized in Islam.
    • The “Ehsan” platform operates under the supervision of the Saudi Authority for Data and Artificial Intelligence (“Sdaya”) and is overseen by a Sharia committee to ensure compliance with Islamic law.
    • “Sum Bi Saha” (Fast with Health) Campaign: This health awareness campaign launched by Al-Sahah Holding and its 20 health clusters during Ramadan encourages walking, with participants recording over two billion steps in the first five days. It also promotes regular health check-ups.

    II. Women’s Empowerment and Role in Development:

    • Increasing Participation in the Labor Market: A significant focus is placed on the increasing role and empowerment of Saudi women across various sectors.
    • The Kingdom emphasizes the importance of women in development by investing in their capabilities at local and international levels.
    • Vision 2030 includes a strategic objective to increase women’s participation in the labor market, guaranteeing their rights in health, education, and employment.
    • Efforts and legislation in recent years have contributed to notable progress in women’s empowerment in the job market, aligning with Vision 2030 targets.
    • The government’s commitment to women is evident in the attention and programs dedicated to their advancement..” (Translation: “The Kingdom of Saudi Arabia affirmed the importance of the Saudi woman’s role in development through investing in and developing her capabilities to activate her role at the local and international levels, based on its belief that the woman is an important element of society, and this is reinforced by the Saudi government’s dedication of a separate strategic goal in Vision 2030 to increase women’s participation in the labor market, guaranteeing their rights in health, education, and the labor market, to align with the fifth goal of the Sustainable Development Goals, and reflects the ambitious vision of the three basic pillars: an ambitious nation, a prosperous economy, and a vibrant society.”)
    • Achievements and Leadership: Saudi women are achieving unprecedented accomplishments and are becoming influential figures in fields like medicine, engineering, economics, administration, and technology.
    • The Kingdom celebrates the achievements of women on International Women’s Day (March 8th).
    • Examples of pioneering Saudi women are highlighted, including the first female pilot, the first Saudi female Formula E driver, the first Saudi female polar explorer, and women holding leadership positions in various sectors.
    • Princess Reema bint Bandar bin Sultan is recognized for her role in shaping a new image of Saudi women locally and internationally, notably as the first Saudi female ambassador to the United States. She has also contributed significantly to women’s empowerment in sports and other fields.
    • The increasing number of women in leadership roles in ministries, major companies, banks, investment funds, and academic institutions reflects their growing influence.

    III. Regional and International Affairs:

    • Saudi Arabia’s Efforts Regarding the Ukrainian Crisis: The Kingdom continues its efforts to find a lasting peaceful resolution to the Ukrainian crisis, hosting numerous related meetings.
    • Ministerial Council of the Organization of Islamic Cooperation (OIC) and Gaza: The OIC’s Council of Foreign Ministers adopted an Arab plan regarding Gaza.
    • The council firmly rejects any plans aimed at forcibly displacing the Palestinian people, considering it ethnic cleansing and a grave violation of international law.
    • The OIC condemns policies of starvation and the burning of land and crops in the Palestinian territories.
    • The council emphasizes the centrality of the Palestinian issue for the Islamic Ummah and reaffirms its support for the Palestinian people’s right to self-determination, independence, freedom, and sovereignty over their land, with East Jerusalem as its capital.
    • The OIC stresses the need for Israel, the occupying power, to implement a permanent and sustainable ceasefire in the Gaza Strip, facilitate the return of displaced persons, withdraw its forces, open all crossings, and ensure the delivery of humanitarian aid.
    • The council supports the formation of a Palestinian government under the umbrella of the State of Palestine and welcomes the decision of the government to form a committee of national competencies from the Gaza Strip for a transitional period.
    • The OIC holds Israel responsible for war crimes and genocide committed against the Palestinian people.
    • The council calls for international protection for the Palestinian people and supports the efforts of the international coalition to implement the two-state solution, led by Saudi Arabia.
    • Saudi Arabia and Iran Relations: While not a primary focus, the document mentions Iran’s strengthening defense cooperation with Moscow in the past year and the complexities surrounding the Iranian nuclear program and international sanctions. There is also a brief mention of potential direct communication between US presidents and Iranian leaders.
    • Lebanon and Hezbollah: The document notes the end of a ceasefire between Hezbollah and Israel in November and continued Israeli strikes within Lebanese territory.

    IV. Economic Developments and Investments:

    • Mergers and Acquisitions (M&A) in the MENA Region: The Kingdom and the UAE recorded 318 M&A deals worth $29.6 billion in 2024.
    • The broader MENA region witnessed significant M&A activity in 2024, driven by reforms and strategic investment efforts.
    • Cross-border deals were a major driver, constituting a significant portion of both the number and value of deals.
    • Key sectors targeted included insurance, asset management, real estate and hospitality, energy and utilities, and technology.
    • Saudi Arabia and UAE sovereign wealth funds continued to lead investment activity.
    • NEOM and Women in the Energy Sector: NEOM emphasizes the growing presence of women in its workforce, particularly in the green hydrogen sector. The company is actively working to empower women through training and leadership roles.
    • Potential Change in Saudi Riyal Currency Symbol: There is a discussion about the possibility of updating the Saudi Riyal’s currency symbol as part of broader economic and technological advancements aligned with Vision 2030, including the development of digital payment systems.

    V. Cultural and Educational Initiatives:

    • King Salman Global Academy for the Arabic Language Conference: The academy announced its fourth annual international conference to be held in Riyadh in October 2025, focusing on the lexicographical industry.
    • “Downtown Design Riyadh” Exhibition: The Architecture and Design Commission is preparing to launch a leading contemporary design exhibition, “Downtown Design Riyadh,” in May 2025. This event aims to showcase local and international design talent and foster collaboration.
    • Revitalization of Historical Mosques: The second phase of a project to develop historical mosques has been launched after the completion of the first phase, which involved the rehabilitation and restoration of 30 historical mosques in 10 regions. The project aims to restore the architectural authenticity of these mosques, highlight their cultural significance, and contribute to the Kingdom’s cultural dimension under Vision 2030.
    • Sound System Upgrade at the Grand Mosque (Al-Haram): The Grand Mosque in Mecca has been equipped with a state-of-the-art digital audio system to ensure clear sound throughout its courtyards and indoor spaces.
    • “Mawhiba” (Talent) Foundation and Scientific Participation: The Saudi national team for scientific projects saw an increase in participation in international competitions, reflecting a growing interest in scientific innovation among Saudi youth.
    • “Ensan” (Human) Charitable Association for Orphans Care: This association, under the leadership of HRH Prince Faisal bin Bandar bin Abdulaziz, Governor of Riyadh Region, focuses on improving the quality of life and empowering orphans in line with Vision 2030.
    • The Rise of Islamic Civilization and Modern Science: The excerpts briefly touch upon the significant contributions of Islamic civilization to the development of medicine and astronomy, laying the groundwork for modern scientific advancements.

    VI. Health and Wellness:

    • The “Sum Bi Saha” campaign highlights the focus on public health and encourages healthy lifestyles.
    • An article discusses the potential negative impacts of excessive and uncontrolled social media use, likening it to “brain rot” and linking it to difficulties in decision-making, problem-solving, focus, and memory. It suggests solutions such as mandatory exercise and regulatory measures.

    VII. Sports:

    • The Saudi national beach soccer team announced its final list for the Asian Cup in Thailand.
    • Real Madrid’s coach, Carlo Ancelotti, emphasized the need for caution in their upcoming match against Rayo Vallecano in the Spanish league.
    • Liverpool continues its strong performance in the English Premier League, while Manchester City faced another defeat.
    • Al-Taawoun defeated Damac, Al-Riyadh deepened the wounds of Al-Akhdoud, and Al-Ettifaq triumphed over Al-Aruba in Saudi league matches.
    • There is commentary on Al-Hilal’s performance and the need to address the depth of their substitute bench.

    VIII. Social Commentary:

    • An article titled “What is Absent Here Shines Elsewhere” uses the Quranic verse about the sun to reflect on the cyclical nature of opportunity and success.
    • The “Smile of Hospitality” section likely refers to social interactions and cultural norms, possibly related to Ramadan, although the specific content in the excerpt is limited.

    IX. Historical and Biographical Snippets:

    • The article “The Briton Who Preserved America’s Legacy” recounts the story of James Smithson, a British scientist who bequeathed his entire fortune to the United States to establish the Smithsonian Institution, despite never having visited the country.
    • The “Catch” section includes a dialogue about the representation of architecture in literature, mentioning various authors and novels.
    • A piece remembers Sheikh Abdullah Al-Husaini, a pioneer of girls’ education in the Al-Qassim region, highlighting his significant contributions and the challenges he overcame.

    X. Ramadan Specific Content:

    • Several articles mention Ramadan in the context of charitable campaigns, health initiatives, and the unique spiritual environment of the month that can facilitate positive habit change.

    XI. Tourism and Natural Wonders:

    • A brief mention of Niagara Falls highlights its popularity as a tourist destination.
    • The significance of Jabal Abi Qubais in Mecca is noted for its religious and geographical history.

    XII. Weather Report:

    • A short weather update mentions the highest rainfall recorded in the Arjaa region of Al-Dawadimi governorate and other areas in the Kingdom.

    This briefing document provides a comprehensive overview of the diverse topics covered in the provided sources, highlighting key initiatives, achievements, and ongoing developments within Saudi Arabia and its engagement with the wider world.

    Saudi Arabia: Social Initiatives and Progress

    Frequently Asked Questions

    1. What is the Ihsan platform and what are its primary goals?
    2. The Ihsan platform is a Saudi national platform for charitable work. Its primary goals include maximizing the impact of charitable work during the month of Ramadan and beyond, embodying the values of giving, generosity, and quality instilled by the Kingdom’s founders. It aims to channel donations efficiently and transparently to beneficiaries, fostering a sense of social solidarity within the community. The platform operates under the supervision of the Saudi Authority for Data and Artificial Intelligence (“Sdaya”) and a Sharia committee, ensuring governance, reliability, and adherence to Islamic law in the receipt and distribution of donations.
    3. How significant has the Ihsan platform been in terms of donations received and impact?
    4. Since its establishment in 2021, the Ihsan platform has received over 10 billion Saudi Riyals in donations. This substantial amount reflects the strong social cohesion and the public’s trust in the platform’s governance, transparency, and advanced technical handling of donations. The platform has successfully channeled these funds to various charitable causes, aligning with the goals of Saudi Vision 2030 to support the non-profit sector and enhance its societal and developmental contributions.
    5. What are the key aspects of Saudi Arabia’s efforts in empowering women, as highlighted in the sources?
    6. Saudi Arabia emphasizes the crucial role of women in development and aims to empower them by investing in their potential. This commitment is reflected in Vision 2030, which includes a strategic objective to increase women’s participation in the workforce while ensuring their rights in various fields like health, education, and the job market. The Kingdom has achieved significant progress in women’s empowerment through supportive legislation and initiatives, leading to a continuous rise in women’s participation in the labor market and their assumption of leadership roles across different sectors.
    7. Can you provide examples of Saudi women achieving significant milestones and breaking barriers?
    8. The sources highlight numerous Saudi women who have achieved remarkable feats. Examples include Yasmeen Al-Maimani, considered the first Saudi female pilot; Aseel Al-Barrak, the first Saudi female Formula racing driver; Mariam Fardous, a pioneering Saudi female diver holding advanced certifications; and Khulood Al-Subait, the first Saudi female lawyer to win a licensing dispute and a member of the Saudi Organization for Certified Public Accountants. Additionally, Princess Reema bint Bandar Al Saud is the first Saudi female ambassador to the United States, showcasing women’s growing prominence in leadership and international roles.
    9. What is the “Sam Bi Sahha” campaign, and what does it aim to achieve?
    10. The “Sam Bi Sahha” (Walk for Health) campaign, launched by the Health Holding Company and its 20 health clusters during Ramadan, aims to promote a culture of healthy living. It encourages citizens and residents to walk at least 8,000 steps daily. The campaign recorded over two billion steps taken by more than 223,000 participants within the first five days. It also offers a package of diverse health services and introduces various laboratory tests through primary healthcare centers, aligning with efforts to enhance public health.
    11. What are Saudi Arabia’s initiatives concerning historical mosques?
    12. Saudi Arabia has launched the second phase of a project to develop historical mosques, following the completion of the first phase which involved the rehabilitation and restoration of 30 historical mosques in 10 regions. The project’s strategy focuses on restoring the original architectural character of these mosques, highlighting their historical and urban significance to the Kingdom, strengthening the religious and cultural status of historical mosques, and showcasing the Kingdom’s cultural and historical dimensions, as emphasized by Vision 2030, which seeks to preserve and leverage the unique architectural heritage of these sites.
    13. What advancements have been made in the audio system of the Grand Mosque in Mecca?
    14. The Grand Mosque features a state-of-the-art digital audio system to ensure clear and pure sound throughout its premises, both indoors and outdoors. It utilizes the latest digital audio technologies and Dante Audio Network for high-quality sound transmission over a digital network without delay. Thousands of speakers are strategically distributed across the mosque, including the courtyards, the Sa’i area, and the Mataf area, with precise sound direction to ensure even coverage without interference. The system is managed by advanced central control rooms with backup systems to guarantee continuous broadcasting. The sound distribution is meticulously designed to prevent echoes and overlapping sound waves, with volume levels adjusted based on the density of worshippers to maintain clarity without being disruptive.
    15. What was the outcome of the Organization of Islamic Cooperation (OIC) Council of Foreign Ministers’ extraordinary session regarding Palestine?
    16. The OIC Council of Foreign Ministers, in an extraordinary session held in Jeddah, adopted the Arab Plan regarding Gaza. The resolution firmly rejects any plans aimed at forcibly displacing Palestinians, whether internally or externally, under any pretext, considering it ethnic cleansing and a severe violation of international law. The OIC also condemned the policies of starvation and the destruction of lands and property in the occupied Palestinian territories. It emphasized the centrality of the Palestinian cause for the Islamic Ummah, reaffirmed its steadfast support for the Palestinian people’s right to self-determination, independence, and sovereignty, and reiterated its commitment to a just and lasting peace based on relevant UN resolutions and the Arab Peace Initiative. The OIC also called for holding Israel accountable for its crimes and for providing international protection to the Palestinian people.

    Saudi Vision 2030: Transformation and Progress

    Saudi Vision 2030 is a significant topic discussed in the sources, particularly in relation to the empowerment of women and the development of various sectors within the Kingdom.

    Several sources emphasize the goal of increasing women’s participation in the workforce as a key objective of Vision 2030. The Kingdom has set a strategic target to increase women’s participation in the labor market, aiming for 40% by 2030, surpassing the initial target of 30%. This reflects the government’s keen interest in women’s empowerment and their pivotal role in national development. The sources highlight that the past few years have witnessed a continuous rise in the rates of women’s participation in the labor market. This increase is attributed to efforts and legislation enacted in line with Vision 2030. Initiatives and programs by the Ministry of Human Resources and Social Development have contributed to overcoming obstacles facing women in the job market. The empowerment of women is seen as a driving force towards a sustainable and comprehensive economy. Women are increasingly occupying leadership positions in various ministries, major companies, and financial institutions, reflecting their growing role in Saudi Arabia’s economic development.

    Beyond women’s empowerment, Vision 2030 encompasses broader goals for economic diversification and development. The “Down Town Design Riyadh” exhibition, for instance, aligns with the Kingdom’s Vision 2030 by aiming to meet the needs of the growing interior design market, driven by real estate and hospitality developments. Projects like NEOM, The Line, and Diriyah Gate are mentioned as examples of significant developments contributing to this growth. These initiatives suggest a focus on creating a thriving economy.

    The vision also aims for a vibrant society. The support for charitable work through platforms like “Ehsan” can be seen as contributing to this goal by promoting social cohesion and solidarity. While not explicitly linked to Vision 2030 in the immediate context of the “Ehsan” campaign, the broader emphasis on philanthropy aligns with the social development aspects of the vision.

    Furthermore, Vision 2030 has a cultural dimension, as evidenced by the focus on preserving the historical and urban characteristics of mosques while developing their design. The aim is to highlight the Kingdom’s historical depth and cultural significance. The announcement of the fourth annual international conference of the King Salman Global Academy for the Arabic Language also aligns with the cultural enrichment goals of Vision 2030. The empowerment of women in the cultural field by supporting artistic projects and initiatives that provide broader opportunities for creativity and influence is also part of this vision.

    Sustainability is another key pillar of Vision 2030, demonstrated by Dr. Sulaiman Al Habib Medical Group achieving LEED Gold certification for their commitment to environmental practices, which is in line with the Kingdom’s 2030 vision in supporting environmental sustainability and the development process. Additionally, Saudi women are participating in the creation of the world’s largest green hydrogen production plant in NEOM, signifying their role in sustainable energy initiatives that are part of Vision 2030.

    Overall, Saudi Vision 2030 is presented in the sources as a comprehensive framework for the Kingdom’s future, encompassing economic diversification, social progress with a strong emphasis on women’s empowerment, cultural enrichment, and environmental sustainability. The initiatives and achievements highlighted in the articles are portrayed as steps towards realizing the ambitious goals set by this vision.

    Ehsan: Saudi Arabia’s National Platform for Charitable Work

    The Ehsan platform is a national platform for charitable work in Saudi Arabia. Its establishment was approved by the Custodian of the Two Holy Mosques, King Salman bin Abdulaziz Al Saud. This initiative is in its fifth edition as of the publication date of the sources.

    Key features and objectives of the Ehsan platform, as described in the sources, include:

    • Facilitating Charitable Giving: Ehsan provides an opportunity for all members of society to participate in charitable deeds by making donations. This is emphasized as a way to foster a sense of community, social solidarity, and uplift the Saudi society.
    • Digital Infrastructure: The platform operates electronically through its application (Ehsan.sa) and website. It also utilizes SMS and a unified call center (8001247000) to receive contributions. This digital approach aims to enable the charitable sector digitally and streamline the donation process.
    • Transparency and Governance: Ehsan follows a methodical approach to ensure donations reach eligible recipients. It adheres to strict governance programs to enhance the credibility and transparency of charitable work, aiming to avoid errors or arbitrary decisions. The platform operates according to high governance standards.
    • Oversight and Collaboration: The Saudi Authority for Data and Artificial Intelligence (SDAIA) was responsible for creating the Ehsan platform. It is overseen by a higher committee comprising 13 government agencies, indicating a collaborative effort to ensure its effectiveness.
    • Alignment with National Goals: The goals of the Ehsan platform complement the objectives of the national campaign for charitable work and the King Salman Center for Relief and Humanitarian Aid. It is seen as an extension of the generous care for charitable work by the Kingdom’s leadership.
    • Significant Impact: The platform has already demonstrated a significant impact, with the King and the Crown Prince donating 70 million Riyals to the current “Charitable Work Campaign”, and total donations reaching 700 million Riyals with ongoing increases. Ehsan aims to deliver aid to beneficiaries as quickly as possible.
    • Welcoming Diverse Contributions: Ehsan welcomes contributions from individuals, companies, banks, philanthropists, and donors through various channels.

    In our previous discussion, we noted that Saudi Vision 2030 aims for a vibrant society and emphasizes social progress. The Ehsan platform aligns with these aspects by promoting social cohesion, encouraging responsible citizenship through charitable giving, and ensuring that aid reaches those in need efficiently and transparently. While not explicitly stated as a direct initiative of Vision 2030 in the provided sources, its objectives and the support it receives from the highest levels of government suggest its role in achieving the broader social development goals of the Kingdom.

    Saudi Arabia: National Charity Initiatives and the Ehsan Platform

    The sources provide several insights into charity work in Saudi Arabia, highlighting it as a significant national endeavor supported by the highest levels of government and involving broad societal participation.

    One of the central mechanisms for charity work discussed is the Ehsan platform. It is a national platform for charitable work in its fifth edition, having been approved by the Custodian of the Two Holy Mosques, King Salman bin Abdulaziz Al Saud. Ehsan aims to provide an opportunity for all members of society to participate in charitable deeds by making donations. This fosters a sense of community and social solidarity, ultimately uplifting Saudi society [1, You]. The platform operates electronically through its application (Ehsan.sa) and website, as well as via SMS and a unified call center (8001247000) to facilitate contributions. This digital infrastructure is intended to enable the charitable sector digitally and streamline the donation process [You].

    A key emphasis of the Ehsan platform is transparency and governance. It follows a methodical approach to ensure donations reach eligible recipients and adheres to strict governance programs to enhance the credibility and transparency of charitable work. This aims to prevent errors or arbitrary decisions, ensuring the platform operates according to high governance standards [1, You]. The Saudi Authority for Data and Artificial Intelligence (SDAIA) created the Ehsan platform, and it is overseen by a higher committee comprising 13 government agencies, indicating a collaborative effort to ensure its effectiveness [You].

    The Joud Regions Campaign (“جود حملة المناطق”) is another significant initiative for charitable work mentioned in the sources. This campaign is in its second edition and is endorsed by the Emirs of the regions across the Kingdom. It shares the goal of supporting the needy by providing suitable housing. The fifth edition of the national campaign for charitable work, which the Ehsan platform supports, also aims to support social, housing, health, educational, and food projects.

    The goals of these charitable initiatives complement the objectives of the national campaign for charitable work and the King Salman Center for Relief and Humanitarian Aid. This highlights that charity work is seen as an extension of the generous care provided by the Kingdom’s leadership [2, You]. The Kingdom’s commitment to (methodical collection of their donations throughout the month of Ramadan, which the electronic platform “Ehsan” leads, ensuring the arrival of aid to those who deserve it, and helping them in that by following strict governance programs) underscores the organized and well-intentioned nature of these efforts.

    The values underpinning charity work in Saudi Arabia are those of giving, generosity, and benevolence (العطاء والجود والكرم), which were instilled by the founder, King Abdulaziz bin Abdulrahman, and have been upheld by the Kingdom’s leaders. These values drive the various charitable initiatives aimed at benefiting those in need.

    The Ehsan platform has already achieved a significant impact, with substantial donations made by the King and Crown Prince, and total donations reaching large sums [7, You]. The platform’s aim is to deliver aid to beneficiaries as quickly as possible. It also welcomes contributions from diverse sources, including individuals, companies, and philanthropists [7, You].

    In the context of our previous discussion on Saudi Vision 2030, while the sources don’t explicitly link the Ehsan platform or the Joud Regions Campaign as direct initiatives of the vision, their objectives align with the broader goal of a vibrant society [You]. These charitable endeavors promote social cohesion and solidarity and contribute to the overall well-being of the Kingdom’s population by supporting those in need [1, You].

    Saudi Women’s Empowerment and Vision 2030

    The sources and our conversation history provide significant information regarding women’s empowerment in Saudi Arabia, particularly in the context of Saudi Vision 2030.

    Our previous discussion highlighted that increasing women’s participation in the workforce is a key objective of Vision 2030, with a target of 40% by 2030 [You]. This reflects the government’s strong focus on women’s empowerment and their crucial role in national development [You]. The continuous increase in women’s participation rates is attributed to efforts and legislation aligned with Vision 2030, with initiatives from the Ministry of Human Resources and Social Development helping to overcome obstacles in the job market [You]. This empowerment is considered a driving force for a sustainable and comprehensive economy, with women increasingly holding leadership positions [You].

    Several sources further elaborate on these aspects:

    • Governmental Support and Initiatives: The Kingdom has undertaken various initiatives aimed at supporting women and enhancing their role in society as part of its governmental work and social responsibility. This includes initiatives specifically designed to support women’s leadership and empowerment, positioning Saudi Arabia as a model in international forums.
    • Creating Spaces for Expression and Achievement: Spaces have been created for Saudi women to express themselves, share their ideas about society, and gain opportunities to connect with a broader audience, thereby enhancing their presence. The media’s focus on supporting and empowering women is now a national strategic direction aimed at motivating future generations to achieve even greater accomplishments.
    • Progress in Diverse Fields: Saudi women are increasingly proving their presence in various fields, including culture, arts, and literature, contributing to a modern and advanced image of the Kingdom. Princess Reema bint Bandar is specifically mentioned for her role in shaping a new image of Saudi women both locally and internationally.
    • Addressing Challenges and Rights: The Kingdom is intensifying programs to raise awareness of women’s rights as guaranteed by Islamic law and is actively working to remove challenges facing them, guided by the principles of justice, equality, and human dignity.
    • Role in Future Projects: Major projects like NEOM recognize the importance of women’s empowerment, aiming to provide leadership opportunities for women in innovation and the development of new ideas. Women are also actively participating in sustainable energy initiatives, such as the creation of the world’s largest green hydrogen production plant in NEOM [You, 23].
    • Platform for Creators: Initiatives like the “Golden Pen Diwaniyah” provide a platform for creators, including women, to discuss their concerns, exchange experiences, and collaborate on projects.

    It is important to note that while the Ehsan platform and the Joud Regions Campaign are national initiatives for charitable work involving all members of society, including women, the sources do not specifically highlight them as direct women’s empowerment initiatives. However, women’s participation in such activities aligns with the broader concept of their active role in society.

    In conclusion, the sources and our previous discussion consistently portray women’s empowerment as a significant and actively pursued goal in Saudi Arabia, deeply embedded within the framework of Saudi Vision 2030. This empowerment is manifested through various governmental initiatives, increasing participation in the workforce and leadership roles, advancements in diverse fields, the safeguarding of their rights, and their involvement in major national and future-oriented projects.

    Football Leagues: Premier League, Saudi Arabia, and Asian Champions League

    The sources contain information about several football leagues and competitions, including the English Premier League, various Saudi Arabian leagues, and the Asian Champions League.

    Regarding the English Premier League, one article reports on a match where Liverpool defeated Southampton 3-1, with a double by Mohamed Salah, continuing their strong performance and approaching their first league title since 1990. Meanwhile, the article notes that the defending champions Manchester City lost 1-0 to Nottingham Forest, highlighting their inconsistent results. The article also mentions Arsenal as the second-place team, 16 points behind Liverpool, and Manchester United as another rival. Liverpool was initially challenged by Southampton but came back strong in the second half to secure the win.

    In the context of Saudi Arabian football, there are several reports:

    • One section briefly mentions Al-Ahli’s surprising 3-0 victory over Al-Hilal, followed by a draw with Al-Khaleej and a defeat by Al-Rayyan of Qatar. The absence of Kessie and an injury to Mendy are suggested as contributing factors to the draw against Al-Khaleej. There are also notes on individual player performances and incidents involving Tembaekti, Mitrovic, Majrashi, Al-Mousa, and Khaled Al-Ghannam.
    • Another report discusses a match in the Riyadh region between Al-Faisaly and Al-Zulfi, which is considered important for both teams as they aim to avoid relegation. The article mentions that their first-round encounter ended differently. It also briefly notes a match between Al-Bukayriyah and Al-Jandal, with Al-Bukayriyah winning their previous official encounters. Players from Al-Bukayriyah like Mario, Fernando, Nakhli, and Bin Rubeiaan are mentioned.
    • A report from Al-Ahsa covers a quarter-final match in the Saudi Arabia Clubs Championship for the third division, where Al-Qarah defeated Al-Qurayyat 4-1, overturning a previous loss. The goalscorers for Al-Qarah were Azzam Al-Khalifa and Qassem Al-Jassem, while Abdul-Ilah Al-Anzi scored for Al-Qurayyat.

    Regarding the Asian Champions League, there is extensive commentary focusing on the performance of Saudi clubs:

    • Al-Ahli is portrayed as being in good form and securing positive results, with key players like Harash, Kessie, Firmino, Danilo, and Mahrez contributing to their success in Asia.
    • Al-Hilal, despite having star players, is facing a downturn in performance, with their recent loss being described as unexpected. There is criticism of coach Jesus’s tactics, substitutions, and the team’s overall predictability, suggesting a potential “footballing collapse”.
    • Al-Nassr is seen as highly motivated to win the Asian title, especially for Cristiano Ronaldo, as it would be his first official title with the club.

    Finally, one opinion piece discusses a broader issue concerning Saudi football: the reliance on foreign coaches and the lack of development of national coaching talent. It questions the Saudi Arabian Football Federation’s role in preparing local coaches to lead top clubs.

    In addition to these leagues, there is a brief mention of the Spanish league where Real Madrid and Atletico Madrid are behind league leaders Barcelona.

    Our conversation history does not contain specific discussions about these football leagues or teams. Therefore, the information provided is solely based on the content of the current sources.

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Petticoat Junction 01×05 – The Courtship of Floyd Smoot

    Petticoat Junction 01×05 – The Courtship of Floyd Smoot

    The provided text appears to be a script or transcript from episodes of the television show Petticoat Junction. The excerpts highlight the comedic situations and romantic entanglements of the characters living near the Shady Rest Hotel and Hooterville. Recurring themes include the romantic struggles of train engineer Floyd, the struggles of the Hooterville football team, and the matchmaking efforts of Kate Bradley. One plotline revolves around Floyd’s pursuit of a mail-order sweetheart and the subsequent fallout. Another centers around the Hooterville coach trying various tactics to improve his football team’s performance. The lighthearted episodes revolve around the daily life of the characters and their relationships, with a blend of romance, comedy, and small-town charm.

    Petticoat Junction: A Study Guide

    Quiz: Short Answer Questions

    1. What is the name of the hotel run by Kate Bradley?
    2. What magazine did Floyd Smoot and Camille Tewksbury meet through?
    3. What is Uncle Joe’s idea for improving the Hooterville Hornets football team?
    4. What is Kate planning to serve for dinner that causes Floyd to initially avoid the Shady Rest?
    5. Why are the girls, Billie Jo and Bobbie Jo, so eager for Floyd to help with the dishes?
    6. What historical event are the girls struggling with for their homework?
    7. What is the name of the train that Floyd operates?
    8. What does Sam Drucker give Floyd after returning from his failed romantic encounter?
    9. In the song, what is the name of the place that they are rolling down the tracks to?
    10. What is the surprise that Uncle Joe has ready for the Elm City team?

    Quiz: Answer Key

    1. The hotel is called the Shady Rest. It is the central hub for characters to meet and interact.
    2. They met through the lonely hearts column of a magazine. This reveals the characters loneliness.
    3. Uncle Joe wants to use Betty Jo as a scat back. This highlights the small town wackiness of Hooterville.
    4. Kate is planning to serve pigs knuckles and sauerkraut. Floyd initially avoids it because he feels bad.
    5. They want him to explain about the birds and the bees. They are teasing him about being a father figure.
    6. The girls are struggling with the Mexican War. Floyd tries to help them with the homework questions.
    7. The train that Floyd operates is the Cannonball. It is an old steam engine with a regular schedule.
    8. He is in love with Camille Tewksbury who lives out of town. She has exchanged pictures with him.
    9. The song says that they are rolling down the tracks to the Junction. It refers to Petticoat Junction.
    10. The surprise that Uncle Joe has ready for the Elm City team is speed in the backfield. It is a reference to Betty Jo.

    Essay Questions

    1. Explore the theme of small-town life in Petticoat Junction. How does the show portray the characters, values, and relationships within this setting? What are the positive and negative aspects of this close-knit community as depicted in the episode?
    2. Analyze the character of Kate Bradley. What are her key personality traits, and how does she interact with the other characters in the show? What role does she play in maintaining the stability and harmony of the community?
    3. Discuss the portrayal of gender roles in Petticoat Junction. How do the female characters (Kate, Billie Jo, Bobbie Jo) conform to or challenge traditional gender expectations? How do the male characters (Floyd, Uncle Joe, Sam Drucker) view and interact with the women in their lives?
    4. Examine the theme of romance and relationships in Petticoat Junction. How are romantic relationships depicted in the episode? What are the sources of conflict and humor in these relationships? How do the characters navigate the challenges of love and commitment?
    5. Evaluate the comedic elements of Petticoat Junction. What types of humor are used in the episode (slapstick, wordplay, situational irony)? How effective is the humor in engaging the audience and conveying the show’s themes?

    Glossary of Key Terms

    • Hooterville: The fictional town where the football team is located.
    • Pixley: A neighboring town and rival to Hooterville.
    • Shady Rest Hotel: The hotel run by Kate Bradley, serving as a central location.
    • Cannonball: The name of the train Floyd operates.
    • Sam Drucker’s Store: The local general store, a frequent gathering place.
    • Camille Tewksbury: The woman Floyd is corresponding with through a lonely hearts column.
    • Scat Back: A fast-running player in football, Uncle Joe suggests Betty Jo to fill this role.
    • Lonely Hearts Column: The source of Floyd’s ill-fated romance, a place for people to write letters to each other and meet.
    • Petticoat Junction: The show’s title and the name of the area around the Shady Rest Hotel.

    Petticoat Junction: Small-Town Life, Romance, and the Cannonball

    Okay, here’s a briefing document summarizing the main themes and ideas from the provided excerpts of “Petticoat Junction” episodes.

    Briefing Document: Petticoat Junction Episode Excerpts

    Overview:

    These excerpts offer a glimpse into the folksy, small-town world of “Petticoat Junction.” The episodes revolve around the lives of the inhabitants of Hooterville, focusing on the antics at the Shady Rest Hotel run by Kate Bradley, the operations of the Cannonball train, and the general, often comedic, challenges and romantic entanglements faced by the characters. Common themes include: small town life, romance, football, and the humorous challenges of every day.

    Main Themes & Ideas:

    • Small-Town Life & Community: Hooterville is portrayed as a close-knit, if somewhat eccentric, community where everyone knows everyone else’s business. The characters frequently interact and rely on each other. The show emphasizes the importance of community, even when facing humorous setbacks.
    • Example: The entire town seems invested in Floyd’s love life and Kate’s potential marriage. When Kate is upset, the community quickly takes her side.
    • Romantic Entanglements & Relationships: Romance is a recurring theme, often played for comedic effect. The excerpts feature a variety of relationship scenarios, from Floyd’s mail-order bride misadventure to Kate’s potential marriage to Floyd. The focus is on the humorous challenges of finding and maintaining relationships.
    • Example: Floyd’s disastrous experience with Camille Tewksbury from the lonely hearts column provides humor and insight into the pitfalls of seeking love through unconventional means.
    • Example: Kate’s near-marriage to Floyd Smoot highlights the complexities of long-term relationships and the difficulty of change.
    • The Cannonball Train & Nostalgia: The Cannonball train is not just a mode of transportation; it’s a symbol of the show itself and, in some ways, a character in its own right. It represents the slow pace of life and a connection to a simpler past.
    • Quote: “Come ride the little train that is rolling down the tracks to the junction forget about your cares it is time to relax at the junction” – the show’s theme song encapsulates the feeling of nostalgia associated with the train.
    • Humor Through Eccentricity: Much of the humor comes from the quirky characters and their unusual situations. The show relies on slapstick, witty dialogue, and situational comedy to entertain.
    • Example: The recurring gag of Charlie rocking the train car, Uncle Joe’s schemes, and the generally inept Hooterville football team all contribute to the show’s comedic tone.
    • Football as a Metaphor for Life: The Hooterville Hornets’ constant losing streak becomes a running joke. While the team’s performance is pathetic, the show emphasizes the importance of effort, community spirit, and not giving up, even in the face of defeat.
    • Quote: “It’s how you play the game that counts” – While Sam Druckers says this sarcastically, it reflects the shows theme of the value of trying.
    • Gender Roles: The episodes offer glimpses into traditional gender roles, particularly for women, where their place was considered to be in the home. However, Kate challenges these stereotypes by running the Shady Rest Hotel.
    • Example: “yes a woman’s place is in the kitchen” – This dialogue highlights the traditional view of a woman’s role in the home.

    Key Characters & Relationships (as revealed in excerpts):

    • Kate Bradley: The matriarch of the Shady Rest Hotel. She is kind, resourceful, and a central figure in the community. She is the object of affection for many men, including Floyd Smoot.
    • Floyd Smoot: The Cannonball train engineer. He is portrayed as a somewhat hapless, though well-meaning, character who is unlucky in love.
    • Uncle Joe: A lovable schemer who often tries to come up with get-rich-quick plans, usually without success.
    • Betty Jo: One of Kate’s daughters.
    • Sam Drucker: The local storekeeper, also a voice of reason.
    • Charlie: The conductor on the Cannonball, prone to rocking the train car.

    Important Plot Points/Details:

    • Floyd’s Mail-Order Bride: Floyd’s attempt to find love through a lonely hearts column backfires when he receives an unflattering picture from his penpal, Camille Tewksbury which turns out to be his own picture.
    • Kate’s Potential Marriage to Floyd: Floyd proposes to Kate, leading to a period of humorous anxiety among the community and Kate herself. Ultimately, Kate chooses her independence and Floyd chooses his train, the Cannonball.
    • Hooterville Hornets Football Team: The Hooterville football team is terrible and always loses. They consider having a woman player, but that fails.

    Overall Tone:

    The excerpts convey a lighthearted, comedic tone. While the characters face challenges, the overall message is optimistic and emphasizes the importance of community, friendship, and finding humor in everyday life.

    Petticoat Junction: Shady Rest and Hooterville’s Charm

    Petticoat Junction: Frequently Asked Questions

    • What is Petticoat Junction?
    • Petticoat Junction is the name of a small town and the setting for a television show. It revolves around the folks who live in and around the Shady Rest Hotel, which is a “little hotel” that is owned and operated by Kate Bradley. It is also a stop along the local train line, the Cannonball.
    • What is the Cannonball?
    • The Cannonball is a steam train that runs through Hooterville and makes a stop at the Petticoat Junction’s Shady Rest Hotel. It is piloted by engineer Floyd Smoot, assisted by fireman Charlie Pratt. It’s portrayed as being vital to the community, providing transportation and connection to the outside world.
    • Who is Kate Bradley?
    • Kate Bradley is the central figure of Petticoat Junction. She owns and operates the Shady Rest Hotel. She also manages to keep the other central figures in line, from her family to the local townspeople.
    • Why is Floyd Smoot so unlucky in love?
    • Floyd has a difficult time navigating relationships with women and is portrayed as being unlucky in love. In one episode, his romantic pursuit of a mail-order sweetheart, Camille Tewksbury, ends in disappointment when she sends back his picture. Even when he is convinced he is marrying Kate Bradley, he eventually chooses the Cannonball train over her. He is portrayed as being naive, awkward, and susceptible to suggestion.
    • What role does Uncle Joe Carson play in the community?
    • Uncle Joe Carson is a somewhat hapless but enthusiastic character who lives at the Shady Rest Hotel and offers unsolicited and often unhelpful advice. He often tries to scheme ways to improve things, but his plans usually backfire. He takes great interest in the local football team and makes attempts to improve their performance.
    • What’s with the Hooterville football team’s losing streak?
    • The Hooterville football team is portrayed as being chronically unsuccessful, consistently losing games by wide margins. Their ineptitude becomes a running joke, and Uncle Joe often comes up with outlandish plans to improve their performance.
    • What do the citizens do for fun in the community?
    • The show highlights small-town life with simple pleasures and community involvement. The people of Petticoat Junction enjoy activities such as church socials and local football games. Gathering at Sam Drucker’s store is a popular pastime. The characters spend a lot of time interacting with each other, whether it is talking to Kate Bradley at Shady Rest or to Sam Drucker at his store.
    • What is the relationship between Kate and Floyd?
    • Kate is a mother-like figure in Floyd’s life. In one episode, Kate attempts to talk him out of a romantic obsession and then, through manipulation, convinces him that he is desirable to the women in the area. Floyd later almost marries Kate and is portrayed as being ready to settle down, but then he chooses to continue piloting the Cannonball.

    Petticoat Junction: Life at the Shady Rest Hotel

    Petticoat Junction is a television show centered around the comings and goings at the Shady Rest Hotel, run by Kate Bradley, and the folks of Hooterville.

    Key aspects of the show from the provided source:

    • The show’s theme song invites viewers to relax and forget their cares at the Junction.
    • The Hooterville football team is notably bad, consistently losing by large margins.
    • Floyd Smoot’s romantic pursuits are a recurring theme, particularly his interest in a mail-order sweetheart named Camille Tewksbury. This leads to humorous situations and romantic complications.
    • Kate’s interactions with Floyd are a key part of the narrative, with other characters trying to give him the confidence to pursue a relationship.
    • The Cannonball train is a vital part of the community, and the characters are very protective of it.
    • Uncle Joe’s schemes often involve the train or attempts to improve the town’s sports teams.
    • Family dynamics are explored through Kate’s relationships with her daughters, who sometimes get involved in town affairs.
    • The community is close-knit, with residents like Sam Drucker involved in the characters’ lives and local events.
    • Humor is derived from misunderstandings, quirky characters, and small-town situations.

    Floyd Smoot: Petticoat Junction’s Romantic Engineer

    Floyd Smoot is a central character in Petticoat Junction, particularly known for his romantic pursuits and his role related to the Cannonball train.

    Key aspects of Floyd’s character and storylines:

    • Romantic interests: Floyd’s romantic life is a recurring theme in the show. He corresponds with a mail-order sweetheart named Camille Tewksbury, which leads to comedic situations when they exchange unflattering pictures.
    • Insecurity: Other characters in the show, like Kate, try to boost Floyd’s confidence so he can pursue a relationship.
    • The Cannonball train: Floyd is closely associated with the train, possibly as the engineer. His dedication to the train is so strong that he chooses it over a potential relationship with Kate.
    • Community involvement: Despite his romantic mishaps, Floyd is involved in the community and often interacts with other characters like Sam Drucker.
    • Character traits: Floyd is portrayed as warm, affectionate, gentle, and sometimes pathetic. Others see him as a handsome, desirable bachelor.

    Kate Bradley of Petticoat Junction

    Kate Bradley is a central character in Petticoat Junction, running the Shady Rest Hotel and interacting with the residents of Hooterville.

    Key aspects of Kate from the source:

    • Hostess: Kate runs the Shady Rest Hotel. The theme song of the show welcomes people to be her guest at the junction.
    • Involved in Floyd’s romantic life: Kate interacts with Floyd and tries to convince him that women in the valley are mad about him, and generally tries to give him the confidence to pursue a relationship. She gets angry when she finds out about Floyd’s mail-order sweetheart, Camille Tewksbury.
    • Sought for advice: Floyd is urged to talk to Kate and seek her advice.
    • Object of affection: Jake comes to declare his love for Kate. At one point, Floyd almost marries her.
    • Community involvement: Kate is involved in the community, and the community is concerned that she is marrying Floyd.
    • Mother: Kate’s daughters try to get Floyd to explain the birds and the bees. She also urges her daughter Betty Jo to do her homework because her grades aren’t good.
    • Protective: Kate hides Joe’s Indian carving so it won’t give the hotel a bad name. She also won’t let her daughter be a “scat back” on the football team.
    • Blueberry pie: Both Floyd and Charlie like Kate’s blueberry pie. At the end of one episode, Floyd chooses blueberry pie over Kate.

    Hooterville Football Team: Performance and Community in Petticoat Junction

    The Hooterville football team is a recurring element in Petticoat Junction, typically depicted in a humorous and disparaging manner.

    Key aspects of the Hooterville team mentioned in the source:

    • Poor performance: The team consistently loses its games by significant margins. Examples from the source include losses of 63-0 and 72-0.
    • Lack of coaching: Hooterville does not have a coach.
    • Sam Drucker’s involvement: Sam Drucker is invested in the team and commiserates with the players after their losses. He also expresses a desire to confront the opposing team’s coach.
    • Attempts to improve: There are attempts to improve the team’s performance, including a plan to use Floyd in the backfield and Uncle Joe’s surprise of using Betty Jo as a “scat back,” though Kate does not allow this.
    • Community interest: Despite the team’s poor performance, the community is interested in the games, with merchants prepared to pay a bonus, not for winning, but if Uncle Joe resigns.
    • Skull practice: After a loss, Uncle Joe says the team needs more skull practice.

    Petticoat Junction: The Blueberry Pie

    Blueberry pie is a dessert mentioned in Petticoat Junction, particularly associated with Kate and with Floyd.

    Key aspects of the blueberry pie from the sources:

    • Kate’s baking: Kate bakes fresh blueberry pie.
    • Charlie’s enjoyment: Charlie says blueberry pie is his favorite.
    • Floyd’s temptation: Floyd says he couldn’t resist the smell of the blueberry pie.
    • Floyd’s choice: At the end of one episode, Floyd chooses blueberry pie over a relationship with Kate. He finishes third “behind blueberry pie”.
    🚂 Petticoat Junction 01×05 – The Courtship of Floyd Smoot

    The Original Text

    [Music] come ride the little train that is rolling down the tracks to the junction forget about your cares it is time to relax at the junction junction there’s a little hotel called a shady rest at the junction [Music] it is run by kate come and be her guest at the junction petticoat [Music] junction [Music] [Music] [Music] [Music] [Music] you burning them tires again i just burned the loose ones charlie besides they burn hot and i’m in a hurry to get to hooterville you keep taking up them ties we won’t make the next curve pretty straight from here hold on charlie there’s something waiting for me in sound drucker store another letter from that lonely hearts woman camille tewksbury it’s more than a letter this time i sent her my picture where in my store of all hairs [Music] now there’s something you can burn and the quicker the better [Music] come on kids cheer up can’t win them all well do we have to lose them all hi betty joel hi mr drucker i uh take it hooterville dropped another one 63 to nothing well look on the bright side it’s not as bad as last week thanks anyway sam i guess you heard the score huh yeah looks like the team’s improving how’d my boy herbie bates do do you give a good account of himself maybe you better ask him here he comes i’m sorry mr drucker we lost again oh now don’t feel too bad herby it’s how you play the game that counts and from the looks of you you really gave him a scrap i didn’t even get off the bench or herbie if you were just warming the bench how did all this happen the team ran over me on the way to shower they were in a hurry to get off the field and who can blame them oh here comes that hooterville coach have i got a few words to say to him no sam take pity on the poor man his team just lost 63 to nothing last week it was 72 to nothing the week before that it [Music] was hermie you were pathetic how can you blame herbie you didn’t even let him play no sam that was my tragedy he was supposed to run up and down the sidelines and keep the pixie boys worried wondering when i’d send him in instead of that he worried our boys they were afraid i would send him in cheer up herbie that’s right you’ll get to play next week you’re the one that’s pathetic why don’t you resign you know uncle joe this might be a good time you could retire with a perfect record a perfect record an unbroken string of defeats i got news for you monday morning quarterbacks we ain’t gonna lose next week oh what makes you think so we ain’t gonna play train’s coming mom just turn her whistle for the station oh we’ve got groceries in the back room come on ten come on coach order resign now sam did the mail come oh yeah but i ain’t had a chance to sort it yet it’s in a sack back in the hi floyd hi friends hi floyd hi i didn’t know floyd could move that fast what if i get by using him in the hooterville backfield come on uncle joe yeah we better not stay here too long you know there’s an angry mob farming when i left the football field let’s get into floyd [Music] controller patch come in real handy i might have you wear those all the time [Music] i’m afraid floyd’s got himself woman trouble floyd’s smooth that durn fool jumped down out of the cab before i could stop the train he’s going to kill himself over that woman what woman what what camille tewksbury camille tewksberry she must be new in town well she’s not local kate she’s one of them mail order sweethearts floyd met her through the lonely hearts column of a magazine oh well there’s nothing wrong with that what does she look like nobody knows including floyd but floyd sent camille his picture and today he figures he’ll get sent her picture and that’s why he’s back there rooting through the mail site well i hope for floyd’s sake she’s nice looking come on come on let’s get that train rolling bad as this is better than riding out of town on one rail yeah but i’m gonna jump i want it uh i see floyd on the train floyd come on let’s get rolling [Music] [Applause] got your letter from camille huh yeah let’s see her picture floyd you got yourself a dogger i warned you come in betty joe next time you listen to me ah floyd don’t pay any attention to him the worst mistake a man can make is to marry a beautiful woman they’re nothing but trouble you remember that yeah i’m serious floyd the plain ones make the best wives in fact the homie are the better yeah oh come on floyd now she can’t be that bad ugh holy smoke she is it looks like a man wearing a wig it is what that’s my picture she sent it back oh floyd listen wait a minute [Music] [Music] you know something betty joe i’ll bet you’re the only kid in the country that drives a train home from school charlie what could have happened to floyd usually he’s up here every five minutes checking on us love life got derailed i bet he’ll get over it who’s he in love with well he’s been writing letters to some woman a camille tewksbury but they exchanged pictures and ended that gee i hope he is taking it too hard oh don’t worry about it honey you’re mowing sam drucker’s back there with him they’ll switch him back on the track floyd come on now unlock this door go away leave me alone that’s no use kate he’s just gonna set out there and pine over camille tewksbury sam let me try something moon i got a bone to pick with you now you unlock that door or i’m climbing out this window all right no no hey don’t do that i’ll open the door you traitor you double crosser you love pirate so we’re not good enough for you huh the women in this valley that’s who all these years you’ve been leading us on keeping us dangling on a string playing fast and loose with our hearts me yes you and now that you’ve reached your prime and we’re all wondering who’s going to be the lucky woman you turn your back on us and you take up with an out-of-towner you’re nothing but a typical playboy bachelor floyd smoot me you don’t don’t act innocent with me i’ve heard all about you and that camille tewkesbury but let me tell you something at the next church social there isn’t a woman in this valley that’ll sell you a cake or a box lunch we’ve given you the best years of our lives now you’re tossing us aside like a bunch of old shoes please no i’m i’m just not gonna cry because you’re not worthy of my tears listen i i don’t understand listen kate open the door i want to talk to you no sir you stay out there and cool your hot blood you fickle casting over you [Music] me well who won the game coach oh pixley managed to squeeze by him 60 free to nothing my boys weren’t able to execute the plays i gave him they need more skull practice pixley was sharp well organized well-trained uncle joe pixley hasn’t even got a coach or neither has hooterville [Music] pixley hasn’t got a coach maybe we could protest the game on that basis [Music] we’re gonna have pigs knuckles and sauerkraut tonight charlie yeah and for dessert i’m gonna bake fresh blueberry pie blueberry pie that’s my favorite i don’t remember inviting you to suffer floyd’s mood so why don’t you put on your mail-order court in here and go call in on your mail-order sweetheart and maybe she’ll cook you up a mail-order supper but kate i always eat supper here that was before i found out what kind of a man you are what kind of a man am i kate tell me again you’re nothing but a fickle heartbreaking casanova that’s what come on chuck [Music] kane wants to hold supper for a while see if floyd shows up wait a minute joel the big game with elm cities coming up now the merchants of hooterville are prepared to pay you a cash bonus as good as mine i’m gonna win that game the bonus ain’t for winning it’s for resigning i’ll chip in on that now look here just a minute you boys i got a surprise ready for that elm city team the hooterville hornets are gonna really sting worse than today oh sting what’s the surprise joe speed in the backfield i got a brand new scat back under graphs a little bit faster than jack rabbit them big heavy on city boys ain’t going to lay a hand on her all right i mean him yeah the scan back uncle joe you’re not by any chance talking about betty belly betty who betty my fast-running daughter that’s who oh kate i know you’d never stand still for a thing like that would you you better believe i wouldn’t neither would she neither would hooterville i’ll bet the elm city boys would go for it i was just joking but victory don’t mean that much to me by the way where is betty jo how would i know betty jones you better let her do her homework you know her grades aren’t going to be did you talk her into it counts you’re into what no he didn’t but mom we’ve just got to win one game and uncle jokes says that with me as a scat back you scat back upstairs and take off that uniform sorry coach i was ready and willing well i wasn’t nice pass mom forget it can’t we start serving now everything’s ready yeah bring on the pigs knuckles and sauerkraut well then no let’s just wait a little longer for floyd hey kate i sure thought you had him cured when you told him all the women in the valley were mad about him what oh it’s smooth mother well i was just trying to help him mend his broken heart by convincing him he had sex appeal i guess he didn’t believe me well there’s only so big a lie a man can swallow that’s a whopper oh you two are a great help all that floyd needs is enough people to tell him that he’s a handsome desirable bachelor and he’ll be fine well go ahead girls get the blueberry pies out of the oven i don’t think floyd’s coming handsome desirable bachelor he’s pathetic joe floyd is warm he’s affectionate he’s gentle he’s pathetic floyd what are you doing oh i just couldn’t resist the smell of the blueberry pie but please don’t tell your mom i was here she’s mad at me mom mad at you why she’s crazy about you all the women are but especially mom and who can blame her you’re strong you’re handsome and you’re charming and you’re intelligent why you’re everything a woman could ask for then how come no woman’s ever asked for it [Music] well because it’s a man’s place to do the asking not a woman’s yes a woman’s place is in the kitchen now mom’s waiting for you in the dining room thanks you girls you set me straight [Music] i still say you men could help give floyd some self-confidence if you would jake bradley come to my arms what you’re my woman and i’m your man let’s do something about it right smoked have you been into mccook and sherry no kate i don’t need no false courage i got love [Music] now kate don’t you cry i know you’ve waited a long time for this but it’s come at last i want to marry you how about a judge drucker well uh don’t you think you better wait for kate’s answer yes you heard her she said yes hey somebody hid my indian again don’t they realize that was carved out of living oak by my great-great-great uncle kit carson thing like that gives the hotel oh class if that pig had known what kate was going to do to his knuckles he’d have died happily hey that’s my chair [Music] hey droid out here i can’t find him anyplace why are you in a hurry to marry him of course not but kate’s in her spot she might need our help if you find her asking where she hid my indians oh betty joel you got any idea where floyd and your mother disappeared to no mr drucker last i saw floyd he was going down to the train to get his guitar no no poor kate there he is yeah but where where is he can anybody tell what direction that’s coming from no but i hope the wind changes them knuckles i eats clinching into a fist maybe he’s down at the railroad track come on betty job [Music] whoo i’m glad that’s over hold it charlie what’s the matter now let’s start over again when i rock you rock now nothing makes me head you’re people rocking against me listen girls we’ve got to scare floyd out of the notion of wanting to marry me now i got an idea you know how shy and bashful he is not tonight well not with me but when he comes back to help you with the dishes i’m going to leave and i want you to ask him to explain about the birds and the bees oh don’t worry you’ll run like a scared rabbit oh blanca come in come in floyd the girls are so happy you’re gonna help them with the dishes aren’t you girls so we sure are daddy floyd oh bailey why daddy floyd you’re blushing from ear to ear you don’t mind us calling your daddy but we never had a father to tell us things and there’s something we’d like you to tell us right now what’s that all about the birds and the bees you don’t know about the birds and the bees well i guess you’re old enough to learn about the birds and the bees the birds i’ll tell you about the birds first and they go like this and you take the bees they do it all together i mean the sound now you don’t want to get them mixed up because the bees are stinging just remember the birds is bigger and goes like that bees is littler and goes like this you sure you don’t know about this we’ve been looking everywhere where’s floyd oh he’s in the kitchen but he’s gonna come running in here any minute what are you playing hide and seek he’s gonna wanna hide because bobby joe and billy joe are asking daddy floyd to explain about the birds and the bees oh this i gotta hear you go get your books and do your homework this i gotta hear me too but you don’t think that with floyd it’s possible lied just a minute kate girls have you learned enough yes [Music] i’ll be back later tell you about the pigs and the chickens should i bring my guitar no no no i just want to talk to you what am i gonna do sam ma yes dear were either one of you good in history i’m having an awful time yes kate dear here’s the man who’s going to help you with your homework from now on your daddy floyd yeah boy now that’s something you got to get used to helping the girls with their homework my teacher always said i was exceptional bright what do you want to know betty uh right now we’re studying the mexican war are we having a while with mexico floyd the mexican war started over a hundred years ago and we’ve whipped him yet [Music] dog got it charlie if you’re gonna rock rock don’t sit there side saddle i’m sorry joe all right now we’ll get a fresh start a one a two a rock [Music] listen boys if we don’t help kate she’s liable to wind up being mrs floyd smoot terrible name it’s pathetic oh i have no objection to the name i do this is smooth sounds like a truck dumping wet cement i’ll tell you what kate i can fire up the cannonball and sneak you into pixley that’ll give floyd overnight to kind of sleep off this love binge he’s on flight wouldn’t sleep awake worried about his train boys you’ve done it i’ve done what solve the problem listen [Music] i’m glad you changed your mind about a moonlight stroll you sure you don’t want me to go get my guitar no floyd we have things to talk about nothing uh you’ll run the hotel oh sure when i’m not running a train oh well i don’t think you’ll have time for both but kate i’m the only one that knows how to follow them why they ain’t nothing to fire in a locomotive boys easiest job on earth come on i tell you what joe you fired one day and i’ll fire at the next sounds good to me sam get down on there and get your grimey hands off of my wood why lord you won’t have time for the cannonball no more you’re going to be a married man you’ll be busy firing kate stoke ain’t nobody going to fire the hood of el cannon ball but me now get yourself down out of there why lloyd did i hear right you mean you’re choosing the train instead of me now kate please don’t go to balling again dog gonna if i got to make a choice uh and i’m gonna be very brave about it i guess you were a bachelor too long to get snared now kate i’ll take you back to the hotel i’ll come along me too me too oh no floyd you stay here with your first and only love it in my only love kate please let me go along no ploy in that final well then kate what is it floyd will you bring me back down some blueberry pie sure i will how do you like that i finished third behind blueberry pie [Music] so [Music] this has been a filmways presentation [Music]

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • The Autobiography of Bertrand Russell, 1872-1914

    The Autobiography of Bertrand Russell, 1872-1914

    The text comprises excerpts from Bertrand Russell’s autobiography, covering the period from his childhood to 1914. It details his early life, including family relationships, intellectual development, and education. Russell recounts experiences at Cambridge, his relationships with prominent figures like Whitehead and Moore, and his evolving philosophical views. The autobiography also explores Russell’s personal life, including his first marriage and romantic interests, interwoven with his professional endeavors. Furthermore, the text provides glimpses into his social and political engagements, such as his involvement in the women’s suffrage movement and his evolving stance on pacifism. The selected passages offer a multifaceted view of Russell’s formative years, revealing the personal and intellectual forces that shaped his life and thought.

    Bertrand Russell: A Study Guide to His Autobiography (1872-1914)

    I. Quiz

    Answer each question in 2-3 sentences based on the provided excerpts.

    1. What role did Bertrand Russell’s grandmother, Lady Russell, play in his upbringing?
    2. Describe one anecdote from Russell’s childhood that illustrates his curiosity or precociousness.
    3. What was Russell’s initial attitude toward religion during his adolescence, and what influenced this view?
    4. What did Russell study at Cambridge University?
    5. Who was Alys Pearsall Smith, and what was her significance in Russell’s life during this period?
    6. What is the “Society” that is mentioned in some of the included letters, and what was its significance to Russell and his peers?
    7. What is Principia Mathematica, and why is it important?
    8. What political and social issues was Russell involved with or interested in during the early 1900s?
    9. Name some of the prominent intellectuals or figures who corresponded with Russell, as revealed in the letters.
    10. What was Russell’s attitude toward World War I?

    II. Quiz Answer Key

    1. Lady Russell became Bertrand’s guardian after his parents’ death and instilled in him the values of Victorian aristocracy, including a rigorous home education. She shaped his early intellectual development and social outlook.
    2. The story of the sponge cake exemplifies his early desire for something sweet. He remained still for the photograph, which was wholly successful, but he never got the sponge cake.
    3. Russell initially questioned religious dogma and began doubting traditional beliefs. His reading and intellectual development led him to question many conventional views.
    4. Russell studied mathematics at Cambridge University, excelling in the subject and eventually earning the title of Seventh Wrangler.
    5. Alys Pearsall Smith became Russell’s first wife and was a significant intellectual and emotional influence on him. Their relationship marked a departure from his upbringing and shaped his views on love and marriage.
    6. The “Society” was a Cambridge discussion group where Russell and his peers debated philosophical and moral issues. It was a space that challenged intellectual debate among bright young men.
    7. Principia Mathematica is a monumental work of mathematical logic co-authored by Russell and Alfred North Whitehead. It was a key publication that attempted to derive mathematical truths from logical axioms and was an incredibly influential publication.
    8. Russell was concerned with issues like pacifism, social reform, and women’s suffrage. These pursuits reflected his growing social consciousness and commitment to progressive causes.
    9. Russell corresponded with figures such as Gilbert Murray, Logan Pearsall Smith, and Beatrice Webb, engaging in discussions on diverse topics from ethics to politics.
    10. The source material does not reveal Russell’s attitude toward World War I.

    III. Essay Questions

    1. Discuss the influence of Bertrand Russell’s family, particularly his grandmother, on his intellectual and personal development as portrayed in the autobiography.
    2. Analyze the role of Cambridge University and the intellectual climate there in shaping Russell’s philosophical and mathematical pursuits.
    3. Examine the significance of Russell’s relationships, including his marriage to Alys Pearsall Smith, in understanding his evolving views on love, society, and personal freedom.
    4. Explore the ways in which Russell’s autobiography reflects the broader social, political, and intellectual currents of the late 19th and early 20th centuries in England.
    5. Trace the development of Russell’s philosophical and political ideas as presented in the autobiography, focusing on key influences, turning points, and recurring themes.

    IV. Glossary of Key Terms

    • Alys Pearsall Smith: Bertrand Russell’s first wife, an American Quaker, and a significant intellectual and emotional influence on him.
    • Cambridge Apostles (The Society): A secret intellectual society at Cambridge University, known for its members’ progressive and often unconventional views.
    • Fabian Society: A British socialist organization founded to advance socialist principles through gradual reform, rather than revolution.
    • Freethinker: A person who forms their own opinions about religion and other matters, rather than accepting what they are told.
    • Gilbert Murray: A prominent classicist and intellectual who corresponded with Russell on topics ranging from Greek tragedy to ethical theory.
    • Lady Russell: Bertrand Russell’s paternal grandmother, who raised him and instilled in him a strong sense of Victorian morality and intellectual rigor.
    • Logan Pearsall Smith: An American-born essayist and critic who was Alys Pearsall Smith’s brother and a friend of Russell’s, known for his witty and insightful letters.
    • Principia Mathematica: Russell’s most important publication and a landmark achievement in the history of logic and mathematics.
    • Seventh Wrangler: An academic rank achieved by Russell in the Cambridge University Mathematical Tripos examination.
    • Unitarianism: A liberal religious movement that emphasizes reason, individual conscience, and the inherent worth of every person, rejecting traditional doctrines such as the Trinity.
    • Whig: An old political party that favored reform.

    Bertrand Russell: Formative Years, Philosophy, and Relationships

    Okay, here is a briefing document summarizing the main themes and ideas from the provided excerpts of Bertrand Russell’s autobiography:

    Briefing Document: Bertrand Russell’s Autobiography (1872-1914)

    Source: Excerpts from “The Autobiography of Bertrand Russell – 1872-1914”

    Main Themes:

    • Early Life and Family: The excerpts cover Russell’s childhood, adolescence, and early adulthood, focusing on his upbringing by his grandmother, his relationships with family members (often complex and sometimes strained), and his intellectual development.
    • Intellectual Development & Cambridge: A significant portion focuses on his time at Cambridge, his engagement with mathematics and philosophy, his relationships with influential figures like Ward and Moore, and his involvement in intellectual societies.
    • Personal Relationships and Marriage: Russell’s relationships, especially his courtship and early marriage to Alys Pearsall Smith, are explored, revealing the emotional and intellectual dynamics of these connections.
    • Social and Political Engagement: The excerpts touch upon Russell’s evolving political views, including his interest in socialism, his engagement with social reform movements (like those associated with the Webbs), and his developing pacifist stance.
    • Religious and Moral Doubts: Russell grapples with questions of religion, morality, and the existence of God, particularly during his adolescence, documenting his shift away from traditional religious beliefs.
    • The Development of his Philosophical Work: The excerpts show Russell’s initial philosophical pursuits, including his work on geometry and economics, which developed into mathematical philosophy.

    Key Ideas and Facts:

    • Orphaned and Raised by Grandmother: Russell was orphaned at an early age and raised by his grandmother, Lady Russell, under strict Victorian principles. This upbringing significantly shaped his early life and intellectual development.
    • “Bertrand Russell was orphaned at an early age. His grandmother, Lady Russell, reared him according to the tenets of Victorian aristocracy.”
    • Early Education and Intellectual Curiosity: He was tutored at home and displayed early intellectual curiosity, engaging in discussions on scientific matters and religious questions.
    • “He talked to me about scientific matters, of which he had considerable knowledge… I remember asking him once why they had coloured glass in church windows.”
    • Family Eccentricities: The Russell and Stanley families were filled with eccentric characters, including a Mohammedan uncle, a Roman Catholic priest, and a relative obsessed with esoteric Buddhism.
    • “Her eldest son was a Mohammedan, and almost stone deaf. Her second son, Lyulph, was a free-thinker… Her third son, Algernon, was a Roman Catholic priest.”
    • Religious Skepticism: Russell’s diaries from his adolescence reveal his grappling with religious faith and his questioning of traditional arguments for God’s existence. He questioned the need for a divine power, the existence of miracles, and immortality.
    • “I think they are only attributable to a divine controlling power, which I accordingly call God.”
    • “For if God is the maker of the laws, surely it would imply an imperfection in the law if it had to be altered occasionally.”
    • Cambridge and Intellectual Influences: Cambridge was a pivotal experience for Russell, where he encountered influential thinkers like G.E. Moore and James Ward, and engaged in intense intellectual discussions within societies like “The Society.”
    • “We were still Victorian; they were Edwardian. We believed in ordered progress by means of politics and free discussion”
    • The “Principia Mathematica”: The excerpts document the early stages of Russell’s work on his monumental Principia Mathematica, a landmark achievement in mathematical logic.
    • Early Social and Political Views: Russell was exposed to socialist ideas and engaged with social reform movements, influenced by figures like Beatrice and Sidney Webb.
    • Relationship with Alys Pearsall Smith: The autobiography details Russell’s growing affection for and eventual marriage to Alys Pearsall Smith, showcasing the intellectual and emotional connection between them, as well as the disapproval from his family due to religious differences.
    • “I came of age in May 1893, and from this moment my relations with Alys began to be something more than distant admiration.”
    • Pacifism: His developing pacifist views are evident. “But I think it is a good thing that we should win diplomatically, if possible, without a…”

    Notable Quotes:

    • “What? Destroy my library? Never!” (Illustrates a character’s priorities and perhaps a certain intellectual detachment.)
    • “Passion” (Lytton Strachey’s single-word answer to what literature should aim at.)
    • “the world is too serious a place, at times, for the barriers of reserve and good manners.”
    • “Die to Self is an old maxim; Love thy neighbour as thyself is new in this connexion, but also has an element of truth.”

    Overall Impression:

    The excerpts provide a glimpse into the formative years of Bertrand Russell, highlighting his intellectual curiosity, his evolving philosophical and political beliefs, and the complex relationships that shaped his life. The autobiography reveals a mind grappling with fundamental questions of existence, morality, and the role of the individual in society.

    Bertrand Russell: Life, Philosophy, and Social Impact

    What were Bertrand Russell’s early influences and how did they shape his intellectual development?

    Bertrand Russell was orphaned at a young age and raised by his grandmother, Lady Russell, according to strict Victorian aristocratic principles. He was tutored at home, which allowed him to pursue his own intellectual interests without the constraints of a traditional schooling environment. He had an uncle who was scientifically inclined that exposed him to scientific matters and encouraged his intellectual curiosity from an early age. He studied Georg Cantor’s Mannichfaltigkeitslehre, and Gottlob Frege’s Begriffsschrift, which would later become the basis of his work in mathematical philosophy.

    How did Bertrand Russell’s views on religion evolve throughout his life?

    Russell’s early religious upbringing was rooted in the Christian faith, influenced by his family. However, as he matured intellectually, he began to question and eventually reject religious dogma. His notes during adolescence reflect his internal debates about God, free will, and immortality, questioning the reasonableness of religious reasoning and the existence of miracles. Over time, Russell transitioned to a more secular worldview, prioritizing reason and evidence over faith-based beliefs. Despite his rejection of traditional religion, Russell expressed a deep interest in mystical experiences.

    What role did mathematics and logic play in Bertrand Russell’s philosophical pursuits?

    Mathematics and logic were central to Russell’s philosophical work. He believed that philosophical problems could be clarified and resolved through the application of logical analysis and mathematical principles. He viewed mathematics as a source of intellectual stimulation. Russell worked to establish philosophy as a rigorous and systematic discipline, emphasizing precision and clarity of thought. Principia Mathematica, co-authored with Alfred North Whitehead, was a landmark achievement in this endeavor.

    How did Bertrand Russell engage with political and social issues of his time?

    Russell was deeply involved in the political and social issues of his time, advocating for causes such as socialism, free trade, and women’s suffrage. He admired figures like Lloyd George and supported policies aimed at social reform and economic equality. Russell was also a vocal critic of imperialism and militarism, particularly during World War I, which led to imprisonment due to his pacifist stance. He was associated with the Fabian Society.

    What were some of the key relationships that influenced Bertrand Russell’s life and work?

    Russell had a complex and rich web of relationships that significantly influenced his life and intellectual development. His relationships with his grandmother, figures like G.E. Moore and the Cambridge Apostles, and his first wife Alys Pearsall Smith, all played important roles. His correspondence with individuals like Gilbert Murray and Lucy Martin Donnelly reveals the depth of his intellectual and emotional connections.

    What were Bertrand Russell’s views on pacifism and war?

    Russell was a committed pacifist, particularly during World War I, when his opposition to the war led to social ostracism and imprisonment. He was critical of imperialism and militarism, advocating for peaceful solutions to international conflicts.

    How did Bertrand Russell approach the writing and development of his philosophical works?

    Russell often engaged in extensive correspondence with friends and colleagues, discussing and refining his ideas. He was deeply involved in the writing of Principia Mathematica, a monumental work in mathematical logic, and sought feedback and collaboration from others. He also wrote on a variety of philosophical and social topics.

    What were some of the personal struggles and challenges that Bertrand Russell faced throughout his life?

    Russell experienced personal struggles related to family relationships, romantic relationships, and intellectual pursuits. His upbringing as an orphan and his strained relationship with his grandmother created emotional challenges. He had difficulty driving self-love from this entrenchment. He faced social isolation and criticism due to his political views and unconventional lifestyle.

    Bertrand Russell: Early Life and Cambridge

    Bertrand Russell’s early life was marked by the loss of his parents at a young age and subsequent upbringing by his grandmother, Lady Russell. Here’s a summary of his childhood and adolescence:

    Childhood:

    • Early Memories: Russell’s first vivid recollection was arriving at Pembroke Lodge in February 1876. He recalls tea in the servants’ hall and being the subject of discussion among notable persons.
    • Family:
    • His parents were considered radicals and admired those opposing slavery in America.
    • His mother wrote a description of him shortly after his birth, noting his resemblance to his brother Frank.
    • He was made a ward in Chancery and given to his grandparents.
    • Pembroke Lodge:
    • The Lodge had eleven acres of garden that played a large part in his life up to the age of eighteen.
    • He spent time collecting bird eggs and meditating in the garden.
    • Grandparents:
    • His grandfather, whom he remembered as a man well past eighty, died when Russell was six.
    • His grandmother was well-read in English literature and history but lacked understanding of reasoning. She was austere and unworldly.
    • Aunts and Uncles:
    • His Uncle Rollo stimulated his scientific interests by discussing the effects of the Krakatoa eruption.
    • His Aunt Agatha taught him English Constitutional history.
    • Education:
    • He had German nursery governesses and spoke German fluently.
    • His aunt taught him lessons on colors and reading.
    • Nature: Russell had an increasing sense of loneliness, and nature, books, and mathematics saved him from despondency.
    • Unhappiness: He wished his parents had lived and his grandmother told him it was fortunate that they had died.

    Adolescence:

    • Loneliness and Secrecy: Russell’s adolescence was lonely and unhappy, marked by the need to keep his emotional and intellectual life secret.
    • Interests: He was divided between sex, religion, and mathematics.
    • Sexuality:
    • He first learned about sex at twelve and considered free love the only rational system.
    • He experienced intense sexual passions at fifteen and began masturbating.
    • Religion:
    • He was taught Unitarianism but began questioning fundamental Christian beliefs at fifteen.
    • He read Gibbon and Milman, which influenced his views.
    • Social Life: He was shy, awkward, and well-behaved, envying those who could manage social interactions.
    • Education:
    • He was sent to an Army crammer at Old Southgate to prepare for a scholarship examination at Trinity College, Cambridge.
    • He read Mill’s Political Economy and Herbert Spencer.
    • Politics and Economics: Aunt Agatha introduced him to Henry George’s books, and Russell believed that land nationalization would secure the benefits that Socialists hoped to obtain from Socialism.
    • Loss of Faith: Russell felt disappointed and pained by thought.
    • Rule of Life: To act in the manner he believed to be most likely to produce the greatest happiness, considering both the intensity of the happiness and the number of people made happy.

    Cambridge:

    • Mathematics: Russell went to Cambridge because of his interest in mathematics.
    • The Apostles: Russell was elected to The Society in 1892. The Society was supposed to be The World of Reality; everything else was Appearance. People who were not members of The Society were called “phenomena”.
    • Friends: He made friends through Whitehead’s recommendation.
    • Logic: Keynes’s father taught old-fashioned formal logic in Cambridge.

    Bertrand Russell: Cambridge Years, Intellectual Growth, and Friendships

    Bertrand Russell’s time at Cambridge University was a transformative period in his life, shaping his intellectual development and providing him with lifelong friendships.

    Reasons for Attending & First Impressions:

    • Russell chose Cambridge because of his interest in mathematics, although his father had attended Cambridge and his brother was at Oxford.
    • His initial experience involved staying in New Court rooms for scholarship examinations in December 1889.
    • Shyness prevented him from exploring the grounds, but he was invited to dine with the Master.
    • He obtained a minor scholarship, marking his first opportunity to compare himself with able contemporaries.

    Relationships & Social Life:

    • Cambridge provided Russell with friends and intellectual discussions.
    • He became very sociable and found that being home-schooled was not an impediment.
    • Congenial society helped him become less solemn.
    • He initially assumed there were many clever people at the university, but he later realized he knew the cleverest people by his second year.
    • In his third year, he met G. E. Moore, who he considered a genius for some years.

    The Society (The Apostles):

    • Election: Russell was elected to The Society early in 1892.
    • Secrecy: The Society was secretive to keep potential members unaware of consideration for election.
    • Purpose: It was a small discussion society that included one or two people from each year.
    • Membership: It has included most people of intellectual eminence at Cambridge since 1820.
    • Impact: It allowed him to get to know the people best worth knowing, such as Whitehead.
    • Principles: Discussions had no taboos or limitations, allowing absolute freedom of speculation.
    • Habits: Meetings typically ended around one o’clock at night, followed by discussions in the cloisters of Neville’s Court.
    • Values: Members valued intellectual honesty.
    • Beliefs: The Society considered itself “The World of Reality”; non-members were “phenomena”.

    Academic & Intellectual Development:

    • Russell had been interested in philosophy before Cambridge but hadn’t read much beyond Mill.
    • He desired a reason to believe mathematics true, finding Mill’s arguments inadequate.
    • His mathematical tutors hadn’t shown him why calculus wasn’t a tissue of fallacies.
    • He discovered Continental work that addressed his mathematical questions after leaving Cambridge.
    • He initially struggled with understanding Kant’s Critique of Pure Reason.
    • By his fourth year, he had become more lighthearted and flippant, even jokingly proclaiming himself as God.
    • He enjoyed exploring different philosophical perspectives.

    Teaching & Dons:

    • He derived no benefit from lectures as an undergraduate and vowed not to assume lecturing did any good when he became a lecturer.
    • He regarded the dons as unnecessary figures of fun as an undergraduate.
    • He later found them to be serious forces of evil when he became a Fellow and attended College meetings.

    Friends Made at Cambridge

    • Another friend during his Cambridge years was McTaggart, a philosopher even shyer than Russell.
    • Two other friends he met in his early days in Cambridge were Lowes Dickinson and Roger Fry.
    • From his first moment at Cambridge, in spite of shyness, he was exceedingly sociable.
    • He became friends with Whitehead who told the younger members to investigate Sanger and Russell.

    Changes & Disappointments:

    • By the time Russell was in his fourth year, he had ceased to be the shy prig that he was when he first went to Cambridge.
    • During his time at Cambridge he began to disagree with his people in everything except politics.
    • He gradually unlearned habits of thought acquired there, considering most of what he learned in philosophy erroneous.
    • He valued intellectual honesty but realized its limitations, even at Cambridge.

    Overview: Cambridge provided Russell with friends, experience of intellectual discussion, and the development of intellectual honesty.

    Bertrand Russell: Love, Marriage, and Relationships

    Bertrand Russell’s life included a complex web of romantic relationships, marked by intense emotions, intellectual connections, and evolving views on love and marriage.

    Three Passions Russell said that three passions governed his life: love, knowledge, and pity. He sought the union of love as a prefiguring vision of heaven and found it at last.

    Alys Pearsall Smith

    • Early relationship: Russell’s relationship with Alys began to develop after he turned 21.
    • Engagement: They discussed divorce and free love. Although deeply in love, he felt no conscious desire for physical relations and he felt his love had been desecrated by physical contact.
    • Marriage Plans & Family Objections: Russell faced strong opposition from his family regarding his relationship with Alys. His grandmother’s objections echoed those she had voiced against his father’s relationship. Russell disregarded his family’s concerns, relying on his inheritance of £20,000 from his father.
    • Concerns About Heredity: There were concerns, based on medical opinion, about the couple having children due to heredity. Russell was initially willing to break off the engagement because he greatly desired children. Alys, however, did not have a strong wish for children.
    • Paris: Russell accepted a position in Paris, partly to appease his grandmother, during which time Alys grew jealous of his interactions with her sister.
    • Intimacy: Russell and Alys grew increasingly intimate, but his family continued their attempts to end the relationship.
    • Sexuality: Neither Russell nor Alys had previous experience of sexual intercourse when they married.
    • Differing Views on Sex: Alys believed that sex was beastly, that all women hated it, and that men’s brutal lusts were the chief obstacle to happiness in marriage. She believed intercourse should only take place when children were desired. Because they decided not to have children, she had to modify her position.
    • Marriage: Russell shared his thoughts about his upcoming marriage. He felt a pure joy of mutual love, but also feared the terrible things life may bring to his wife.
    • Travels: They spent the first three months of 1895 in Berlin, where Russell studied economics at the university and they attended concerts.
    • End of the Relationship: Russell lost his instinctive sexual impulse toward Alys. He began to criticize her morally. He eventually confessed that he no longer wished to share a room. Despite the lack of attraction, Russell attempted sex with Alys twice a year to alleviate her misery.
    • Separation: Those around them had noticed the waning affection in their relationship.

    Other relationships

    • Housemaid: Russell had kissed and hugged a housemaid and asked her to spend the night with him, but she refused.
    • Sally Fairchild: Russell became very fond of Sally Fairchild, but he did not consider himself in love with her. He never kissed her hand.
    • Lady Ottoline Morrell: Russell had intense feelings for Ottoline, and did not care what might be involved, even if her husband murdered them both. Their conditions were that they should never spend the night together.
    • Girl from Chicago: While in America, he met a girl from Chicago, and they agreed to live together openly, perhaps marrying later on if a divorce could be obtained.
    • Other Women: He wished to make love to another woman, but thought he ought to explain about Ottoline first.
    • Friendship with Lucy Donnelly: Lucy suffered profoundly when Helen, a mutual friend, became engaged. Russell attempted to comfort her.

    Views on Marriage and Love

    • Russell believed that real life means a life in some kind of intimate relation to other human beings and emotions.
    • He saw the ghastly competition in most marriages as to which partner is to be the torturer, which the tortured.
    • Marriage as an institution was viewed as a social construct designed to fit instinct into a legal framework.
    • Romantic love was viewed as love from a certain distance, sufficient to leave the romance untarnished.
    • His grandmother thought the affection of husbands and wives was sometimes a little selfish.

    Bertrand Russell: Philosophical Views and Development

    Bertrand Russell’s philosophical views evolved throughout his life, encompassing a wide range of topics from metaphysics and ethics to logic and the philosophy of mathematics.

    Early Influences and Development:

    • Russell’s grandmother held an animus against metaphysics, considering the subject nonsensical.
    • At a young age, Russell considered himself a materialist, believing that the human body is a machine.
    • He read extensively, including Mill’s Political Economy and Logic, and initially accepted Mill’s views completely.
    • He also read Herbert Spencer, agreeing with his bias but finding him too doctrinaire.
    • Russell sought a reason for supposing mathematics to be true and found Mill’s arguments inadequate.
    • He found the arguments in Mill’s Logic on this subject very inadequate.

    Beliefs and Doubts:

    • Russell once believed truth was a good thing to get hold of, but later developed doubts and uncertainty.
    • He thought that a theological proposition should not be accepted unless supported by reason.
    • He believed in God and considered himself a theist, seeking scientific arguments for his belief.
    • He also explored the question of free will, considering arguments from the omnipotence of God, the reign of law, and the determination of actions by motives.

    Ethics and Morality:

    • Russell’s “rule of life” was to act in a manner most likely to produce the greatest happiness, considering both intensity and the number of people made happy.
    • He saw primitive morality as originating in the idea of the preservation of the species, but did not believe a civilized community ought to follow this rule.
    • He believed conscience is due to evolution, forming instincts of self-preservation.
    • Russell thought it was a mistake to believe that general maxims are to be found in conscience.
    • He thought the true method of Ethics should be inference from empirically ascertained facts, obtained in that moral laboratory which life offers to those whose eyes are open to it.
    • Russell’s views on ethical subjects put him at odds with Gilbert Murray, who Russell observed was a utilitarian, whereas Russell judged pleasure and pain to be of small importance compared to knowledge, the appreciation and contemplation of beauty, and a certain intrinsic excellence of mind.
    • He was critical of those who hate self-control and make up theories to prove that self-control is pernicious.
    • He also believed the ethical faith which is warranted yields most of what is necessary to the highest life conceivable, and all that is necessary to the highest life that is possible.
    • Russell was known to engage in arguments about ethics.

    Metaphysics and Epistemology:

    • Russell was interested in metaphysics.
    • He found metaphysics interesting and enjoyed the curious ways of conceiving the world that the great philosophers offer to the imagination.
    • Russell also believed that accident led him to read Leibniz, because he had to be lectured upon, and McTaggart wanted to go to New Zealand, so that the College asked him to take his place so far as this one course was concerned.
    • In the study and criticism of Leibniz, Russell found occasion to examine the metaphysics of the subject-predicate form.
    • He was interested to discover the dire effect upon metaphysics of the belief that all propositions are of the subject-predicate form.
    • He thought it philosophically and practically erroneous to believe that philosophy tells us only what is, not what ought to be.
    • Russell thought that to believe that our knowledge is caused by the object perceived depends upon the sensational theory of knowledge.
    • He also thought that circumstances are apt to generate perfectly concrete moral convictions, which may make it impossible to judge beforehand what our moral opinion of a fact will be.
    • Russell believed one’s work is never so bad as it appears on bad days, nor so good as it appears on good days.

    Political and Social Views:

    • Russell initially believed that land nationalization would secure all the benefits that Socialists hoped to obtain from Socialism.
    • He was interested in politics and economics.

    Logic and Mathematics:

    • Russell’s most important work was in mathematical logic, which he initially pursued to find reason to believe mathematics was true.
    • He found Continental work that addressed his mathematical questions after leaving Cambridge.
    • He aimed at bringing discomfort to philosophers; one who favored his outlook might retort that while he pleased the philosophers, he amused the plain people.
    • Russell imagined conversations with Leibniz, telling him how fruitful his ideas have proved, and how much more beautiful the result is than he could have foreseen.
    • Russell translated the Deceased Wife’s Sister Bill into symbolism.
    • Russell constructed Principia Mathematica to seek refuge in pure contemplation.
    • He considered the solution he found to a puzzle involving George IV and Walter Scott threw a flood of light on the foundations of mathematics and on the whole problem of the relation of words to things.

    Relationships with Other Philosophers:

    • Moore influenced Russell to abandon both Kant and Hegel.
    • Russell disagreed with Whitehead in philosophy, so collaboration was no longer possible.
    • Russell found it a great bond to dislike the same things, and dislike is perhaps a deeper indication of our real nature than explicit affections, since the latter may be effects of circumstances, while dislike is a reaction against them.

    Religion:

    • Russell was opposed to Old Kant, who, in his eyes has done much harm and mischief to philosophy, even to mankind.
    • He also thought there was no shred or particle of truth in any of the metaphysics it has suggested.

    Bertrand Russell on War and Pacifism

    Bertrand Russell’s experiences and views regarding war, particularly World War I, are detailed throughout the provided text.

    Early Views and Influences:

    • Russell’s grandmother was a “fierce Little Englander” who disapproved strongly of Colonial wars.
    • She told him that the Zulu War was wicked and largely the fault of Sir Bartle Frere.

    Shifting Views on War:

    • Initially, Russell was a Liberal Imperialist and not a pro-Boer at the start of the Boer War.
    • British defeats in the Boer War caused him anxiety.
    • By early 1901, he became a pro-Boer.
    • In 1901, Russell’s Imperialistic views evaporated during a crisis.

    World War I and Pacifism:

    • At the beginning of the 1914-1918 War, Crompton was solicitor to the Post Office, but his agreement with his wife’s Irish Nationalist opinions made his position untenable, and he was dismissed.
    • Russell began his opposition to World War I at the earliest possible moment.
    • His attitude during the First World War was influenced by a sort of mystic illumination that remained with him.
    • Russell felt that heroic and almost hopeless defiance appeared splendid.
    • Russell was against the war although the old Adam wanted the latter.
    • Opposition to the Entente: He stated his objections to the policy of the Entente, pointing out the likelihood of it leading to war.
    • He found Amery’s blood-lust at the thought of a war with America shocking.
    • Russell was preparing to go to jail to protest Britain’s involvement in World War I.
    • During the war, Russell found that intellectual honesty had its limitations, even at Cambridge, which he regarded as home.
    • Impact on Personal Relationships: Whitehead completely disagreed with Russell’s pacifist position during the First World War, which caused a diminution in their friendship.
    • In the last months of the war, Whitehead’s younger son was killed, which caused him appalling grief and had a great deal to do with turning his thoughts to philosophy.
    • The shock of the war killed Russell’s passion for a girl from Chicago, and he broke her heart.

    Motivations and Beliefs:

    • He was deeply moved by the suffering of others.
    • Russell’s pacifism seemed to have a connection with alcohol.
    • He was against the war and thought it could not be justified unless there was a likelihood of victory.
    • Russell believed that a war could not be justified unless there was a likelihood of victory.

    Post-War Reflections:

    • After the war, Russell did not go back to Italy until 1949 because Mussolini sent word that any Italian who spoke to him should be assassinated.
    • He was invited to give the Lowell lectures in Boston during the spring of 1914, and concurrently to act as temporary professor of philosophy at Harvard.
    • He traveled straight from New York to Boston and was made to feel at home in the train by the fact that his two neighbors were reading detective stories.

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog

  • Backend Full Course NodeJS ExpressJS PostgreSQL Prisma & Docker Full Stack Backend and Database Development

    Backend Full Course NodeJS ExpressJS PostgreSQL Prisma & Docker Full Stack Backend and Database Development

    The text is from a programming tutorial focused on building a backend application with Node.js and related technologies. It guides the learner through creating server endpoints, handling HTTP requests (GET, POST, PUT, DELETE), and managing authentication. The tutorial covers setting up a database (SQLite and PostgreSQL), using an ORM (Prisma), and containerizing the application with Docker. Emphasis is put on building a full-stack application, managing user data, and securing endpoints using middleware and JSON Web Tokens (JWT). The process begins with a simple server and scales up to a production-ready application. Specific tasks include creating REST APIs, interacting with databases, and deploying the application in isolated environments.

    Back-End Server Study Guide

    Quiz

    Answer each question in 2-3 sentences.

    1. What is a callback function in the context of the listen function for a server?
    2. Why is it important to kill the execution of the server during development?
    3. Explain the purpose of npm run dev in the context of the source material.
    4. What is a developer dependency, and how is it installed using npm?
    5. What is the significance of “localhost:8383” (or a similar address) in server development?
    6. Explain the difference between HTTP verbs (e.g., GET, POST, PUT, DELETE) and routes/paths in server requests.
    7. Explain the difference between the “require” syntax and “import” syntax used for adding a javascript package.
    8. What is an environment variable and why is it useful in server configuration?
    9. What is an ORM and why is it useful?
    10. What is a Docker container and what is it used for?

    Quiz Answer Key

    1. A callback function is a function passed as an argument to another function (in this case, listen), to be executed after the first function has completed its operation. In the context of the server’s listen function, the callback is executed once the server is up and running, usually to log a message indicating that the server has started.
    2. Killing the server execution during development is important to reflect changes made to the server files. Without restarting the server, the changes won’t be implemented, and debugging becomes difficult.
    3. npm run dev is a command defined in the package.json file to start the server using a script, often involving tools like Nodemon. This automates the server startup process and can include additional commands beyond just running the server file.
    4. A developer dependency is a package needed only during development, not in production. It is installed using npm install –save-dev <package_name>, which adds the package to the devDependencies section of package.json.
    5. “localhost:8383” is the address (URL) used to access the server running on the local machine. localhost refers to the local machine’s IP address, and 8383 specifies the port number the server is listening on for incoming requests.
    6. HTTP verbs define the action the client wants to perform (e.g., GET to retrieve data, POST to send data to create a resource, PUT to update a resource, DELETE to remove a resource). Routes/paths are the specific locations (URLs) on the server where these actions are directed (e.g., /, /dashboard, /api/items).
    7. The “require” syntax is the older syntax for adding a javascript package, where you could write, const express = require(‘express’). The import syntax is more modern and you can write import express from ‘express’.
    8. An environment variable is a key-value pair stored outside the application code, often in an .env file or system settings, used to configure the application’s behavior. They’re useful for storing sensitive information (like API keys or database passwords) and for configuring different environments (development, production).
    9. An ORM is an Object Relational Mapper, a tool that allows developers to interact with a database using an object-oriented paradigm. It simplifies database interactions by mapping database tables to objects, reducing the need to write raw SQL queries.
    10. A Docker container is a lightweight, standalone, executable package that includes everything needed to run a piece of software, including the code, runtime, system tools, system libraries, and settings. It ensures consistency and portability across different environments.

    Essay Questions

    1. Discuss the evolution of server setup throughout the source material. Compare and contrast using node server.js, using npm scripts, and using Nodemon. What are the advantages and disadvantages of each approach?
    2. Explain the role and implementation of middleware in the context of authenticating users for specific routes (e.g., to-do routes). How does the middleware intercept and process incoming requests, and what actions does it take based on the request’s authentication status?
    3. Describe the process of setting up and interacting with a database using SQL queries. Explain the purpose of each table, the columns within each table, and the relationships between the tables.
    4. Explain the process of containerization with Docker. Be sure to discuss Dockerfiles and Docker Compose, and describe the benefits of using Docker containers for application development and deployment.
    5. Discuss the importance of security in back-end development as illustrated in the source material. Describe the techniques used to protect user passwords and to authorize users to access certain data.

    Glossary of Key Terms

    • Port: A virtual communication endpoint on a device that allows different applications or services to send and receive data over a network.
    • Callback Function: A function passed as an argument to another function, to be executed after the first function has completed its operation.
    • npm (Node Package Manager): A package manager for Node.js that allows developers to easily install, manage, and share JavaScript packages and libraries.
    • Script (package.json): A set of commands defined in the package.json file that can be executed using npm run <script_name>.
    • Developer Dependency: A package required only during development, not in production, and specified using the –save-dev flag during installation.
    • Localhost: The standard hostname given to the address of the local computer.
    • URL (Uniform Resource Locator): A reference to a web resource that specifies its location on a computer network and a mechanism for retrieving it.
    • IP Address: A numerical label assigned to each device connected to a computer network that uses the Internet Protocol for communication.
    • HTTP Verb: A method used to indicate the desired action to be performed on a resource (e.g., GET, POST, PUT, DELETE).
    • Route/Path: A specific location (URL) on the server that corresponds to a particular resource or function.
    • Endpoint: A specific URL on the server that represents a particular function or resource, and that listens for incoming network requests.
    • REST (Representational State Transfer): An architectural style for designing networked applications, based on transferring representations of resources.
    • API (Application Programming Interface): A set of rules and specifications that software programs can follow to communicate with each other.
    • JSON (JavaScript Object Notation): A lightweight data-interchange format that is easy for humans to read and write, and easy for machines to parse and generate.
    • Environment Variable: A variable whose value is set outside the application code, often in an .env file or system settings, used to configure the application’s behavior.
    • Middleware: Functions that intercept and process incoming requests before they reach the final route handler.
    • bcrypt: A password-hashing function that is used to securely store passwords.
    • JWT (JSON Web Token): A compact, URL-safe means of representing claims to be transferred between two parties.
    • ORM (Object-Relational Mapping): A technique that lets you query and manipulate data from a database using an object-oriented paradigm.
    • Docker Container: A lightweight, standalone, executable package that includes everything needed to run a piece of software, including the code, runtime, system tools, system libraries, and settings.
    • Docker file: A text document that contains all the commands a user could call on the command line to assemble an image.
    • Docker Compose: A tool for defining and running multi-container Docker applications.
    • SQL (Structured Query Language): A standard language for accessing and manipulating databases.

    Backend Development: Node, Express, PostgreSQL, and Docker

    Okay, here’s a detailed briefing document summarizing the main themes and important ideas from the provided source.

    Briefing Document: Backend Development Concepts and Project Setup

    Overview:

    This document summarizes key concepts and steps involved in setting up and developing a backend application, using Node.js, Express, and transitioning from SQLite to PostgreSQL with Prisma. It covers topics such as server initialization, routing, middleware, database management, authentication, and containerization using Docker. The main focus is on creating a to-do list application with authentication and data persistence.

    Key Themes and Ideas:

    1. Server Initialization and Basic Routing (Chapter 2 & 3):
    • The initial setup involves creating a Node.js server using Express.js to listen for incoming network requests on a specified port.
    • A simple server can be created with minimal code:
    • const port = 8383;
    • app.listen(port, () => {
    • console.log(`Server has started on Port ${port}`);
    • });
    • The use of npm scripts (defined in package.json) to manage server startup and development processes.
    • Use of nodemon to automatically restart the server upon file changes during development, improving the development workflow.
    • npm install nodemon –save-dev
    • Adjust the scripts in package.json to use nodemon server.js
    1. Handling HTTP Requests and Responses:
    • Servers need to be configured to interpret incoming requests, including HTTP verbs (GET, POST, PUT, DELETE) and routes/paths (e.g., /, /dashboard).
    • The server uses a callback function for each route to handle the request and send an appropriate response.
    • A 404 error indicates that the server could not find a route that matches the requested URL.
    • The server can send back files, such as index.html, to serve a website to the client.
    1. Project Structure and Modularization:
    • Organizing the project into folders like routes, middleware, public, and src for better code management and separation of concerns.
    • The public folder contains static assets (CSS, HTML, JavaScript) that are served to the client.
    • The routes folder contains separate files for handling different types of routes (e.g., API routes, website routes).
    • Using middleware for handling authentication and other request processing tasks.
    1. Modular Syntax and Package Management:
    • Adopting the newer JavaScript import syntax (import Express from ‘express’) instead of the older require syntax.
    • Configuring the package.json file with type: “module” to enable the new syntax.
    1. Database Management (Chapter 3):
    • Using SQLite as a simple SQL database for storing user data and to-do items.
    • Creating database tables (e.g., users, to-dos) with specific columns and data types using SQL commands.
    • SQL databases use “tables” like sheets for managing different data.
    • Example: CREATE TABLE users (id INTEGER, username TEXT UNIQUE, password TEXT)
    • The primary key is used to enable communication between tables (e.g., associating a to-do item with a user).
    • Using database.execute() to execute SQL commands.
    1. Authentication and Security (Chapter 3):
    • Implementing user registration and login functionality.
    • Encrypting passwords using bcrypt to protect user data.
    • “bcrypt has all the code for encrypting the passwords and creating a truly secure application”
    • Generating JSON Web Tokens (JWT) for user authentication.
    • Using middleware to verify JWTs and protect routes that require authentication.
    1. Client-Side Emulation and Testing:
    • Using a REST client (e.g., VS Code extension) to emulate browser network requests and test backend endpoints.
    • Defining different emulations for various functionalities, such as registering a user, logging in, creating to-dos, etc.
    1. Transitioning to PostgreSQL and Prisma (Chapter 4):
    • Upgrading from SQLite to PostgreSQL for better scalability and reliability in a production environment.
    • Using Prisma as an Object-Relational Mapper (ORM) to interact with the PostgreSQL database as if it were a JavaScript entity.
    • “Prisma as an ORM to interact with our PostgreSQL database as if it were a JavaScript entity”
    • Prisma simplifies database interactions and provides additional advantages.
    1. Dockerization (Chapter 4):
    • Containerizing the backend application using Docker for easy deployment and portability.
    • Using a Dockerfile to define the steps for building a Docker image for the Node.js application.
    • “Docker file is essentially an instruction sheet for creating one Docker container”
    • Using docker-compose.yml to orchestrate multiple Docker containers (e.g., the Node.js server and the PostgreSQL database).
    • Defining environment variables and port mappings in the docker-compose.yml file.
    • Using volumes to persist data and configuration settings across container restarts.
    1. Prisma Schema and Migrations (Chapter 4):
    • Defining the database schema using Prisma’s schema language (e.g., schema.prisma).
    • Using Prisma migrations to manage changes to the database schema over time.

    Code Snippets and Examples:

    • Creating a table in SQLite:database.execute(`
    • CREATE TABLE users (
    • id INTEGER PRIMARY KEY AUTOINCREMENT,
    • username TEXT UNIQUE,
    • password TEXT
    • )
    • `);
    • Encrypting a password with bcrypt:const hashedPassword = bcrypt.hashSync(password, 8);
    • Signing a JWT:const token = jwt.sign({ id: result.lastInsertRowID }, process.env.JWT_SECRET, {expiresIn: ’24h’});
    • Verifying a JWT in middleware:jwt.verify(token, process.env.JWT_SECRET, (err, decoded) => {
    • // …
    • });
    • Docker Compose:version: “3”
    • services:
    • app:
    • build: .
    • container_name: todo-app
    • environment:
    • DATABASE_URL: “postgresql://postgres:password@database:5432/todos?schema=public”
    • JWT_SECRET: “your_jwt_secret_here”
    • NODE_ENV: development
    • PORT: 5003
    • ports:
    • – “5003:5003”
    • depends_on:
    • – database
    • volumes:
    • – .:/app
    • database:
    • image: postgres:13-alpine
    • container_name: postgres-db
    • environment:
    • POSTGRES_USER: postgres
    • POSTGRES_PASSWORD: password
    • POSTGRES_DB: todos
    • ports:
    • – “5432:5432”
    • volumes:
    • – db_data:/var/lib/postgresql/data
    • volumes:
    • db_data:

    Conclusion:

    The source material covers a comprehensive guide to backend development, starting from basic server setup to advanced concepts like database management, security, and containerization. The progression from SQLite to PostgreSQL with Prisma, and the introduction of Docker, represents a significant shift towards production-ready backend applications. The key takeaway is the importance of structuring code, managing dependencies, and implementing security measures to build robust and scalable backend systems.

    Server-Side Development: Key Concepts and Practices

    ### 1. What is a port in the context of server development, and why is it important to define one?

    A port is a virtual communication endpoint on a device that allows different applications to listen for and receive network requests. Defining a port is crucial because it tells the server exactly where to listen for incoming connections. Common port numbers include 3000 and 8000, but any four-digit number can be used. Without a defined port, the server wouldn’t know where to “listen” for requests, and clients wouldn’t be able to connect to it.

    ### 2. What is a callback function in JavaScript, and how is it used in the context of creating a server?

    A callback function is a function that is passed as an argument to another function, to be executed at a later time. In server creation, a callback function is often used with the `listen` function. This callback is executed when the server is successfully started and is listening for incoming requests. It can be used to log a message to the console, indicating that the server is running and on which port.

    ### 3. Why is it beneficial to use `npm` scripts for running a server, and how do they work?

    Using `npm` scripts, defined in the `package.json` file, offers a structured and repeatable way to run server commands. They allow you to define shortcuts for complex commands, making it easier to start, stop, or restart the server. `npm` scripts work by defining a key (e.g., “dev”) in the “scripts” section of `package.json`, and assigning a command string to that key (e.g., “node server.js”). To run the script, you use the command `npm run [key]`, which executes the associated command.

    ### 4. What is `nodemon`, and why is it used as a developer dependency?

    `nodemon` is a tool that automatically restarts the server whenever changes are made to the code. It’s used as a developer dependency (installed with `npm install –save-dev nodemon`) because it significantly improves the development workflow by eliminating the need to manually restart the server after each code modification. It’s not needed in production because the code shouldn’t be constantly changing.

    ### 5. What is the difference between a URL and an IP address, and how do they relate to a server?

    A URL (Uniform Resource Locator) is a human-readable address that points to a specific resource on the internet, often a server. An IP address (Internet Protocol address) is a numerical label assigned to each device connected to a computer network that uses the Internet Protocol for communication. Every URL is mapped to an IP address, allowing browsers to locate the server. A URL is easier for humans to remember, while the IP address is the actual address used for network communication.

    ### 6. What are HTTP verbs (methods) and routes (paths), and how are they used to handle incoming network requests?

    HTTP verbs (e.g., GET, POST, PUT, DELETE) define the action a client wants to perform on a resource. Routes (or paths) specify the specific location or “endpoint” on the server that the client is trying to access (e.g., “/”, “/dashboard”, “/api/users”). The server is configured to listen for specific HTTP verbs on specific routes. When a request arrives, the server examines the verb and route to determine how to handle the request and what action to perform.

    ### 7. What is an environment variable, and why are they used in server-side applications?

    An environment variable is a key-value pair that stores configuration information outside of the application’s code. They are used to store sensitive information like API keys, database passwords, and other settings that might vary between development, testing, and production environments. Storing these values in environment variables keeps them secure and allows you to change configurations without modifying the application’s code.

    ### 8. Explain the purpose and organization of the file structure created for a more sophisticated backend application (e.g., `src`, `routes`, `middleware`, `db.js`, `public`, `.env`, and `docker-compose.yaml`).

    This structure aims to separate concerns and improve code organization. Here’s a breakdown:

    * **`src`**: Contains the source code of the application.

    * **`routes`**: Holds files that define the different API endpoints and their associated logic (e.g., `auth-routes.js`, `todo-routes.js`).

    * **`middleware`**: Contains functions that intercept incoming requests and perform tasks like authentication or data validation before the request reaches the route handlers (e.g., `authMiddleware.js`).

    * **`db.js`**: Contains the logic for connecting to and interacting with the database. Includes SQL queries.

    * **`public`**: Stores static assets like HTML, CSS, and JavaScript files that make up the front-end of the application. These files are served directly to the client.

    * **`.env`**: Stores environment variables (sensitive configuration information).

    * **`docker-compose.yaml`**: Defines the configuration for running multiple Docker containers together, such as the application server and the database server.

    Routes, Endpoints, and HTTP Verbs: A Server-Side Guide

    Here’s a discussion of routes and endpoints, based on the provided source:

    • A route can be understood as a specific path or destination within a server that is set up to listen for incoming network requests. In the context of a URL, the route is the sub-portion that directs the request to a specific area. For example, in the URL http://www.youtube.com/dashboard, /dashboard would be the route.
    • HTTP verbs, also referred to as methods, describe the action that the network request intends to perform. Common HTTP verbs include:
    • GET: Used to retrieve information
    • POST: Used to create new data
    • PUT: Used to update existing data
    • DELETE: Used to delete data
    • Endpoints are created within the server-side application using HTTP verbs and routes. An endpoint is a specific subregion within the server that is set up to listen to incoming network requests, and to which code can be directed to execute a body of logic and respond appropriately.
    • Together, the route and the HTTP verb create the endpoint. For instance, a GET request to the /home route would define one endpoint, while a POST request to the /api/data route would define another. The method informs the nature of the request, and the route is a further subdirectory.
    • Types of endpoints:Website endpoints: These send back HTML and are accessed when a user enters a URL in a browser.
    • API endpoints: These do not send back a website, but rather facilitate behind-the-scenes operations, like saving data when a user clicks “save”. They might send back data in JSON format.
    • Middleware can be used to handle authentication between the client and server side.
    • Client emulators, such as the REST client in VS Code, can be used to emulate the process of sending out a network request and testing endpoints.
    • Express.js is a framework that can be used to configure servers to listen to and interpret incoming requests.

    Database Interactions: Storage, Management, and Security

    Here’s a discussion of database interaction, as described in the sources:

    • Databases store data from user interactions. When a user performs an action like saving data, a network request is sent to the back end, which may then send another network request to the database to persist the information.
    • Backend applications are set up to listen for incoming network requests and can then interact with a database to store, retrieve, update, or delete data.
    • SQL (Structured Query Language) commands are used to interact with databases. These commands can create tables, define columns, and establish relationships between tables.
    • CRUD (Create, Read, Update, Delete) actions are the four basic operations that can be performed on data in a database. These actions correspond to specific HTTP verbs:
    • Create corresponds to POST
    • Read corresponds to GET
    • Update corresponds to PUT
    • Delete corresponds to DELETE
    • When building an application, it is important to configure endpoints to manage authentication and to ensure data is properly saved to the database when a new user registers.
    • Prepared statements can be used to safely insert values into SQL queries, preventing SQL injection vulnerabilities.
    • ORMs (Object-Relational Mappers) provide an abstraction layer that allows developers to interact with databases using a more object-oriented approach.
    • Migrations provide a way to manage changes to the database schema over time, ensuring that all instances of the database are updated to the most recent version.
    • Docker allows containerization of databases, making it easier to deploy and manage them.
    • Directly logging into the database via the command line provides the ability to modify it directly using SQL queries.

    Authentication and Tokens: A Concise Guide

    Here’s a discussion of authentication and tokens, based on the provided sources:

    • Authentication is the process of verifying the identity of a user or client attempting to access a protected resource.
    • Authentication middleware can intercept network requests to verify the token associated with the request and ensure that the user is authorized to access the requested resource.

    Here are key aspects of authentication and tokens:

    • Encryption: Instead of storing passwords verbatim, they are encrypted using libraries such as bcrypt. Bcrypt generates a hash for a given string, enhancing security. Encryption algorithms are deterministic, meaning that encrypting the same password using the same algorithm will always produce the same encrypted key.
    • Tokens:
    • After successful registration or login, the server generates a unique token (typically a JSON Web Token or JWT) and sends it back to the client.
    • The client then stores this token (e.g., in local storage or cookies) and includes it in the headers of subsequent network requests.
    • JSON Web Tokens (JWT) are a standard for securely transmitting information between parties as a JSON object.
    • JWT Authentication: This involves the use of a JSON token as a secure password associated with a user, used to authenticate them without requiring repeated sign-ups.
    • Token validation:
    • When a network request is made to a protected endpoint, the server’s middleware extracts the token from the request headers.
    • The server then uses a secret key to verify the authenticity and integrity of the token.
    • If the token is valid, the middleware grants access to the requested resource. If the token is invalid or missing, the middleware rejects the request and returns an error response.
    • Token expiration: Tokens typically have an expiration time, after which they are no longer valid and the user must re-authenticate.
    • Environment variables: Secret keys used for token verification (e.g., JWT_SECRET) are stored as environment variables to prevent them from being exposed.
    • Benefits of tokens: They allow a user to authenticate once and then access protected resources for a specified period without needing to re-enter their credentials.

    Client Emulation: Testing Server Endpoints and Network Requests

    Here’s a discussion of client emulations, based on the provided sources:

    • Client emulators are tools that mimic the behavior of a client (such as a web browser) in order to test server endpoints and ensure they are functioning correctly.
    • A client emulator emulates the process of sending out a network request.
    • REST clients, such as the REST client in VS Code, are a commonly used type of client emulator. These tools allow developers to define and send HTTP requests to specific URLs and inspect the responses returned by the server.
    • Client emulators can be used to test different types of endpoints, including website endpoints and API endpoints.
    • When using a client emulator, it is possible to specify various parameters of the network request, such as the HTTP verb (e.g., GET, POST, PUT, DELETE), headers, and body.
    • Client emulations can be created to mirror user actions. This is analogous to running tests in JavaScript or another programming language.
    • To create a client emulation, you would specify the HTTP verb, URL, headers, and any data to be sent in the body of the request.
    • Client emulators can be particularly useful when developing APIs, as they allow developers to test endpoints and verify that they are functioning correctly without having to build a full user interface.
    • By predefining interactions, developers can emulate functionalities as if a user were using an application.
    • The process involves thinking of everything a user could possibly do and then creating those actions programmatically. This ensures that everything is working adequately.
    • Client emulations can be used to test authentication flows, such as user registration and login.
    • The REST client in VS Code uses a .rest file extension to emulate browser network requests.
    • The REST client separates all the template Network requests using triple pound signs (###).
    • The REST client displays the response from the server, including the status code, headers, and body.
    • Content type is a parameter of the network request that can be configured.
    • The authorization token is encoded into the request so that middleware can intercept and interpret it, and consequently authorize the client.

    Server Configuration: Back End, Ports, and Middleware

    Here’s a discussion of server configuration, based on the provided sources:

    • Back end requests: The back end is all of the code that goes on behind the scenes to facilitate user experience of the internet. The back end consists of external devices all around the world that communicate through a network via network requests. These network requests are encoded with all the information that allows communication to happen and for both parties to contribute to someone’s experience of the internet.
    • Listening for requests: A back end application must be set up to listen to incoming Network requests; otherwise, no websites will load. The back end is just hardware running software that is connected to the internet and that listens to incoming requests to its IP address.
    • Full stack interaction: The moment the network request leaves a computer and goes into the network, everything on the other end of that equation is the back end. The full stack is the front end, which is on the client side, and the back end, which happens server side.
    • Ports: To tell an app to listen, one parameter that must be provided is a port, which is a subdirectory within the IP address.
    • Middleware:
    • Middleware is part of configuring a server.
    • It is configuration that is set in between an incoming request and the interpretation of that request.
    • It can be thought of as a protective layer.
    • A common type of middleware is authentication middleware which handles all of the authentication between the client and the server side.
    • File organization:
    • The specs file contains all the specifications for a project.
    • Modern project directories should contain source code, which is all the code that creates an application.
    • The server should be the hub of the application.
    • Node.js:
    • Node.js is a Javascript runtime.
    • With the experimental features available in the later versions of Node.js, the server reboots automatically when changes are saved.
    • Express.js:
    • Express.js is a minimalist web framework for Node.js.
    • It is commonly used to build back end applications.
    • Docker:
    • Docker allows containerization of applications.
    • Docker is an alternative to having software installed on a computer.
    • Environment Variables:
    • Environment variables are a storage of keys and values; the key is the lookup term, and the value is a potentially secret string of characters that needs to be referenced for the configuration of a project.
    • Any top secret information is thrown in the .env file so that it can be avoided from being uploaded, for example, to GitHub.
    • File types:
    • .js is the Javascript file extension.
    • .rest or .http are extensions used for client emulators.
    • .env files are for environment variables.
    • .yaml files are used to indicate instruction sheets.
    • Setting up a server:
    • Setting up a basic server only takes about four lines of code.
    • The code must define a variable called Express and set it equal to the Express package.
    • The code defines the back end application.
    • The code configures the app and tells it to listen to incoming requests.
    Backend Full Course | NodeJS ExpressJS PostgreSQL Prisma & Docker

    The Original Text

    hello there my name is James and welcome to this backend full course where we’re going to go from being complete amateurs at backend development to Absolute Pros building all sorts of amazing backend applications for the internet now I am very excited to have this course live on the channel because I personally know how hard it can be to learn the art of backend development I went through that on my learned code journey and now I’m very excited to have this course available because ever since that experience I’ve wanted to create a course that does three things first it’s super beginner friendly which means that even if you have absolutely no experience with backend development you will be able to complete this course we’ll start from the very beginning from scratch and build up from there the second thing the course does is it teaches you all of the Core Concepts and foundational knowledge you need to know all of the best practices the latest and greatest technologies that you need to know to go off and become these Supreme wizards of backend development so that should be super cool and last but absolutely not least if you get to the end of the course you will be left with a portfolio of projects of the caliber needed to get you hired as a full stack developer backend developer software Dev you name it these projects in your GitHub will knock the socks of prospective employers your family and friends and it will just be loads of fun now the course itself is broken down into four chapters chapter one is a theory lesson where I want you to sit back and open up your brain to the universe as I share with you some theory about how the internet works what the full stack is and consequently what the backend is what we can expect from it and how we can actually go about coding out some backend infrastructure now this is not for you to sit down and memorize everything that I’m saying or take some notes it’s just an exercise and gaining some familiarity with some of the concept that we will then put into practice in the latter three chapters in Chapter 2 project number one we’re going to build a very rudimentary backend application that just demonstrates some of these Core Concepts in code form pretty simple doesn’t look the best but you know it serves a purpose as it allows us to dive into the last two projects which is super cool project number two chapter 3 is a complete complete backend application it’s kind of like a quick start backend application where we develop a very comprehensive backend server and have complimentary database interactions we’ll be using the sqlite database which is a super lightweight FAS to get up and running database very popular maybe not so much for production but if you’re just looking to get your backend application up and running it is a great choice so we’ll learn how we can develop a backend application that serves up a front end uses authentication and has database interactions and then in the last project we’ll take that code base to the next level God te mode we’re going to be using postgress for the database we’re going to be using an OM which is an object relational mapper and that’s going to be Prisma we’ll serve up a front end we’ll handle all of the authentication and database and then at the end we’ll dockerize all of these uh backend services so that we have this containerized application it should just be absolutely Wicked a brilliant code base to have on your GitHub page for once again prospective employeers to have a look at and be like yes this is the person we want to hire to develop our backend infrastructure now as for the prerequisites what do you need to know to complete the course well the list is pretty short all you need to know is some JavaScript if you’re looking for a course to brush up on your JavaScript I’ve got one Linked In the description down below but everything else you need to know in this course will be taught to you so you just need some pretty you know reasonable JavaScript skills and you will be absolutely sweet getting to the end of the course and last but not least what do you do should you get stuck at any point well I’ve got you covered there too first up we have some cheat sheet notes you can just keep these open if you want they’re available for JavaScript and they just cover all the basic JavaScript techniques that you should be aware of Linked In the description down below and as for any questions qus or queries you may have if you head over to the GitHub page for this product project that has all of my code there that you can look at and compare and with all your questions you can hit over to the issues tab click on issues and then just write your question and either myself or someone else will respond and help you understand whatever it is that you may have been struggling with so that should be absolutely Wicked finally if you want to support the channel you can obviously become a channel member and unlock the Discord where I’m super active so you could ask any questions there too at the end of today should be an absolute Wicked course I’ve been so excited to release it I’ve worked on it for ages so proud of this material and I hope you thoroughly enjoy it and with that all said if you do enjoy the course don’t forget to smash the like And subscribe button so that I can continue to feed Doug a healthy diet all right it’s time to dive into chapter one which is a theory lesson about how the internet actually works now as I said in the intro I don’t want you to take any notes I just want you to sit back get comfortable and be a sponge for the information that I am about to share with you you don’t have to memorize all of it I just want you to be familiar with some of the terms and Concepts so that when I refer back to them later in this course you can be like I know exactly what’s going on here and the first concept I’m going to introduce you to is known as the full stack now when you open up your computer load up a browser and come to YouTube essentially the programming that goes into creating that experience is known as the full stack it’s the overarching body where the C the culmination of all of these individual puzzle pieces working together to create that experience it’s kind of like a burger a burger is the end product just like you loading up youtube.com that’s the end product you experience YouTube and a burger has a whole lot of subcomponents that to to create this experience of enjoying a burger and that’s the same with YouTube a programmer or lots of programers have sat down and worked together and creating all of these puzzle pieces that come together to creating your experience of YouTube and as a collective they are referred to as the full stack it’s the full stack of things coming together to create that experience now the full stack can be broken down into two primary components one is known as the front end and the other is known as the back end obviously this is a backend course but to understand what the back end does we also have to understand what the front end is responsible for now the front end kind of is summarized in three Core Concepts one is the user you using YouTube are a user we’ve all got experience being a user the second concept is known as the client the client is the medium through which you interact with the internet in most cases says it’s a browser so Google Chrome would be the client through which you interact with the internet whether that’s entering a URL or clicking a save button on a website that’s all a client side experience because it’s happening on your device it’s on the side of the equation that is associated with the user or the client now the last term that comes together to create the front end is actually the front end itself what is the front end well at the end of the day it’s pretty much just the website so when you load up youtube.com at its core that’s some HTML CSS and Java script that your browser runs to create this visualization of the website and that is referred to as the front end so if the client is the medium through which you interact with the internet the front end is the legitimate interface that you can interact with to have this internet experience and so that is the side of the equation that is the user side it’s on the user’s device and that collectively creates the front end it’s the tangible side of this full stack experience now at the same time when you load up YouTube you’re probably connected to the internet what’s going on in the background to facilitate this experience well the answer is a lot of stuff there is so much magic going on behind the scenes to create your front-end experience that without it you just wouldn’t be able to enjoy the internet and that’s what this course is all about how can we program these systems now the way that I want to describe how the backend works is actually by step by-step explaining to you what happens when you open up your computer load up a browser and enter ww.youtube.com and in a split second get that website displaying on your screen and essentially how it actually works at a very technical level is that when you type in this URL http://www.youtube.com and hit enter your browser sends out what’s known as a network request now we can imagine that your computer doesn’t have every single website on the internet saved on it so that means that when you load these websites your browser is actually having to request these websites from an external network now there’s loads of examples of what a network could be it could be a cellular network for mobile phones it could be a Wi-Fi network or Network there’s lots of different examples of networks but essentially your browser when you type in and hit enter on this URL your browser emits this electromagnetic wave that is encoded with the information that your browser needs access to so it is encoded with the URL the network request has this URL saved into it and then it emits this network request into the network where the URL actually locates a destination now a lot of us might actually think of the internet as being this ethereal thing that exists around us and you know we magically pull the information out of the air when we type in these URLs but actually it’s slightly more complex than that the URL a lot of people might not know this but it actually is an address so when you type in http://www.youtube.com your network request is directed to an address in the network and that address doesn’t actually locate a website it actually locates another device connected to the internet now every device connected to the internet has what’s known as an IP address and that is its metaphysical address in the network so when we encode these addresses into the network request you know the network is set up to navigate and ultimately locate these devices now a URL is just a human friendly address so every device connected to the internet has an IP address not all of them have URLs but when you type in a URL that gets converted to an address via what’s known as the DNS which is the domain naming service the domain is the sub portion of the URL so in the case of http://www.youtube.com the domain is youtube.com that gets converted into an IP address which is a sequence of numbers not very easy to remember which is why we have a URL and that’s where these Network requests get directed to they get directed to these addresses these IP addresses in the network which are corresponding to another device connected to the network now that doesn’t really explain how you end up with a website on your screen but we will get there these devices at these uh IP addresses are set up to listen to incoming Network requests so when we develop servers like we will do in this course we set them up we connect them to the internet and we set them up to listen to incoming Network requests to their IP addresses now when a server receives our Network request which is asking for a website if you’ve entered the YouTube the URL ww.youtube.com it can then decode that Network request see that oh this individual is looking for the HTML code to load up the YouTube homepage and it can then go to its little database where it stored all of this code or read some files that are available on this external device and it can then encode them into a response so these Network requests that these servers receive also have a Return to sender address so they interpret the intent of the network request they do any actions that they need to and then they respond with the appropriate information data or service and that all happens in the split second that it takes to load a website like youtube.com you hit enter on that URL your browser adits the network request it’s encoded with all of this information that describes the intent of that Network request which in the case of entering a url url is to retrieve a website or gain access to a website it hits This Server and then the response is sent back across the network as literally electromagnetic uh waves whether or not it’s through a fiber optic cable or you know the air your browser receives this response that is encoded with all of the appropriate information in the case of a website it’s h2l code it then interprets the HTML code and displays it on your screen and you get the website now that is a full stack interaction where the moment the network request leaves your computer and goes into you know the ethos or the network everything on the other end of that equation is the back end the back end is all of that code that goes on behind the scenes potentially on other devices all around the world to facilitate your experience of the internet if we didn’t have backend applications set up to listen to these incoming Network requests there would be no websites to load it would just be you know you’d send out a network request into Oblivion and you would never get any response back and so that’s what we’re going to be coding in this course today there is so much you can do with backend applications and we’re going to get a really solid understanding of some of the most core operations that you can expect from backend applications now now there’s two more things I just want to throw on this Theory lesson at the very end number one is what the front end actually does I’ve talked about that entering a URL sends out this network request to gain us access to the HTML code to display the website well the website itself is typically just an interface to make sending out these Network requests even easier so when you hit save the same operation occurs to persist that data in a database a network request is sent out saying save this information to the database the server listening for these incoming Network requests receive the network request interprets that oh the user wants to save the data it then might even send out another Network request to the database with that information and that then gets persisted in the database so the backend can be you know a whole lot of separate servers interconnected to facilitate your experience of the internet and the other thing I wanted to point now is that you know nowadays we have a lot of modern solutions for developing backend applications such as Cloud infrastructure or serverless backend infrastructure all of this is just more servers running more code set up to listen to incoming Network request so at the end of the day the full stack is just the front end which is on the client side on your computer where the user interacts with it they get access to a front end and the back end is everything that happens service side where the server is all of these external devices all around the world and they communicate through a network via Network requests which are encoded with all the information that allows that communication to happen and for both parties to contribute to someone’s experience of the internet now I know that’s a lot of information to swallow once again you don’t have to remember all of it just being familiar with the concept of a backend okay it’s this external device that’s connected to the internet that’s list listening to incoming requests sent to its IP address uh a URL is just an IP address that’s written in a human friendly form the client side is everything that happens on the user’s device the client is the browser the front end is the website they are the user just being familiar with these Core Concepts will really help give you a solid foundational understanding of all of the decisions we make throughout this course and just make you that much better of a backend programmer anyway that’s the theory lesson over well done on getting to the end of chapter 1 and with that all said it’s time to get our hands dirty with some code as we dive into our first project and chapter 2 of this backend full course all righty you know what they say with the theory done the fun may now begun that’s a good one for you now it’s time to dive into chapter 2 which is our first practical introduction to backend development and at this point we’re going to quickly introduce some of the technology you will need you probably already have it installed on your device to complete this backend full course there’s three particular installations you will be needing number one is a code editor we’re going to be using a vs code on my device uh the link to download visual studio code is available in the description down below uh you can select your operating system and install it you probably used it before if you know JavaScript but this is obviously what the window looks like when you open it up something a bit resemblant of this you might have a different theme but yes you will need Visual Studio code a place to write all of the JavaScript and build out our backend infrastructure now for the second installation is going to be node.js where we have JavaScript we write JavaScript as JavaScript set of instructions we need what is known as a JavaScript runtime a runtime interprets and executes your instructions and the runtime we’re going to be using is nodejs we are going to install it on our device and the link to download is available in the description down below you can either install it via package manager or pre-built installer once again choose your operating system as for the version I’d recommend the LTS version or current version we’re going to learn how we can middle with the versions later in this course so at the end of the day it’s not that important and the last installation is going to be darker now if you’re unfamiliar with darker essentially what darker does is it allows you to containerize your applications now the reason Docker is brilliant is because the code that we write is often dependent on a particular operating system and what Docker allows you to do is containerize your application and create a set of instructions for this container which is just a virtual environment that can be consistently run across all sorts of uh systems architecture or operating systems and it just means that you don’t run into issues where one person can’t run your code you can someone else can someone else can’t or you go to deploy your code and it’s complicated because you you know are locked into a particular operating system so Docker allows you to wrap your application inside of a virtual environment and then just Define the set of instructions to configure the virtual environment and you’re absolutely sorted brilliant technology looks brilliant on a resume and this will come into play in chapter 4 our final project for this full course once again the link to install Docker is available in the description down below and when you boot it up you should end up with a window that looks a little bit like this now that we have all our installations done it’s time to jump into the code so what I’m going to do to begin is open up a blank window of Visual Studio code so here we have my blank window open and what we’re going to do from within here is Select this open but button and we’re going to come to a folder personally this is where I keep all of my coding projects and in here you’re going to create a new folder called back end- f-c course now I’ve already created my folder so I’m just going to go ahead and select it and once you have created or selected that folder then you just want to select open and that is going to open up that folder inside of Visual Studio code and this is where our projects in this course are going to go now over on the side here we can in fact confirm that we are in the correct folder directory because if we open up our Explorer here it says back in full course and I already have one folder in here it’s the chapter one folder which contains a document that is just the you know written version of the theory lesson so if you want to refer back to that at any point you can find this chapter 1 folder and download it to your project Direct or just refer to in the GitHub Reaper which is linked in the description down below so you just want to open up the chapter one directory and look for the theory. markdown file totally not necessary just if you want to have it there now the first thing we’re going to do in here is create a folder for our chapter 2 project so what I’m going to do is right click and create a new folder equally you can use these action buttons right here and I’m just going to create a folder called chapter to if I hit enter on that that is now created and it is within this folder that we are going to house our very first backend project now when we’re first getting started with a backend project using node.js and JavaScript there’s a number of ways that we can go about configuring this project now typically when you’re developing backend code inside of JavaScript we like to take advantage of what is known as the node package manager ecosystem where essentially what that is is a whole lot of different packages or code libraries that we can very easily gain access to and utilize them inside of our codebase to save us having to do everything from the very beginning super standard practice if you’re developing backend applications inside of nodejs and JavaScript and we’re going to see how to do it but essentially what it requires us to do is initialize our project as a node mpm project now to do that super simple the first thing we’re going to do is open up a terminal instance inside of our Visual Studio code I like to use the control backtick keys to toggle a terminal instance just like that now if you want to know all the key shortcuts that I use inside of vs code to speed up my coding process there is a link to a website that that explains all of them in the description down below including the key command to toggle your terminal you can also open up a terminal instance from the folder options just as equally now with this terminal instance open the first thing we note is that we are in the backend full course project directory inside of our terminal we actually want to be inside of chapter2 so the First Command we’re going to type in here is the CD or change directory command and then we’re just going to specify the folder to which we want to enter which is is the chapter2 folder directory and if I go ahead and hit enter on that we can see that our folder directory inside of our terminal has now been updated to chapter 2 so we’re officially inside of our folder directory and this is where we’re going to initialize our project now to be able to initialize a project the first thing we’re going to have to do is ensure that we have node installed on our device correctly now there’s a very easy way to check whether or not you have node installed on your device all you do is you type the node space- v the- v flag inside of your terminal and hit enter and if you have successfully installed node.js on your device you should get a version popping up right here if you receive any type of error that means your installation is either incomplete or incorrect so that’s just one thing we need to do to confirm that node is installed on our device once again you don’t have to worry about the version of node that you have installed on your computer we’re going to learn how we can mdle with the node version later in this course that will become important in chapters 3 and four so now that we’ve confirmed node is working we should also be able to access npm DV so you can run the npm – V command that will ensure that you have access to node package manager on your computer and then what we can do is run the mpm init command with the dasy flag now this command is going to initialize our nodejs backend project inside of our chapter 2 folder directory so if I go ahead and hit enter there we can see that we get an output in our terminal we get told that we wrote to a particular file the package.json file and we wrote this code to this Json to that file so now if we come over to what was originally an empty chapter 2 folder we can see that we have a file inside called our package.json file now what this file is it’s a Json file so it’s basically just a glorified string object it essentially just qualifies exactly what our project is all about it’s kind of like a Specs or specification file so up here we can see we have some Fields the name of the project the version you can modify the version when you you know publish your project to production we can give our project a description so we could say a simple backend ser Ser underneath that we have a main that’s not really relevant to us we have a scripts field this is going to be very relevant to us we’ll see exactly how later and then we just have some other fields that at the moment aren’t really that important but the morel of the story is that this file is going to contain all the specifications for our project now I’m going to go ahead and create another file inside of this folder so I’m going to select that folder create a new file and this is going to be called server. JavaScript or. JS which is the JavaScript file extension now if I go ahead and click enter on that server.js that’s going to initialize that Javascript file and we can see in here I can open it up type a whole lot of code it’s all sorts of JavaScript uh and this is where for this project we are going to create our server application our backend project now when it comes to creating server side applications or backend applications inside of node.js a common framework or package that is used available within the node package manager ecosystem is called Express now you technically don’t need Express to create a Javascript file and set it up to listen to incoming Network requests and act as a server however it is infinitely easier using a package like Express because Express is designed specifically for that purpose as we can see here it’s a fast unopinionated minimalist web framework for node.js so we’re going to be using that inside of this course it’s incredibly common you’ll find most big uh Enterprise level backend applications built out of nodejs will use express or an equivalent package to basically allow them to build these backend applications so what we need to do now that we have a server is add this package to our project and we can see that you know in this website right here for this package they tell us exactly how we go about doing that and we just run this npm install Express command inside of our terminal so if I come back over to the code specifically the terminal what I’m going to do down here is clear everything out and I’m now going to run npm install which is how we install packages from the npm or node package manager ecosystem and I’m going to install the Express package so if I hit enter on that that’s going to go ahead and download everything we need to utilize that package inside of our project so here we can see it’s added a whole lot of packages in a very short time span and we can also see that a whole bunch of files and folders have been added to our chapter 2 project directory now we obviously had the server.js and we originally had a package.json after we initial ized our nodejs project if we come into the package.json we can see that one thing has changed we now have this dependencies field and within the dependencies field we have the Express package listed this is super important because once again our specs for the project need to specify what code packages our project is dependent on hence we list Express as a dependency we also specify what version we use for our project once again it’s not too important if your version isn’t exactly equal to mine as long as they’re approximately the same the code should all be equivalent now the other reason this dependencies field is important is because if someone else downloads your code base they need to be able to install all the necessary packages to run your code just as equally if you were to publish this to a production environment a live environment for the internet when you configure that environment it needs to know what dependencies to install to get your project up and running now as for where these packages have been downloaded to everything is within the node modules file so if we go ahead and open that up that’s got a whole lot of files and folders in there we will not be touching any of them all of those packages just get thrown into the node modules file or folder and that just sits there and we don’t need to do anything about it we can see we also have this package-lock do Json that’s that’s another complicated file that we’re really just not going to be touching it’s not one that you want to Middle with the folders and files that are important to us are specifically for now this package.json and the server. JavaScript now to initialize a server inside of a node.js or JavaScript file using Express that literally only takes about four lines of code so it’s incredibly easy to get a server up and running that doesn’t mean that the server is complete but you know it’s a good start so that’s exactly what we’re going to do now so to initialize a server using Express the first thing we need to do is Define a variable called Express and set it equal to and what we set it equal to is we require in the Express package so essentially what this line of code does is it requires the Express package and we assign whatever is in that pre exis and code that package to this variable so that we can then use it all throughout our project so that basically Imports Express into our code base right here the next thing we need to do now that we have access to express is to Define our backend application and we do that very simply by defining a variable called app and setting it equal to and we invoke Express as a function so that’s going to create our backend application for us and then the last step is we say app and if you recall from our lesson from our Theory lesson the back end is just Hardware running software that is connected to the internet that listens to incoming requests to its IP address so we have this server app and we the last thing we always do this line goes at the very bottom of our code is we configure it and then we tell it to listen to incoming requests so that’s what this line does now when it comes to telling an app to listen one parameter we need to provide it is known as a port which is basically just a subdirectory within the IP address so the IP address is the address of the device and the port is a location within that device and so what we’re going to do in here is Define a port I’m going to use all uppercase as a variable and I personally like to use 8383 typically it’s a four-digit number some common ones are 3,000 8,000 so on and so forth I just like 8383 those are my lucky numbers so now we have this port what we can do is we can pass in the port as an argument to the lesson and we can say all right Mr server app I now want you to listen to incoming requests to this IP address specifically at this this port now there’s one other argument we can pass to this listen function and that is a call back function so in here I’m going to create this Arrow function and once again just a reminder if your JavaScript needs a bit of brushing up on there is a course Linked In the description down below but anyway in here we have an arrow function this is a callback function to be executed when our server is up and running and all we’re going to do in here is console.log so we’re going to log something to the console it’s going to be a template literal string and it’s just going to say server has started on and then I’m going to use the dollar sign and the curly braces to inject the port variable into this template literal string so with that all done these are the four lines of code that we need to create a server that is officially listening to incoming requests over the Internet so obviously the next thing we need to do is actually run this file now there’s a number of different ways that we could go about doing that one is very simply to tell node to read the server.js file and go ahead and execute that file so if I hit enter on that we can see that right here I get this output server has started on Port 8383 absolutely brilliant our server is up and running and one other thing to note is that we never finished the execution of that file it’s kind of stuck in limbo and that will remain to be the case Cas while our server is indefinitely listening to these incoming requests so here it’s a continued execution of this file or basically we never ended the initial execution of this file it’s still running it’s still listening to these incoming Network requests so that’s pretty neat congratulations you have officially with four lines of code created a server that is technically connected to the internet listening to these incoming requests that is a solid backend application that doesn’t actually do anything but you know it’s a start now what I’m actually going to do at this point is kill the execution of the server and I’m going to do that using the control and C keys so there I type the control and C keys and that killed the execution of the file and I now have access to my terminal once again and can write some additional commands so that’s pretty cool now in this project that’s actually not how we’re going to go about booting up our server we’re going to do everything via npm and Via this package.json basically through the specs file now in here we have a field called scripts and we’re going to go ahead and add a script where the script is just some a set of instructions to basically get our server up and running now the first thing you have to do to add a new script is give it a title or key in this case it’s going to be called Dev and then what I do is I set it equal to a string adding a comma to the end of that field to keep the Json happy and in here I’m going to insert that command so that’s going to be node server.js now that’s just the same command we ran earlier and if you were just running between these two uh boot up instances it wouldn’t really matter which one you did but it’s good to get into the practice of doing it via the package.json and the script methodology because occasionally these scripts these startup commands become a lot more intricate and complicated so now I’ve gone ahead and added this simple Dev script right here I’m going to go ahead and save that package.json file now what we can do to boot up our server is we can tell npm to run that particular script so all we do in here is we type npm we say run the dev script and that’s going to execute that line of code and we can see here it checked what the line of code to be executed was no server.js and it booted up our server once again to the same outcome now that’s still actually not the way that we’re going to go about booting up our server throughout this course and the main reason for that is because if I now come in here and change something about my server let’s say I console.log this is an extra line and save that what’s happened absolutely nothing has happened so to get that that change to be reflected in the server execution I would now have to kill my file and boot it up again and now we can see we get that extra line but if you’re regularly making changes to your files that’s super annoying to have to restart your server every single time so what we’re going to do is once more kill our server uh applica kill our server execution our server application using the contrl c key and we’re going to install one more package called npm install node Monon so n o d m o n now node Monon is going to be what’s known as a developer dependency and what that means is that it’s not something you would use in production it’s specifically for development because that’s when we’re going to be making a mass amount of modifications to our file and needing it to regularly update so we’re going to install it slightly differently we’re going to write mpm install and then use a D- save-dev flag and then name the nodemon package now when I hit enter on this command it’s going to go ahead and stall that package all that code will be added to the node modules but now when I come into the package.json we can see that it actually hasn’t added it to the dependencies field but it has added it to the developer dependencies field and that means that when we publish this code to production it’s not going to worry about installing dependencies that are specifically for quality of life improvements when you’re developing the code now what we can do with nodemon installed as a developer dependency is slightly adjust this script to instead be node Monon instead of just node it’s now nodemon server. JavaScript and then we’re going to go ahead and save the package.json file with all of that done we can now run PM run run the developer script if I hit enter on that we can see once again we’ve got this as an extra line running consoling out we can see that our server has started we can see that it’s a continued execution however now when I come in here and remove this line and save this file we can see that our server was automatically restarted to reflect the changes in the code and that is going to be in definitely more convenient than constantly restarting our server every time we make adjustments to it so that’s absolutely brilliant now we have defined the code to initialize our server and we’ve got it set up to be really easy to work on modify update you know and add all the functionality we need to it to complete this project so now that I’ve created a server that is listening for incoming requests across a network network request does that mean I have a functional server well let’s go ahead and find out now earlier in the theory lesson I mentioned that one way we interact with these servers is via their address so what that means is that right now this server that is connected to the internet has an address that we can send Network requests to so from a technical standpoint the address of This Server connected to the network is Local Host it’s HTTP SL semicolon localhost 8383 this is the address or URL that is mapped to the IP address that locates this server in the network now where this instance is the URL so if we actually just specify the address uh let’s call this one the URL now I said earlier that every URL is mapped to an IP address so the IP equivalent is uh this series of numbers right here so if we were to go to a browser and enter this URL or this IP address both of them locate our server across the local network uh and that would be valid and I think technically that would actually have to be 8383 cuz we have to specify the port within the address of the device so why don’t we actually go ahead and try this and once again if you want to copy these these are available inside of the GitHub code inside of chapter 2 the server.js so if I come over to my browser I should technically be able to copy this URL paste it in here and hit enter but now we can see uh we actually get an error response that says cannot get and what we can do from within a browser is we can rightclick on this browser click inspect and that’s going to open up the Chrome developer tools and we can come over to to the network tab now within the network tab that’s going to let us see and keep in mind that this is the client side all of this is server side all of this is client side even though technically they’re on the same device for the sake of development the client when we hit enter on that URL sends out a network request across the local network for development that reaches this backend code and then it consequently responds and so that’s something that we can track our browser actually doing from within this network tab so if I refresh this whole process and hit enter on this URL we can see right here that a network request was emitted from our client from our browser if we take a look at the headers which are basically the properties or parameters that specify the intent of the network request here we can see we have the request URL which is the address we can see that we have a method which is the verb that describes the action of the request in this case it’s typically to get a website when you enter a URL into the browser it’s typically to get access to a website and so this network request has gone out into the local network it’s found our server has told the server that it wants a website but it’s received a response that says 404 that’s a status code it describes what is actually happening in the response but at the end of the day the summary is not found cannot get that website so the question becomes what’s actually happening here now one of the keys or answers is this little slash right here in addition to the get to be fair and essentially what we have to do is we actually need to configure our server to interpret these incoming requests now some of the things we need to set it up to interpret are known as the uh HTTP verbs and the routes or paths now in this case right up here this slash is known as the path and this get is the HTTP verb as far as a URL goes if I add a slash here that slash is for the home path if I were to have a dashboard that would be to the dashboard path if I were to have an or path you know that’s obviously an or path and then we could have SL or/ dasboard so on and so forth these are all of the routes or paths and they are part of the request URL now that is like once again in addition to how the port works that’s a further subdirectory to which we need to navigate these incoming requests and then we can Define actions at each of these end points these can be referred to as end points there’s specific sub regions within our server that is listening to this incoming or all of these incoming Network requests where we can direct the code to a specific endo and execute a body of logic to respond appropriately now that’s a whole lot of words right there you’ll see how it works very shortly but the other thing we have to throw into the mix here is that obviously we have these routes now where once again by default the uh the route is the slash so we can see that gives us the exact same response the second thing we need to do is configure our routes to listen to through these or interpret these specific verbs which help us further understand the intention of the network request now just before we do that one other thing I want to point out is that if we use this IP instead of the URL that’s going to give us the exact same response and that is because the URL is then converted to an IP address so we can just skip that step and use the IP address directly but obviously that’s going to be a lot harder for a human to remember now I’ve kind of said here the next step is to add in the HTTP verbs and routes but how does that work well essentially what we need to do is configure our app for these verbs and these routes now this is kind of like you know anticipating what a user is going to do so for example I could write app and I could assume that they want to get information when a user comes to my website I can assume that they want to get something you know that’s a pretty standard response and so what we do is we invoke this get method and we configure it now the first argument that gets passed into the get method is the route so in here what I’m going to do is use the slash route because up here we’re saying we cannot get that home default slash route well now I’m going to configure our server to handle incoming get requests to this Home Route and then what we do is we Define some logic to run when we get these incoming Network requests that want to get information at the Home Route and the way that we do that is by providing a second argument to this get method that is an arrow function that has two arguments a request and a response argument now that I have access to the request and the response in here I can Define some code to be executed when our server receives incoming requests to the slome route that I get request if we look at this network request right here we can see that the method is get so just to summarize that the method informs the nature of request and the route is a further subdirectory basically we direct the request to the body of code to respond appropriately and these locations or routes are called end points so technically this is end point number one and it’s the slash route once again this might be a little bit confusing initially but as we do it more and more it will become more apparent and obvious to you so now we have some code in here for the get request the income and get request which are the HTTP verbs to describe the intent of the request a verb is an action word get is to get information and these requests are directed to the slash endpoint within our server so in here what I’m going to do is have a console.log that says yay I hit an end point and I’m just going to go ahead and save that now that noon has restarted our server what I’m then going to do is go ahead and hit enter on this network request again and we can see this time we actually got a different response we didn’t get an error immediately we will soon but we actually got a different response and we can see that that website is still loading but if we come over to our console that was from earlier that’s not from this one we have absolutely nothing inside of our client side console but if we come over to our server side code we can see that we actually executed this console.log which means that that incoming Network request Was Heard by our server and this callback function was executed so in here we can now Define a whole lot of code to handle that incoming uh request now one thing I just want to point out once again is that the app needs to be configured first and then we tell it to listen second so this line of code always needs to be down the very bottom so to summarize what I just said we said that this endpoint has officially received this request and the way that we understand exactly what its intention is is via this request argument right here and the way that we respond is using this response argument so in here what I can do is I can console.log the request. method and the method is the HTTP verb so that’s pretty straightforward that could be kind of fun and to respond what we’re going to do is we’re going to call Rez and we’re going to learn our very first response type you know some responses send back HTML codes some send Json data some send all sorts of stuff some send straight up strings we’re going to send a status code of two 200 now I’ll explain what that means in just a second so if we go ahead and save that our code restarts and now if I refresh this page we no longer get an error here if I look at the network we can see that we get a 300 response right here we can see that we do in fact get something back and we get this little text element right here that just says okay and that is brilliant we officially removed all the errors from our application and we now have a full stack uh interaction where we can send out Network requests from the client they can reach our server that is listening to these incoming Network requests at its IP address specified right here it can interpret them by interpreting the intention or verb of the incoming request and also the route or path or endpoint destination which is in this case is the uh home or slash route now as for what the status codes mean whenever you have a network request we have a bunch of codes response codes or status codes that basically are a shorthand determination of the outcome of that initial request so any response code that is 200 level so 200 to 299 basically you know suggests that it was a successful request in this case we got back in okay Roger cool absolutely sweet 300 level responses are a bit more common just like uh 100 the most common are 200 so you could have 200 2011 203 202 uh then there’s 400 level responses so we saw earlier that we had a 404 that means not found and typically what that means is that there was an error in communication so anything 400 level is kind of an error in communication so 403 means that it’s a fidden request which means that you’re not authorized to do that uh 500 level requ requ ests mean that there was an error on the server side so we received your request and something went wrong so there’s a whole lot of different status codes that we can associate with the response now this line right here doing something with this resz is absolutely critical to Define how your server is meant to respond when an incoming request hits that particular end point or this body of code so for the minute what I’m going to do is just send back a 2011 one status in this case I’m going to go ahead and hit enter on that and we can see that gave us back a created response and that’s because the 201 status code says that you have created something we can see that information right here in that Network request we can see that the headers which specify the intent of the network request were to get some information at this IP address right here we can preview it doesn’t look like anything and we can see the response which is a created status now what happens if I type in/ dashboard well we get a very similar error to before where we cannot get the slash dashboard route there’s nothing there and if we look at this error code we can see that now it’s to the/ dashboard URL or route within our entire URL and we got back a 404 status code which means that we can’t find any so the way that we would have to go about configuring our server to receive incoming requests at this URL is by telling our app to listen to these get requests at this particular route so we can do that very easily just underneath right here we can say app.get we can pass in the/ dashboard route and then we can pass in this call back function receiving the request and response arguments and in here I can say console.log ooh now I hit the slash dashboard endpoint uh and what I can also do is res. send I’m not sending a status in this case I’m just going to send the string high now just before I save that one other thing I want to point out is that here we consoled get and that’s the request. method so that’s an example of how we can interpret these requests we’re listening for these incoming requests we can see what the method is we can do all sorts of stuff to understand what exactly is contained within this incoming request anyway we’ve now defined a second endpoint it’s a get endpoint at the/ dashboard route and it’s going to console some different text and it’s just ultimately going to send back a response which is a string that says hi so if I go ahead and save that our code our server restarts and if I now rehit enter on that URL we can see that we get back the response high so that is super cool we can see that the default status code of this response was 200 so in this case we didn’t specify a status but it just defaulted to 200 cuz it was a successful request all the information is okay this is the URL that we sent it out to the headers are basically the properties or parameters of the network request we got back a 200 level response there’s loads of information in here and we can preview the response that says hi and we can look at it and it’s just a string the other cool thing you can do is you can typically uh see exactly how long it took so up here I believe that would be the length of time taken to complete that uh that response which about 40 milliseconds which is very fast and just like that we have officially defined two endpoints where both of these endpoints use the get HTTP verb and we’ve seen how we can navigate or direct these incoming requests to certain end points or routes and consequently how we can respond back to them using the express Jazz framework so that’s super cool and now the next steps are to tidy up the code that we have here so that it actually resembles a more traditional web server so one of the weird things that’s happening right here is we’re obviously sending out these Network requests where the default method when you send out a network request from the client via the URL bar is of method get now one thing I wanted to clarify up here is that HTTP verbs are the same thing as the method which is just once again the action and together these create the end point which we can think of as literally the end destination of that Network request before it’s handled and consequently responded now these are both technically speaking how we actually create the endpoints within our server side application within our node.js backend and so we create loads of different endpoints to handle all sorts of different incoming Network requests so this is uh in obviously a get endpoint to this particular route uh so on and so forth now obviously in both of these cases you know in this one I send back a status code and that’s cool and all and this one I send back a string but that still doesn’t explain how we end up with a website on our screen you know when I come to a web URL in this case it’s localhost 8383 you can imagine that’s google.com is just for local development I hit enter I send out a network request across the network to the IP address that is associated with my backend server which is this one right here where’s the website well excellent question so what we’re going to do now is learn how we can convert these to actually send back a website now obviously a website is HTML code and so in future we’re going to learn how we can send back a whole HTML file but if we just really want to simplify that whole process at its very core all we’re doing is we’re res so we’re responding and we send back and I’m going to create a string and in here I’m going to create an H1 tag which is HTML we need the corresponding closing tag and this is literally just going to say this this is actually our website uh you know in Brackets HTML code so now what I’m going to do is save that our server is going to restart and now I’m going to refresh the page and this time we actually got back HTML code if I can just move this down somehow where’s the little scrolly bar it’s gone from the world or so it would seem come on you can do it uh if I can just inspect this properly elements everything’s up there but the the moral of the story is that there’s going to be HTML code in there but I can’t get to it silly silly silly anyway the point is that this is HTML code you know I could send back after that uh an input just like that that’s the HTML code for an input and now when I refresh the page we actually get an input sent back and this is how you end up with a website on your screen we go to this URL right here it’s to get URL it’s to get a website and when it rece when it hits this endpoint when our server receives that income and request we recognize that it’s a get request it’s loc you know its destination is this particular route it hits this end point so we know that the user wants a website so we can literally just send them back a website now that’s obviously one manner of communication you know one type of network request is to get websites and this is where we need to really tidy up our server because for example this dashboard route right here if someone’s going to the dashboard route you can imagine they want the homepage for the dashboard they don’t want some silly response that says hi but sometimes we literally need to send out responses that say hi and don’t load websites so what I like to do inside of my endpoints is I like to have some uh you know web it end points just like this and then I like to have a second type of websites which is more for like AP endpoints now the difference between the two of them is that website endpoints are specifically for sending back HTML so these endpoints are for sending back HTML and they typically come when a user enters a URL in a browser like we have been API endpoints are more like what happens when you type in your username and password and you hit submit that sends out the same network request it hits the back end we locate it to an API endpoint uh except these ones obviously don’t send back a website so we kind of you know that’s where the magic happens behind the scenes and we do some different with these and the point I’m trying to make with all of this is that just here these two end points I’m going to move them into this first division of endpoint which is the website endpoint so I might actually just label this this is going to be type one website endpoints and then these are going to be type two and we’ll call these nonvisual uh API end points and so now what I’m going to do is I’m going to change this first one the response so that it’s let’s just call it uh homage so that’s what it sends back as a homepage you can just imagine this is a whole lot more HTML code that is literally the homepage for our website and then in this one right here I’m going to send back the exact same thing except except for a dashboard so in this case instead of it being homepage it’s going to be Dash board uh if I can spell that correctly so here we’re going to send back the dashboard now when I refresh the page this one’s the homepage which is what we’d expect we get the website back now if I go to the dashboard link we locate the dashboard that’s absolutely Wicked for these ones down here I’m going to Define another endpoint this is going to be a get endpoint except this one’s going to be SL API SL dat now in this case we don’t have any data but what if our website you know was for example a job board and our server has all of these job listings that a user needs to get access to while the client is going to send out that Network request to the server and request all of that data all of those job listings and because we’re not sending back you know a website or anything it’s not really going to be a URL that gets entered inside of a browser I like to start them off with this SL API endpoint because it just basically signifies that this one isn’t sending back HTML it’s more of a nonvisual behind the scenes uh network communication request it’s still exactly the same you know request response Arrow function in here we can console.log this one was for data except now what I would do is I would send back I would res do send my data now for the minute I actually don’t have any data so what I’m going to do is just up here I’m going to Define some data going to say uh let data equal an object and that object is going to have uh you know a name that is equal to James so this is our object this is our data so now we have a backend server that serves up some HTML code when people will type in those URLs what we now also have is a means for letting someone get data now the question at this point becomes you know we’ve established that this endpoint here it’s a get request they want to get data but it’s not really a website that they’re getting so it doesn’t make sense that they would type it into a URL or a navbar is to this particular route how do we go about testing this you know what what do we actually do and this is where a tool known as a client emulator becomes extremely helpful so what we’re going to do is we are going to come over to the extensions tab right here and we’re going to look up a uh an extension called the rest client now it’s the top one right up here it’s uh by [Music] haow Ma it’s got over 5 million installations it’s absolutely brilliant and you just want to hit install on that extension and once we have that extension installed within our chapter two folder we’re going to create a new file and this one is going to be called test. rest or also client is uh valid as well but we’re going with rest rest is the extension that we’ll need now what this file allows us to do is emulate the process of sending out a network request so basically emulates a browser or a user for example in here we separate all the uh template Network requests we want to send out using the triple pound and the first one you know I can actually give this one a title this is going to be to test get uh homepage website and now to actually write the code in here what I would do is I’d specify it to get request and I’m sending it to http sl/ Local Host 8383 which is the URL now we can see that if I save that we get this tiny little send request button up here and if I click on that it emulates that whole functionality of us typing in the URL here except instead of displaying the HTML code as a website in the browser we can inspect literally what the response is which is the HTML code we can see that it was a 200 level response we can see it was powered by Express we can see that we get back HTML code and that was a successful request now I can do the exact same for test get SL dashboard uh website and then here I’ll just type uh get HTTP localhost 83 83 we give the address then we give the route which is the dashboard and we have defined this endpoint so that’s all good I had send it sends that request we can see it took 7 milliseconds down here and we now get back that HTML code for the dashboard and obviously these are our website uh endpoints that we’re testing here but I can also do it for a data end point so what I could do is I could use the get method the get verb and go HTTP localhost 8383 apiata which is our most recent endpoint that we’ve added and now let’s say I wanted to test that you know a user comes onto our website we need to ensure that they can fetch all of the job postings for our job board so we run this network request it has that endpoint and now we can see we get back Json data and we also console this one was for data and so this is a client emulator it sent out a network request from this client it hit our server our server directed it to this specific endpoint it interpreted the method of the request

    which is to get information it located it or navigated it to this particular route we ran this console and we responded with the data so that’s super cool it’s a different kind of endpoint it doesn’t really show up a website it’s not something you would typically type in a URL even though technically you could you know in here I can go API SL dat and it will show me the data which is Json but that is not a website so that’s just another endpoint that we have officially configured so now that we have these three end points it’s time we start looking at some of different HTTP verbs or methods now at the end of the day most of these methods all come under the umbrella term of crud actions so if we take a look at the term crud I’m actually going to write it in here crud crud stands for create read update and delete now if we think about you know these are the four actions that basically control all of data modification the read is get that’s obviously associated with a get request so to read information is to get information if we want to create the HTTP verb that’s associated with is called post you post a parcel to someone else and it creates that package in their hand if we wanted to update information we use a put because we put something in the place of something that was already existing we create that place for it with a post request and then we put something there with a put request and a delete request or the delete functionality is literally associated with the delete method so in here what this is actually doing is we have uh the method and the crud action so the method is on the left and the crud action is on the right and I actually got that around the wrong way uh it’s the crud action on the left and the method on the right so that’s pretty cool now what we’re going to do is really take this application to the next level by literally creating something that is going to display our data so in here what I’m going to do is for our homepage which we have right here I’m going to turn this into a template literal string and that’s going to allow us to inject some data so this is our template literal string I’m going to enter it onto a new line which we can do with template literal strings and we know that HTML code needs a body that’s where all the visible part of the website the visible HTML code goes within the body tags within the body opening and closing tags and in here what I’m going to do is create a paragraph opening and closing tag and within that I’m going to use the dollar sign curly braces and I’m going to Json do stringify our data so this is going to inject our data into this template literal string and send it over as the HTML code so now if I refresh this page and hit enter we can see now we get back the Json code for our data and I’m going to actually uh throw an H1 tag above that and that’s just going to be called Data uh and I’m actually going to give this a style tag the body of style tag and that’s going to be equal to background pink color uh blue that’s going to look shocking so now if I save and refresh that we can see we have a website and we have our data so that’s kind of fun we added the style tags we sent back the HTML code someone loaded the website and that’s how every website you ever go to on the internet actually works so that’s super cool now you know let’s get creative how do we actually add modify data all of that kind of stuff well that’s where we get to these funky API endpoints and that’s where we’re also going to use our client emulator so what I’m going to do in here is I’m going to create an endpoint that allows someone to add data so in this case I’m going to use the new method we talked about here which is to post so I’m going to expect that someone’s going to post some data and I’m going to ensure that it is sent to the apiata route because these are all the routes that are responsible for handling that data and and here once again I’m going to define the uh function to handle the incoming request when it hits this endpoint so that’s a request and a response as arguments and in here what I’m going to do if someone is actually sending information instead of just asking for information I actually need to take a second to investigate what they’re actually sending me which we haven’t had to do before now fortunately that’s very easy to do with Express with Express typically when people send information there’s a number of ways they can send information but most commonly it’s as Json it’s formatted as Json and so what I can do here is Define a variable called const new data or new entry and set that equal to I can access the request which is the incoming request and I can look for the body of the request now the body of the request is literally the data associated with that request and typically when you have uh the create and post methods and the update input and occasionally the deletes you can typically expect there to be a b associated with that request instead of just you know a request for information which would be um more related to the read or get request so because in this case someone wants to actually create a user wants to create a user uh let’s say for example when they click a sign up button what would happen is the user clicks the sign up button after after entering their credentials and their browser is wired up to send out a network request to the server to handle that action and that’s what this endpoint is for so let’s actually take a second to go ahead and program that from within our client emulator so here we have our get endpoint we need a new one we’re going to use the triple pound and we’re going to say data end point for adding a user now in this case as we saw inside of the server that is a post request cuz we’re creating a user so we specify the verb and now we still need the URL including the route where that request gets you know directed to when it reaches our server so this obviously locates the server with the domain and the IP address and and then this route locates specifically the endpoint within the server that contains the logic needed to handle this request now since we’re actually posting information we need to Define that information and so what we’re going to do is create a Json object just here you need to have an empty space uh Above This object for formatting and I’m going to have a name and that is going to be um gilgames because why not let’s support gilgames so that is the data formatted as Json that we want to send now when we’re sending data the one last thing we want to do is configure a parameter of the network request which is known as the content type and we just set that to application Json and that means when our server receives this request from our client emulator it knows that it’s looking for Json and it can cons consquently pass that body and you know interpret that Json and gain access to this value so now we have a post end point where we can access the body we currently haven’t defined how we want to respond to it but let’s just go ahead and send this request from our client emulator and you have to remember that this would be equivalent to you know if they had a website where it’s like submit a new account add a user add a to-do their click the button and this is what happens so we send out their request and we see that it’s just waiting indefinitely now there’s a few things to note first up that uh we’re not getting a response and that’s because we haven’t basically told this endpoint how to respond when it receives this request so that’s what we’re going to do next so I just went ahead and cancelled that request and we got nothing back now to respond respond this is where we would send the status code of 201 which if you recall was associated with the created you know outcome a user was created or added so that would be a perfect response to that so let’s go ahead and send that request and this time we get back at 2011 which says that we have created a new user so that’s pretty cool now what we’re also going to do before we respond is we’re actually going to console.log this new entry restart that and I’m going to go ahead and send this request again and we can see that even though we tried to console.log the body of the request the request. body we get undefined and that’s because up here where we configure our server there’s one last thing we have to do so I’m going to create a section in here called middleware and we just have to tell our app to use express. Json and we invoke the Json method now middleware is part of configuring your server and basically it’s configuration that you slam in between uh the incoming request and interpreting that request so request hits your server this is like a protective layer on the like a middle in between these interactions so it’s just before we actually hit these end points and this line just basically configures our server to expect Json data as an incoming request so now that we’ve added this line right here we’re telling our app that it needs to use this express. Json expect to receive this Json data now when I send that request we can see that we actually logged out the new data and we consequently our server responded to the client emulator and we got a successful Response Code now the one last thing I want to do here is actually add the data so what I’m going to do is reformat it and I’m just going to create a users field and it’s going to be an array and that’s just going to have the entry James and to be fair what I could do is uh get rid of a lot of this and just literally make it an array with the name James so now when I refresh the page we can see we get the array back James that’s the data rendered onto the screen now what I want to do is add a user so we come down to this particular endpoint where we handling this data it’s an API endpoint it’s not for rendering a website and we’re just going to say data. push and we’re going to push the uh new user here we can see that we have to access the name parameter within the new entry so we push the new entry. name field because that’s what we want gain access to that’s an object the request. body is an object uh and we access the name key within that object we push it to our data array and then we confirm that we have created that new entry so now when I save that if I refresh the page we get our singular entry inside of this array rendered as our website and now we come over to our client emulator we emulate the process of adding a new user by sending this request gilgames has been added we have confirmed that we have added gilgames and now when I refresh this page we can see that gilgames has been added to this list so that’s absolutely Wicked we can see now that you know we’re really developing a full stack interaction where we have a client where we can actually visually see all of these backend interactions that are just going on by telling our server to listen to these incom requests and defining end points that handle all sorts of different you know behaviors expectations and intentions of these requests now a big server might have a 100 different end points all for different things uh but at its very core all you’re doing is creating different destinations where each route and each verb together create the end point where each endpoint has you know a specific utility so that’s pretty cool you know if we just wanted to take this one last step we could say app. delete uh we specify the route which is/ API endpoint and we Define the arrow function to be executed when we receive an incoming request to this endpoint we give it the request and the response as arguments and in here what we do is we just say uh data. pop we’re going to to pop an entry off the end of that element we could also throw in a console.log just saying we uh we deleted I can’t remember if it pop is off the end I think it’s off the end so we’re going to say we deleted the element off the end of the array and then we just res. send the status code of 2003 for example I’m sure there’s one that’s associated with a successful delete but I’m not not sure what it is in this moment so what we could do now is emulate that request I could literally just copy and paste this code right here actually I’m not even going to do that I’m just going to create a new one so I’m going to use the triple pound and say delete endpoint uh and that’s going to be a delete request to the HTTP SL localhost SL API slata and we have to throw the port in here so that’s 8383 and then very simply I can go ahead and execute this request in this case this request does not need to contain any data because it’s not specific to an entry it just literally gets rid of the entry off the end of the uh array run that request oh and we got a 404 which means that that cannot be found and that’s because I specified the route incorrectly so here you can see that we are looking for a slash data whereas I’ve configured it for/ endpoint cuz I’m a muppet so let’s now change that to API sdata save that file reexecute that request and we get back at 203 which is definitely the wrong Response Code for this action but we have confirmed on our server side that we did in fact hit that end point and now if I refresh the page we actually got rid of both entries let’s go ahead and restart this once more that’s going to refresh everything so now we have James then we’re going to uh add an entry send that request that’s a user created now we have a new user and then we pop a value off the end of data so let’s go emulate that behavior a user clicks delete and a website it’s the same thing we’re just emulating it we delete an entry and then we refresh the page and now it’s worked appropriately so now we only have one entry on the end of our array and just like that you can really start to see our backend application coming along now the one last thing I want to do in here to really just make this feel a little bit more like a website is throw in some anchor tags that have an hre this anchor tag is going to have an hre to the/ dashboard route and that’s going to have the text dashboard and then we have to close that anchor tag and then within the dashboard I want to do the exact same thing uh so this is going to have the H1 we have to put that inside of a body tag so I’m going to change that from a quote to a template literal string throw the dashboard down on a new line put it within the body tag so let’s create the opening and closing body tag into that onto a new line just do some text formatting really quickly throw that up there and then throw in an anchor tag just here that takes a user back to the homepage and that’s just going to say home and just with these extra lines of code so on the homepage I’ve added a link to the dashboard page and on the dashboard page I’ve added a link home if I save that and refresh this page now we get that link I can click it it routes us to the dashboard page we render that HTML code and that’s going to take us home so that is really cool and you could absolutely take this to the moon for example I could add a button in here where the button actually adds a new user and instead of having to emulate that functionality that could actually be done within that HTML code I could also throw in a script in here you know I can throw in a script close the script consequently and that’s going to say console.log this is my script and now you know let’s say up here I throw in a cons cons. log user requested the homepage uh website now if I save this back in server and refresh the page we can see that on the back end we received an incoming request to this endpoint where a user is requesting the homepage website and we can see that we responded with the HTML code which I just cannot find where what if I close that there it is there’s the HTML code and then in the console we executed that script which is so cool because that was just sent back as text but our browser interpreted the HTML it executed the script and Bob’s your uncle which is absolutely Wicked now the one last thing I want to do just to demonstrate a small concept before we jump into the next project which actually looks reasonable obviously this is a whole hodg podge of stuff that doesn’t look very attractive but it demonstrates a lot of important Concepts is I just want to show you an extra feature that we can kind of amend to this response right here and essentially what we can do is we can throw in a custom status code in front of the do send and in here you know we can we can specify the status code that we want so if I did like a 500 or 599 that wouldn’t make any sense but now if I come back to that get request for for that endpoint and send that we can see that we’ve also specified that status code in addition to getting back the data and then just from within here you know I can add a new user I can rquest the data and the data has been updated and it’s super cool because this little object right here has basically been a database for us it’s obviously a very simplified version of what you’d typically find in production level environment but you know for all intents and purposes this has been a great little database that’s storing users we can access new users and all of this can just be scaled to the Moon based off the same Core Concepts so anyway that’s it for chapter 2 this is our very introductory project just to once again demonstrate how we can actually build a functional server and do some of the core things that we talked about inside of our Theory lesson it obviously doesn’t look great but that’s what chapter 3 and chapter 4 are all about they are absolutely brilliant projects they’re super exciting and it’s time that we absolutely dive into them all right we just wrapped up project number one in Chapter 2 of this back in full course it is now time for us to jump into the second project which is Chapter 3 of this back in full course now this is a really cool project we’re going to be using node.js as we did in the first example to develop our web server we’re also going to learn how we can take advantage of what are currently known as experimental features within node one specifically is the sqlite database now if you’re unfamiliar with sqlite basically it’s just a very lightweight SQL database that is very popular very common place and very easy to get up and running with and it’s now built into the latest version of node which we are going to unlock by utilizing the most recent experimental versions of node via something known as node version manager now the backend application itself that we’ll be building in this course is going to take a lot of the beginner Concepts we learned in the first project in Chapter 2 and really just extend upon them by ultimately just building out a more comprehensive backend application we’re also going to learn how we can tidy up our project directory really develop a project directory that is going to set us up for success as our projects become more complex so it should be loads of fun and the first thing on our list is to upgrade our version of node now when you download node you probably selected a version to download and we can check what that version is by typing in the node DV command in our terminal now this is just the exact same terminal we were using inside of uh chapter 2 in our last project and we can see that the node version I’m using in here is version 20.10.19 to set the minimum version we’re going to need to use these tools if these features are available in the standard node.js version then you won’t have to add any of the experimental Flags but we’ll see what that’s all about shortly but yeah should be absolutely loads of fun now to kick off this process the first thing we’re going to do is get NVM node version manager up and running on our device now to do so what I’ve done is I’ve linked node version manager in the description down below and this tells you exactly how to configure it it might look a little bit overwhelming but at its very core all you have to do is come down to the installing and updating section which we can see right here here’s the install and update and when you see this curl command you just need to copy it and paste it into a terminal instance for example the one that we have open right here so I could literally paste that command and that should theoretically install no version manager on our device once it’s installed you can check that it’s installed uh using the NVM DV command so just in here if I type in NVM DV that will tell me what version of node version manager is installed on our device and if that command’s not working for you then there’s been an issue in your installation and all of that should be covered in this guide how to basically get it up and running once again this is linked in the description down below and if you have any challenges or the documentation isn’t very clear you’re welcome to once again post them as an issue on the GitHub repo for this back and full course which is also linked in the description down below and either myself or someone else can help you overcome that hurdle lastly chat GPT can actually be pretty good at helping you get these installations up and running as well but ultimately should just be copy and PAs in this one line and then you should be able to just type the NVM – V command and that should all be good to go now once nbm is installed on your device it’s so incredibly easy to use here are some examples of the commands basically all you have to do is type NVM install or NVM use that particular node version so in the case of our project the node version we’re wanting to use is is I believe it’s 22 but once again you can use 22 or later so what we’re going to do here is we’re going to type envm install and then we’re just going to type in the version that we want to install which is going to be 22 once again if you have a version that is more recent which would result in it being a higher number then you really don’t have to worry about this part this is just if we have an older version or if you’re watching this tutorial at the time of release so if we hit install that’s going to go ahead and download the version of node that we will need for this tutorial essentially to take advantage of some of the more experimental features so now that that is installed on our computer we can just type NVM use 202 and that’s going to set us up to use that version of node now the next thing we have to do obviously we can close chapter 2 and we’re going to create a new folder which is just going to be chapter3 it’s our chapter 3 project now in here what we’re going to do is absolutely nothing cuz it’s all going to begin within our terminal I’m just going to type the clear command that’s going to clear out the terminal currently we are in the chapter 2 directory from the previous project we need to change that so first what we need to do is go up a directory level so from the chapter 2 directory to to the backend full course level and we can do that by typing the change directory or CD command and then using the double period that’s going to jump us up a level so now you can see we’re within the backend full course directory and from within here we can change directory into the chapter 3 folder and now we can go ahead and initialize our node backend project now if you remember from chapter 2 the way that we did that is by typing npm in it and then the dashy flag and if we hit enter on that command that once again creates this magical package.json file which is the specs file for our project so if I click on that once again we have a pretty rudimentary package.json file and that’s how we initialize a backend project using node.js that allows us to leverage the npm node package manager ecosystem as we learned a little bit about in Chapter 2 so now that we have this package.json file the next thing we’re going to do is actually set up our folder directories and just finish the configuration of our project so that we can then get our hands dirty with all of the code now the first thing I’m going to do is within the chapter 3 folder directory I’m going to create a file called server. javascript. JS the JavaScript file extension and hit enter on that just like we did in Chapter 2 very similar and now that we have these two files we can go ahead and set up some folder directories that are going to be used for their own sub files which is just going to help keep our code a whole lot cleaner than the example in Chapter 2 so the folders we’re going to need for this project number one is called middleware now if you remember inside of chapter 2 we only had a very brief look at middleware right here middleware exists in a lot of different shapes and forms we’re going to have some middle Weare in this chapter 3 project that is all about authenticating a user so that’s going to be super cool so we need a folder where inside that folder we can keep the files that maintain that code now the second folder we’re going to need inside of chapter 3 is called routes now obviously once again in chapter 2 we had all of our routes set up inside this file but you can see how it’s already getting kind of long and that’s not really ideal if you want to have you know a best practice implementation of a server side application so in this case what we’re going to do is we’re going to move all of our logic that sets up how all the different endpoints work into this folder called routes now after that we need another file and this one is going to be a new file in the chapter 3 directory Factory that’s called db. JS now this file is going to have all of the logic for the database which is going to be sqlite which is just a SQL database where SQL stands for structured query language and it has to be the most popular type of database used globally there’s a lot of different examples of SQL databases in this case we’re using SQL light another common one is my SQL and then there’s also postgress SQL which will be using in chapter 4 that’s going to be loads of fun but essentially this db. Javascript file is where we’re going to have all the code to configure our extra special database and then we’re going to need another folder this folder is going to be called public we’ll learn all about what the public folder is for very shortly uh and then last but not least we’re going to need two more files one is going to be another rest file which is used for for emulating the browser so that’s going to be called Todo app. rest just like that we’ll create that file and then lucky last is going to be one called Dov now EnV files are for environment variables and if you’re unfamiliar with what an environment variable is it’s essentially just a storage of keys and values the key is the lookup term and the value is potentially a secret uh string of characters that needs to be referenced for the configuration of our project so any top secret information is going to be thrown in this EnV file and that way we can avoid uploading it for example to GitHub it can stay local on our device and that means we don’t end up in a situation where we accidentally share all of our secret passwords with the world now we’ll learn more about how the EMV file Works shortly but first we’re just going to finish up with the configuration of the files for this chapter another folder that we need to create is going to be called Source now SRC stands for source and I actually made a little oopsy here all of this code that we have created so far with exception to the EnV file and the Todo app. rest needs to go inside of the source code so our server. j s is going to go into the source I’m going to drag that in there the database. JS goes into the source and then the middleware folder goes into the source and so does the routes folder so if we close the source directory we should only have the package.json the EnV and the rest file in addition to the public folder as direct Children of the chapter 3 directory within the Source folder we have this middleware folder we have a routes folder and we have the database. JS and the server.js now once again just a reminder that at any point you can compare your code to mine via the GitHub repo the link to the GitHub repo is available in the description down below and if you do go and check it out if you could star the repo love that support that would be super appreciated now we’re almost done setting up the folder directories the last things we have to do is create the files to go Within These folders now within the middleware we just have one file called orth aut middleware do JavaScript this is where we’re going to write all the JavaScript to handle the middleware and for the routes as you learned in the previous tutorial in Chapter 2 we had two kinds of routes we had API routes and then we had website routes well there can actually be you know a pleora of different types of routes and consequently API endpoints and so we’re just going to subdivide them in this project to orth routes. JavaScript and finally to do routes. JavaScript now in this project we’re going to create a full stack application from a backend application where we serve up a website where the website is essentially an authenticated protected uh to-do application looks absolutely excellent and we’ll see how we can get that up and running shortly but the two types of backend endpoints we’ll need will actually three we’ll need one to serve up our website we’ll need some logic or end points to handle all of the authentication and then we’ll need some logic or endpoints to handle all of the crud operations where we’re creating reading updating and deleting different to-dos in our to-do list and so that’s what these files are for we’re just going to have all of our auth indication routes in here and all of our to-do routes in here and with all of those files now created that is our project configuration 99% complete we’re not going to do the 1% just yet but essentially just to summarize what we’ve done within chapter 3 we have created two folders one is called public and one is called Source we also initialize this package.json via the terminal using the npm in n-y command that is the project specification for our chapter 3 project and we also defined a file a rest file which takes advantage of the uh rest client vs code extension and this file is going to be where we can emulate some browser network requests emulate the client so that’s going to be handy for when we test our endpoints later now for now the public directory is empty we’ll change that shortly and within the source directory the source code if you’re familiar with that expression the source code is basically all the code that creates our application and that is within the source folder now directly within that folder we have two files two JavaScript files one is the server that’s going to be the Hub of our application and the database. JS is going to be for all the database configuration logic and then finally we have two folders one is called middleware we’re going to learn all about the middleware very shortly but it’s essentially what it’s going to do is just handle all of the authentication between the client and the server side between the front end and the back end and that logic is going to go within the or middleware do JavaScript and finally we have two routes folders which is just going to separate all the logic out for the different types of API endpoints that we will need for this application now don’t stress if that’s a lot of information as we work with these files will all become very comfortable and familiar to you especially as we jump into chapter 4 it’s just the same thing but slightly more advanced once again but do make sure that you have configured this folder directory properly because that will be important for linking between different files as we code out this project so now that we’ve set up all the folder directories the next thing we need is to add in all of the npm packages we’re going to need for this project now the list is not actually that great but it’s not as small as it was in the first example for this project we’re going to need to install the Express package so we’re going to use the npm install Express command once again however in this case we have a couple of other packages that we want to throw in so we’re going to type them all in the one line so here we have mpm install Express that’s package number one the next one we’re going to need is called bcrypt JS just like that and that’s the library that is responsible for encrypting data specifically usernames and passwords and that is super important when you’re developing your own authentication system now in this project we use an authentication system known as JWT authentication or Json web token authentic a same thing as we get to that I’m going to explain very explicitly how that system works to create a very secure authentication system and consequently full stack application but for now all you need to know is that we will be needing a package called bt. JS specifically because we don’t want to have to write all the code to create these encryption algorithms now the last package we’ll need is called Json web token and that is once again just another package to facilitate our authentication system so now that I’ve typed out those three packages I’m going to hit enter and npm is going to add them to our project we can see that was super fast and now we have this node modules folder within the chapter 3 directory once again we’re not going to go touching any of those folders because someone else has coded them we’re just going to add them to our project so that we can Leverage that code if you wanted you could check out the documentation for all of these different packages to really understand how they work but I’m going to cover it all in this course anyway but we can also see that these packages have been added to our dependencies list within our project specs which is the package.json file now within here I’m just going to change the description of this uh project this is going to be a full stack uh to-do application that us uses a nodejs backend a seq light database and JWT authentication so that’s pretty cool and now with those packages installed you might remember in chapter 2 we installed a developer dependency called nodon well we actually don’t need that in this project because one of the experimental features that is available in the later versions of node.js is A system that essentially does the same thing automatically reboots our server when we save or create adjustments to the code base so that’s super exciting now with all of that done the last thing we need to do to boot up our application is create a script that defines exactly how npm or node package manager should start up our application now I’m going to call this the dev script and here needs a comma at the end it’s within the scripts field and essentially how this is going to work is it’s going to be node space and then we’re going to use some Flags now the first flag we’re going to need is en mv- file is equal Tov now historically to add EnV files or use environment variables which are the secret protected Keys within a noj application you used to need a package called EnV now within the later versions of node it’s built in and this is how we tell node where to look for our environment variables inside of thatv file the second flag we’re going to need is D- experimental Das strip D types once again this process is specifically if you are only using no node version 22 which is the experimental version of node as node officially releases these features you will not necessarily need these flags you’ll probably need this one but you might not need the experimental flags and then after this experimental flag we need another one this one’s going to be experimental Das seq light as I said earlier if you’re using a more recent version of node at some time in the future sqlite will probably just be built into the official release and then last but not least we need to specify the file that we want node to run and that is going to be dot slash because we have to enter the source directory and then we want it to boot up the server. Javascript file now I actually forgot one flag in here the last flag is to uh tell it to emulate the feature that nodemon used to do which is automatically restarting everything and that’s just a d-at flag now with all of these flags and this whole command set up Suddenly you can really understand why it’s beneficial to have these scripts because instead of writing that out technically I could write this out in the terminal every single time but now if I save that package.json I can just run that script using this simple command every single time and you know we can just have the magic happen so technically we can go ahead and run that command and that is absolutely completed and our application is up and running but there’s nothing currently to run so I’m going to go ahead and kill that uh that’s pretty easy uh and now we can get to the actual code so the file we’re going to start off with is our server. JavaScript and if you recall in our previous project it only took about four lines to get us up and running with a server and the very first thing we did was we imported the Express package so in this case what I’m going to do is type const Express is equal to and then I’m going to use the require command and I’m going to require the Express package and that’s going to bring it into our application or at least that’s what we would have done in Chapter 2 in this project we’re going to look at a different way of importing files and folders into our uh basically application this is a more modern synx so in the newer versions of node it’s now best practice to instead of using the old require syntax instead we just use almost a more uh logical syntax where we just import Express from the Express package so this is a slightly different syntax and this is actually one of the criticisms of node.js is that they jump between these different importing syntaxes and it’s a whole can of worms that I’m not really going to open up right here but the moral of the story is that for this project we’re going to use this slightly different importing sent TX it’s my personal preference I think it’s much easier to work with but to configure our uh node.js application to work with this new syntax we need to come to the spec file and make sure that there’s a little line inside of the spec file that basically configures our application to use this new syntax so underneath the main line right here within our package.json I’m just going to add a field that’s called type and the value associated with that is going to be called module now you might have noticed just there that there were two options actually come up one was called module and one was called commonjs commonjs is for the uh previous syntax that we were using that’s the default value if you want to use the modular syntax then you just have to specify this line right here if we save that then we’re all good to use that newer uh import syntax throughout our project so now that we’ve imported that package we can go ahead and do what we always did which was Define our app and invoke Express that’s pretty straightforward we’re going to need a port so I can define a port just here const Port is equal to and in this one I’m going to use 5,000 we’ll do something a little bit differently and the other thing I’m going to do just here is actually set the 5,000 as a backup because what I want to do is provide a value from the environment variables instead if it exists now when we Define variables Within These EnV files and we’ll see how we can do that later we can read them into our application by typing in process.env and then accessing the name of the environment variable which will be port and once again we’ll see how we can configure that shortly but essentially what this slightly improved syntax does is it just checks if there’s a port environment variable if there is we use that if there isn’t then we default to 5,000 and then lucky last we just tell our app to listen app. listen at the port and then we pass in a function to be executed if our uh server boots up adequately and so that’ll just be an arrow function where we can console.log a template literal string that just says server has started on Port semicolon dollar sign curly braces and we can inject the port just there so that’s our few lines to create the boiler plate code needed to build our entire server now with that done I can save that file and run the npm Run Dev commands and that is going to fail to boot up our application and that’s because that Port is already in use so I must have something running on my device at that Port so I’m going to change it to 5003 I would expect that that would have worked for you if it has then that’s totally sweet let me just change that backup port and now we can see that our server has started on Port 503 and we can test that that watch flag is working because if I say console.log hello world just throw that in there and save that file we can see that our server automatically restarts due to that watch flag and we print out hello world and just like that we have done everything we need to set up a more modern project directory it’s going to make it much easier to develop a more sophisticated back-end application we’ve created the code we need to configure the beginnings of our server side application and from here we can really start to flesh out all of the end points and all of the features and functionalities of our backend application starting off with serving a front end now currently as with the previous project we have started up a server on our local network at our local host on Port 50003 so you know let’s expect we can come across to our browser and look at Local Host 5003 or if you’re using Port 5000 it will be Port 5000 and just like with chapter 2 when I hit enter we get a cannot get it’s a 404 which means that the browser sent out the network request to that address and it potentially may have got there but we didn’t handle that incoming request there was no endpoint that had method get at this particular route to receive that request and consequently respond in this case with what we would expect to be a website so that’s going to be step number one is sending back a website we need to send back a front end that a user can interact with to have a full stack experience and that’s going to be our Authentication protected to-do list so that’s step one let’s send back a website now the question becomes where’s the website well I’ve got you covered I built the website in advance so the front end is completely developed all of the logic is available there I’m going to copy it across to this public directory right here within chapter 3 now for you you will need to go over to the GitHub code which is linked in the description down below check out chap three and copy across the files within the public directory and while you’re there if you could star the repo love that support that would be super appreciated so just here I’m going to copy across the three files into this public directory so there’s a fan. CSS an index.html and a styles.css now Fanta CSS is just my little uh child it’s like a design Library so that just Styles everything styles.css is is all of the layout Styles so not so much of the prettiness of the application but the functional layouts and then the index.html is just some HTML code with some scripts and uh JavaScript at the bottom to handle all of the different uh crud actions and all that good stuff so that’s super important if you want you can totally go through and have a look at the code there’s a little bit of it but at the same time there’s not an infinite amount uh I’ve commented out a lot of it so it should be uh pretty self-explained but once again you know this is a backend full course and the front end comes pre-completed so you just need to copy these files across now as for why we’re copying them into a public directory well the public directory is the canonical folder from within which we serve up everything any assets from our uh for our project so in this case we need to serve up a front-end application and consequently here is our front-end applic so now what we have to do is actually take this you know these files and when we get this network request we need to send them all back across to the browser and then the browser receives the CSS sheets and the HTML and loads that website now that should be relatively straightforward and most of that logic is going to go within our server. JavaScript now what I’m going to do is start off by getting rid of that little console log that’s going to re booot our server and the first thing we need to do is Define this endpoint the end destination for this network request that our browser emits when we go to the URL which when you deploy the application could be you know to-do app.com or whatever it might be so as we saw in the previous uh chapter defining that particular endpoint is actually pretty easy so we can see that the method or verb is a get so we type app we access our server app and we use the get method to Define that endpoint the next thing we need to provide as an argument to the get method is obviously the route and here we can see the route is the slash route so we’re just going to have the slash route and then the second argument is the call back function that’s going to be an arrow function and that receives the request and the response as arguments and now I can open that up onto a new line now we saw in the previous chapter how we can send back some code how we can send back status codes all that good stuff if we want to send some files like we do in this instance we want to send back the index.html and all that good stuff we need to res. send file now in here we need a little bit of code to basically determine or locate the files that we need to send so this might be a little bit complicated but we need to take advant vage of a JavaScript module known as path now we need to import path into our project that’s step number one and that’s native to express so we just import path from the module called path now there’s something else we have to import from this uh path module so what we’re going to do is after this path we’re going to throw in a comma and we need to destructure out this particular input so we’re going to use the curly parenthesis just here and the the uh the item we need to import is called directory name or dur name so that also needs to be imported and while we’re up here there’s one other import we need that’s also native to JavaScript so we’re just going to import and this one also needs to be destructured and it’s going to be file URL to path from a module called URL so that is super cool so these are the two Imports we need and they’re going to enable our Javascript server.js file to look for the HTML files and consequently send them back as a response so we’ve got those Imports now the next thing we have to do is just above this endpoint we need to get the file path from the URL of the current module now that’s a slightly confusing sentence but essentially it’s just a configuration line to allow us to navigate the folder directory that we have just here from within our code so we need to Define a variable called underscore uncore file name and that’s going to be equal to and we’re going to call the file URL to path and then as an argument we’re just passing uh import met. URL so that’s going to give us access to the file name and then underneath that we need to get the directory name from the file path and that’s going to basically tell our operating system okay this is the directory where the files can be found so we need a variable just in here called const and this is the double underscore once again and that’s called directory name and that’s equal to and we’re going to invoke dur name which we um imported above and we’re going to pass in the double uncore file name now this is going to come in handy in a number of places and we’ll see uh all about that just shortly but the first thing we have to do is we have to send this file and what path does all of this ultimately comes down to allowing our code to uh locate files and folders on our device or whatever device it’s running on so path allows us to construct the ultimate path to find these files and folders and so in this case this endpoint is for serving up the HTML file from the slash public directory and so this path we’re going to join the underscore directory name which is basically the directory of our project and onto that we’re going to throw the public directory and then we’re going to throw the index.html which is specifically the file name so this code essentially isolates our directory and then what we do is we join together the current directory and I think I’ve got one too many underscores right there there should only be two so I’ll just get rid of one of them two underscores it joins together the directory with the public folder and consequently the file and that’s how our body of code knows to find that file that it can then send back across the network and that’s literally all we need to do to send back the HTML file so if I now go ahead and save that our server restarts and I can refresh this page and loading that website gives us one error and we can see here this is the ultimately resolved path name and we can see that the issue is that it’s looking for the public folder within the source directory so there’s one last line we need to add and this is known as middleware which we kind of saw earlier and it’s just a bit of configuration and so we need to tell it exactly where the public directory is cuz currently it thinks that the public directory is in the same level directory as the server.js but it’s actually one above so we’re going to basically uh add a line that serves the HTML file from thepublic directory and also tells Express to serve all files from the public folder as static files so that’s what I was talking about the assets the static assets SL files now this is important because uh any requests for the CSS files will be resolved to the public directory and we’ll see exactly how that Works in just a second so we need to throw one little line in here it’s just a little bit of middle Weare it’s part of configuration for our app so we just say app.use and in this case we use an Express method so we access Express and we tell it to use the static method and basically this is saying okay where do we serve the static content from well it’s from the public directory so we call Path We join once again to create the ultimate path or the the absolute path for the public directory and we just go from theore directory name again and onto it we add the double dot /pu directory so that basically says okay you can find the public directory but it’s actually not within the source directory it’s one up and that’s why we have the double dot because that’s how we go up a level of folders so if I now go ahead and save that once again and refresh the page we can see now we actually get back the website so this is the endpoint that literally serves back the file and this is a configuration line that basically says you can find the public directory not quite where we are right now but if you go up one level that’s where it’ll be from so the express. static line basically is used to tell our code where to find the public directory and the public directory is what serves up all of our assets so that’s really cool we’ve literally actually sent back a website where in the previous chapter we just sent back some HTML code written as a string now one thing we’ll note if we right click and inspect and then come across to the network tab we can see that when I hit enter a bunch of requests are sent first is the local host and this sends back the HTML file which itself doesn’t have any Styles it’s not a styled file and so here you can see the styled equivalent of this web page and that set back is all this HTML code however at the top we have these links now when there’s a link essentially what happens is the browser goes out and fetches the information at that link so we can see there’s a SL Styles and a slant. CSS sheet so it consequently our browser went out and sent those requests out fetching the CSS files so here we have this styles.css sheet and that’s the URL at when to fetch it from and because of this line our app knows to serve up these files from our public directory and so that’s what it got back it previewed it got back all the CSS and then it could apply it and it did exactly the same for the fan. CSS sheet here and consequently we load a styled application this is the authentication page it’s super responsive looks pretty nice and neat so that’s hunky dory we now have a website a front end being served up from our backend code now the front end is super cool because when we can later authenticate it’s wired up to send out all sorts of network requests for all sorts of different interactions logging in registering a new user create read update and delete different to-dos and all that good stuff and that will just allow our browser to send out all of those Network requests that will reach the different endpoints that will’ll code throughout this tutorial that are going to go in these routes just here but this this code just for sending up this home website is uh definitely some code that we can have within our server.js now the one other line I want to add before we move on to some of these other endpoints or these routes is just one that allows our server to receive Json information when it receives Network requests that have the method of post or put potentially if you recall that’s something we did in Chapter 2 to enable our endpoints to actually interpret that Json data which could be a username it could be a to-do or anything so whenever the client is actually sending information instead of just asking for something via a get request and that’s just one other line of middleware so this is going to be the middle Weare right here this app.use and we’re just going to add one other line of middleware and that’s just going to be called app.use Express Json so that basically just configures our app to expect Json and consequently enables it to pass or interpret that information so we’re just going to throw that in there as well and I’m actually just going to move that up uh directly under the middle Weare line above this other one because this is uh specific to this line here now once again if you’re wondering how I magically mve that line around I’ve got a link to all of the vs code shortcuts that I use in the description down below there’s a website I made that uh basically tells you all about them and just like that we’re almost done with our server.js most of the code that we’re going to write from here is going to be within all these other routes so now that we’ve just about done all the logic for our server the next file I wanted to get started on is the database because we’re going to need our database up and running if we want to do any authentication if we want to have any data storage and we can get started on that by heading into the db.jpg package and the way that we do that is we import and we need to destructure out this particular import and it’s called database sync so it’s a synchronous database and we import that from node uh semicolon from the sqlite package once we have that imported we can go ahead and just like we created our app with Express we create a database by creating a variable called DB and setting it equal to and we have a new synchronous database and in here we pass in a string and that string as you can see just here in the documentation is going to be an inmemory database and so that means we don’t have to manage any external files and so we’re just going to have the semicolon and type memory now this isn’t what you would use for a production database we’ll see how we can configure that in the last project in chapter 4 but if you just want to get up and running with a SQL database then we’re just going to use memory for that and that will be more than adequate so now that we have our database the next thing we need to do is basically set up our database when we boot up our applic a now for that we need to execute uh some SQL statements from strings now the way that SQL works or structured query language is it’s you can almost think about it as an Excel spreadsheet where you have different columns and different tables where a table is kind of like an Excel sheet so you can have different sheets for managing different data now unfortunately when we first create our database none of these sheets exist or none of these tables exist the table is the literal term for it so in this case we’re going to have two different tables where each table is like a sheet one sheet is or one table is going to handle our users and then the other is going to handle all of the to-dos and for every to-do it’s going to associate them with a user now to actually make this happen within the database we write command using the SQL language and using this node package we can get our JavaScript to execute these commands and configure our database so what we’re going to do is we use the database do ex uh execute method that’s going to execute a SQL command and act it upon our uh database now that takes a template literal string as that’s going to allow us to uh write some um strings across different lines now the SQL command to create a table where once again we need two tables where each table manages different data one table is specifically for users and once again you can just think of that as like a tab as a tab as a tabular database like an Excel spreadsheet we need to actually create it so we create a table called users just like that now after we create the table we need to specify some of the different columns in our table so we have these circular parentheses and we’re going to enter the circular parentheses on some new lines now in here we enter the different columns and we specify what kind of data type they’re going to be in addition to some other information so the first field is going to be an ID now the ID is going to be of type integer so that by itself is pretty straightforward after that we’re going to have a comma and then on the next line we’re going to need the username and that’s going to be a text field and it has to be unique so we can throw the unique key onto it and then lastly we have a password field and that’s also going to be of type text so this SQL command right here is going to be executed upon our database and will configure that table so that our database is up and listing and it’s ready to accept the new users where each user has a username and a password and that gets save to the database now the second command we’re going to need is for our second table and that’s going to be for all of our to-dos so we’re going to have a database. execute and we want to execute a SQL command I’m going to open that onto a new line and this one’s going to be very similar we’re going to create another table which once again is just like a sheet and one is going to be called to-dos and then we’re going to have the circular braces where in here we’re going to specify the different columns it’s pretty straightforward the first one is also going to be a unique ID the ID is the best way for referencing different lists or different elements in the uh table and that is once again going to be an integer field then we have a comma for the next column the second one is going to be a user ID now this field is going to also be an integer field but more importantly it’s going to be the field that Associates it to do with a particular user so every user is going to have an ID right here and the user ID field is going to keep track of which user a to-do is for and that’s super important so that when someone authenticates they only receive to-dos that are specific to them now to create this level of community communication between tables we need to essentially uh configure a field to be what’s known as a primary key so what that means is that in this ID field here since I’m saying that the users table needs to be able to be referenced from other tables and we’re going to reference by the ID we need to set this key up to have superpowers and essentially create it as a primary key so we use the fields primary key right there and that’s going to set up this d as like a superpow key that can be referenced within other tables so such as the uh to-do table right here now the last element I just want to add onto this one is called Auto increment and that’s because when we create a user we’re not going to specify an ID we want it to be automatically assigned to the new user and we just want them to Auto increment so our first user is going to have the ID of one the second user is going to have an ID of two now now all of the SQL stuff will become more and more clear to you as we continue to use it and also as we actually look in the database as we create all of these interactions and finally in chapter 4 as we build out a more complex database and literally start interacting with it obviously in both of these applications using the application will save data to the database but there’s like basically uh hacker ways that you can overwrite it and work in the background and we’ll see how all of that works and it should be a really beneficial experience to help you understand exactly what’s going on but anyway the moral of the story what happened there is we have this user ID which Associates a to-do entry with a particular user but to allow that communication method we need to set up this ID to be a primary key so that it has superpowers so that we can reference it from within other tables so now we have this integer field and this one right here does not need to be a primary key because it just refers to this primary key however the ID of the to-do every to-do also has its own unique ID this actually also is going to be set up as a primary key and uh we’ll configure it so that it auto increments after the user ID filled we obviously have the task and that’s just going to be a text type uh and then we have a completed status which is going to be a Boolean yes or no so it’s either complete or it’s not complete and that’s going to track the status of it and that is going to have a default of zero which is going to be false so we’re going to use a numeric value to track the true or false state so zero is false one is going to be true and finally we have a foreign key which is going to be the user ID uh and that is just going to reference the users ID field so that’s obviously quite a few SQL commands once again this is a backend full course we’re definitely uh not giving SQL the attention it deserves the SQL ecosystem is incredibly you know

    sophisticated and it’s you could spend 100 hours looking into it uh and becoming more and more competent with SQL but you know for now we just need to configure the tables and as I said earlier when we start going behind the scenes and modifying uh our SQL databases using SQL commands all of this will become much clearer and apparent to you but for now we just need to get them up and running so if we go ahead and save that that’s the code we need to create our two database tables set them up with some columns and give them the means to communicate and reference one another now the last thing we need to do from this file is export have a default export of DB and this line right here is going to allow us to interact with this database file this database variable from other folders and files such as from within our server such as our or routes our to-do routes and our middleware and as you can see it also allows us to keep a very tidy project directory because you know I’ve got 23 lines to configure an entire database which is super sleek and now I can you know quickly know where to reference that code it’s not all just jammed into one file uh everything is compartmentalized so what our database created it’s time we start setting up some endpoints to manage our authentication which is going to be step one of getting this front end working properly with our database and backend now just before we jump into our orth routs and the next section I noticed there’s one little error I made in this particular file so we’ve configured our database so that when we boot up our application it basically creates these tables where we can then save all of our data however when we reference between the two tables when we assign a user to a to-do or associate a to-do with a particular user we’re referencing the users table however we only named it user so that just needs to be pluralized and we can go and save that and that is now fixed so for the next step let’s actually come back to the application which is now being served up I refresh the page this is what loads and let’s try enter a user I like using test @gmail.com and I just do a password uh it’s a couple of digits and now let’s see what happens when I click submit having this network tab open so I click submit and I get an error showing up and if I look at the network tab we sent out a request to let’s take a look where let’s have a look at the headers section here we have the header it’s got the general information including the URL and this is the end point that we sent this network request out to from the client to the back end there’s a post request which means that it contains a payload and that’s got a username and password and that is specified as Json information and we did configure our server to pass that information with this line here we said to our server expect this Json information however even after having done that you know we got a 404 which basically said there was no response we didn’t hear anything back no idea what happened and that’s because we don’t have an endpoint set up for this particular route and that’s why we got the 404 so what we need to do now if we want to log in a user or let’s say I want to sign up instead let’s go and submit that to the registration endpoint we need to create both of these end points cuz right now we’re getting back 404s because they don’t exist we haven’t made them and they are the endpoints we’re going to be creating within this orth routes. Javascript file now obviously uh it’s super fun to use this interface to be able to send out these Network requests but we’re going to also do the exact same thing from our client emulator which is this to-do app. rest uh but we’ll see how to do that very shortly first let’s actually create the endpoints cuz there’s no point in emulating these Network requests if there’s no code to rece reive them so from within the or routes this is where we’re going to Define all of these endpoints for handling the authentication functionalities now in here we need to do a few things one is we need to import Express from the Express package two we need to import bcrypt from the bcrypt JS package now if you’re called bcrypt has all the code for encrypting the passwords and creating a truly secure application and as we come to this code and implement it I’ll explain a little bit about how the encryption algorithms work we also need to import a package called JWT from Json web token and that’s just going to allow us to create a Json token which is just uh an alpha numeric key that is essentially you know a sec password that we can associate with a user to authenticate them uh when they make future network requests but without needing them to uh sign up again so that’s going to be important and the last thing we need to import is our database and that’s going to be from our data uh our db.jpg import it into another so these are the four Imports we’re going to need the databases obviously because if we’re registering a new user we need to write that new user to the database and if we’re logging them in we need to check the database to see if that user actually exists now one New Concept we’re going to introduce here is how to configure endpoints or routes when you’re not defining them in the original file if we come back to chapter 2 just here I made a whole lot of endpoints in this server JS it was pretty straightforward we just called our app and we configured the endpoint for the method and the route and consequently wrote the logic to respond and obviously that works we’ve already done one example of that with this endpoint right here this home get endpoint that serves up our HTML website however when we’re subdividing uh or compartmentalizing our routes into these sub files we need one extra basically configuration layer and so what we’re going to do is Define something called router it’s a variable called router and that’s equal to express. router now the reason we do this is because what we do from here is we export our default router and then and it needs to be a lowercase R and our main application just in here what we can do within the server.js is we can Divi uh Define a section here called routes and instead of writing out all of our end points we just say app.use and for any authentication routes so any routes that are within this path we just use orth routes now or routes is an import that we need so we come up to the top just here we’re going to import orth routes from / routes slor routes. JavaScript so there’s a few steps there let me just go over them once again obviously we just got an issue right here cannot find this particular module uh rout slor routes we’ll get to that in a second but anyway the moral of the story is that we inside of or routes just here create this router and it’s to this router that we’re going to assign all of these methods so it’s basically like a you know a subordinate app or like a a a subsection of our app where we can create all these methods so for example I’m going to have one method that is a register method right here so it’s a post request to the register endpoint which if we look just here this is the post request to the register endpoint and we’re not throwing in this SL or throut on the front and we’ll see why in a second it’s just to the register endpoint and then just as we have been I can provide a second argument which is the request and the response and that is the function to be executed when some code hits this endpoint now when we export this router and we import it into our server ensuring that I save it and then we use this line right here it basically takes all of the routes that we Define for the or routes and It Slam them on the end of the SL or route so it combines basically the paths or the routes and so that will Define this particular endpoint so we’ve got the SL or and then all of the end points we Define within our or routes will just be these sub routes within that now we’ve got an issue just here that says we cannot find the module uh DB so that is within our orth routes and I think that just needs to be db. JS instead so let’s save that and now that’s working perfectly so what we’re going to do inside of this uh authentication routes section we’re going to create two endpoints so instead of using the app now we’re using the router and that just allows us to subdivide all of our uh endpoints and routes into these nice little files and so we’re going to have a secondary post request and this one is going to for/ login and that’s once again just going to have a call back function that receives the request and the response as an argument and this is going to contain the logic to log in a user when we hit that endpoint now when we save that these two endpoints that we’ve defined in here are added to this router which is exported from this file and then in our server.js we import all of that as orth routes and then we just slam that on the end of any slor request we tell our app to use all of these or routes when we hit end points that contain the/ oror route now the exact same thing is going to happen within the to-do route so we may as well just configure that before we actually get into the nitty-gritty of uh defining all the routes so in this uh to-do routes file we’re also going to import Express from the Express package we’ll also need our database so we’ll uh import the DB from DOD database. JS then we’ll Define the router which allows us to create these uh specific routes Within These sub files so router is equal to express. router and then we can just have a router doget method and this is going to be to get all todos for a user so this one get all todos for logged in user and then we just have the request response set that up that’s going to be an arrow function then we need another one this is going to be to create a new to-do and in here we’re going to have router. poost cuz if we’re creating a new to-do we’re not just asking for information we’re actually sending over what the new to-do is going to be that’s going to be some information entered into the front end that’s sent over the Network as a network request our back end is going to receive that post request uh and that’s going to be to the/ route and then we’re going to have the function to handle that and consequently save it to our database we’ll need one for update a to-do and so this one is going to be a put method which if you recall putting is for when the networker quest wants to put information in the place of an already existing thing so post is for creating and put is for modification so this is going to be to the slash now the route for the put is slightly more complex if you recall within the database here when we create a to-do all of them get an ID now if we’re updating a to-do the way that we do that is we update the to-do with the ID that matches the ID of the one that we’re updating so we check the database we match up the ID and then we make the modification specifically to that task so that means that when we send out this request we actually need to specify the ID now one way we could specify that ID is by posting that as Json but another way is by using a dynamic query parameter essentially what we do is we use the semicolon and then we provide the parameter where you know if I actually created the request I could use IDE of three in the place of this now we’ll see how all of that works when we go and create or uh when we create the emulations of all of these Network requests but for now this is just a dynamic ID which is going to allow us to identify exactly which to do we need to make the modifications to now in here we still have the Callback function or the function to be executed when our Network request hits that endpoint we’ll still have a body of information that is sent over is Json but we just specify the ID just here so that’s nice and neat and then the last one we need is to uh delete a to-do and that’s going to be router. delete and that is also going to be to the ID parameter or ID path and that’s going to allow us to basically say only delete the to-do entry that has this particular ID and that’s also going to have the function now with all of these endpoints done obviously we’ll come back and create the code for each of them later but I’m just going to export default the router that’s going to assign all of these end points to this router entity and then once we’ve exported it we can save this file and come into our server.js and we can import from the/ routes f older and then consequently the to-do routes.js and I’m going to import these as a variable called to-do routs now the name that I use to basically assign all of these Imports to doesn’t have to match the one that was exported from the file so in both of these cases I’ve exported a variable called router but when I’m importing them I’m importing the value and assigning it to this name essentially so to do routes and now what we can do is we can just duplicate this line right here except in this instance it’s going to uh be for routs that are to the to-dos endpoint and ultimately we can see with this configuration it’s just going to allows to have a whole lot more end points but basically uh subdivide them into their own files which is just going to keep any everything a whole lot cleaner uh so that’s super nice and neat and technically we’re actually finished with our server.js file everything from here is just uh filling out all of these routes and their functionalities now that’s actually a lie there’s one last thing we need to do uh and that is add some middle wear to the to-do routes because when we assign them we have to throw in some middleware that authenticates a user before they can actually have access to those end points so we’ll come back and make one little modification to this line later but that should do for the minute now one little error we just need to clean up real quick is uh in the server I noticed that this was not the we meant to assign the to-do routes to this/ too route so now that we have all of this code done we’re ready to go and fill out our endpoints the next thing we’re going to do is actually create the emulations for all of the functionalities now the reason we do this is because obviously when a user uses our complete application they’re going to be able to do all of it from the user interface however while we’re still developing our application it can be useful to basically predefine all of these interactions and we’ll set them up in here and that way we can emulate these functionalities as if a user was using our application and we can ensure that our backend is set up to handle everything now this process is kind of analogous to running tests in JavaScript or any particular particular programming language the testing where testing Works in a similar manner essentially you just basically think of everything a user could possibly do and then you create those actions programmatically and then you can be sure that they are working adequately so the ones that we’re going to start off with are the authentication routes is the first thing the user would come in and do they’re going to have to register user so I think that’s a good one to start off with and so what I’m going to do here is I’m just going to add in some triple pounds inside of our uh too- app. rest and they’re going to separate the different emulations now I actually lied the first one is going to be the get/ endpoint and that’s going to be to check that the website loads so when a user hits this endpoint they are sending out a get request to the HTTP localhost and then our app is on for me it’s uh port 50003 for you it’s potentially 5,000 or whatever other Port you specified and so they’re going to enter this URL and they expect to get back a website so we can now go ahead and test this endpoint and we see that we do in fact get a successful response we get a 200 response which means success excellent and we can see just here we have all of the HTML code that if we did this via a browser or if a user did it via a browser uh would then be interpreted by the browser and rendered onto the screen and all of the JavaScript would be run so that’s pretty cool that’s our first endpoint setup and it’s working we’ve created an emulation for it and we can see that uh that communication between the client emulator and the back end is successful now the next one is for registration so what I’m going to do to test exactly how the registration works is actually come over to the index. HTML and we’re going to look at how the the client or the front end actually creates this registration Network request from the browser and this is also going to be beneficial if you wanted to come and Fiddle with this front end code uh it’s pretty self-explanatory but we’re just going to uh run through it together so let’s look for the function that registers a user here it is authenticate so if we come down we see this line of code just here obviously there’s a bunch of guard Clauses up the top that basically just say if a user doesn’t have a username or password then let’s not even bother sending out a network request but we check to see if the status is is registration and if it’s true then this code right here is the code that creates that Network request via the fetch API and here we have the URL or the to which we send this network request so we can see that the Point is/ orregister API base is just up here the API base is uh the Local Host in this case it would just be slash because it matches the uh host from which we serve up this uh front end now if we look at the network request we can see that the method is post we can see that we specify the content type and we can see that we have a body which contains some Json which is originally an object and that contains a username and a password field so what we’re going to do is literally just create these three fields from within our client emulator so first up is the method it’s a post request and then we provide the URL so that’s going to be HTTP sl/ localhost 003 and then that’s to an orth route and a register route and together that creates that specific end point point and if we check our server we can see that all of our or routes go to the/ or route right here and then within our or routes we can see that we then have the register endpoint and this is the endpoint that we would expect to receive this network request so that’s the post request set up now if we’re posting information we need to actually create that information and that’s going to be a Json object right here and that has two Fields it has a username as we saw I’m just going to leave that as an empty string for a second and then we also have a password field and that is also going to be an empty string and that’s just set up like an object except we uh ensure that all keys are stringified using the double quotations now just for formatting reasons it’s important that we keep a space above the Json that we’re creating for this network request now the last thing we saw inside of our index. HTML is that we had some headers where we specify the content type to be Json so we just have to do that as well that’s what our front end would do so that’s what our emulator has to do so in here we’re just going to specify the content type parameter and that’s going to be application Json not JavaScript just like that and so now we have set up our client emulator to emulate that Network request as if it were a user actually using the front end and since we’ve created that endpoint we should be able to send this request and because within that particular endpoint which we have right here we don’t have any code to respond what will actually happen is the network request will find this endpoint the endpoint exists and it will just wait indefinitely it will wait for a response which it never gets and typically there is a timeout associated with uh receiving a response and if the timeout is reached basically we don’t get a response within a period of time then it will default fail that request and that will take a second and that’s opposed to if we hadn’t Define this endpoint we would get an instant 404 saying the endpoint itself actually doesn’t exist so now if I send that request we can see that we just sit here waiting for a response nothing’s happening and that’s because the code has hit this end point but this we don’t get a response back so we just sit waiting indefinitely now because that’s not what we actually want but we have confirmed that we’re reaching this endpoint we can cancel that little Network request and we can start defining some of the logic to register a new user so the way that this works first what I’m going to do is just comment that this is a uh a register a new user endpoint and that is the/ or SL register route and it’s obviously a post request now the first thing we need to do if we’re registering a new user is our back and receive this uh Network request we need to figure out what the username and password associated with this network request are we know they’re posting information and when we post information specifically as Json that is always contained within the request the incoming request and it’s associated with the body key now once again this particular line right here allows us to read that Json body of the incoming Network request so what we can do is we can say const body is equal to requestbody and that’s going to give us access to the Json body of the incoming request now one thing I’m actually going to do is save ourselves a step right here and instead of creating this variable variable body from within which we would have to access the username and password key I’m actually just going to destruct rure out the username and password directly and so now if I wanted to console.log the username and the password and for the second we can end this communication system and just confirm that it’s working by res. send status and we can send a 2011 which would be typical for creating a new user so we can now test the completion of this uh communication cycle from client to backend and back to the client and the reason I want to do this is I just want to confirm that I can in fact access the username and password which we will then need and I also want to confirm that we can successfully respond to the network request so let’s go ahead and save that that restarts our application and I can now emulate this request now notice just here I am printing a line but I’m not actually getting anything out of it and that’s because I’m a nump d and I haven’t Associated any values so let’s go ahead and throw in some values here let’s just say the username is Giles gmail.com I’d be so curious to see if that’s someone’s actual email and I just like doing 1 123123123 as an easy password and now uh and just before we can see that we did in fact get back a successful request but I really want to console out the username and password so I’m going to rerun that and we can see now that we do fact log from the back end in the backend uh terminal in the backend console the username and password and that implies that we can access this code inside of our or routes and save it now to register it to our uh database so that is super Nifty absolutely excellent step one complete now while we’re here I’m just going to do the exact same thing underneath this other pound key except this one is going to be for the login uh route and that is just going to mean that once we’ve created the login code login backing code we should be able to emulate the registration and then emulate the login actions that quickly as opposed to faffing around with a front end and that is actually the beauty of API API endpoints when you get more accustomed to them sometimes it can actually be convenient to cut out the front end all together and just do it programmatically via an API but obviously if you’re having a to-do application then it makes sense to have a front end to do that uh the one thing I want to do before I exit this file is just explain what these end points do so this is to register a user and that is a post to the/ orregister Route uh and I’ll do the same for the following one the triple hash key basically creates a code comment so this is going to be log in a user to the or/ login route so we’re finished our emulations for these routes let’s go and dive into the code so now that we’re ready to write the logic to register a user it’s time to get technical and let’s talk about Security in developing a full stack authentication protected application now one of the biggest oopsies companies make is they get into the habit of when a user creates you know a new account with a username and a password they save the username and the password to the database and the problem with this is that if they got hacked for any reason suddenly everyone’s password is exposed to the world now usernames aren’t as important as long as you don’t have both username and password so what we do instead of just verbatim saving people’s passwords as a string to a database is we Crypt them and the way that we’re going to do that is with this bcrypt package now the thing that becomes challenging as an outcome of encrypting every password is that when we go to login we look up the username and we check the password and we need to see if it matches the one that they’ve just entered the problem is when we look up in our database the password associated with the user we only get back the encrypted one now the problem with that you might say well why don’t we just decrypt it these encryption algorithms are one way now there’s a whole lot of technical information that explains exactly how that system works I’m not going to dive into it but the purpose is is that you can’t actually decrypt that password and that makes it so incredibly secure but it also means that we can’t match the password in the database with the password that’s just been entered or at least that’s what you might think so just to really give you an example let’s say just here we get this username and password and then we save Gilgamesh gmail.com and as the password we end up with some long series of keys like that and it just looks like mumbo jumbo and this is what we save to the database so we let’s just say we save the usern name and an irreversibly encrypted password so that’s what get puts in the database now when we log in a user we get their email and we look up the password associated with that email in the database but we get it back and see it’s encrypted which means that we cannot compare it to the one the user just used trying to log in so what we have to do is Again One Way encrypt the password the user just entered now the encryption algorithms are deterministic which means that when you encrypt a particular word using a particular key they encryption algorithms always have a key associated with them and that’s basically just a way of them to create mumbo jumbo consistently and so what we do is when a user enters a password if we encrypt that using the exact same algorithm it will get to the exact same outcome and then we can compare the two encrypted passwords and that’s how we authentic to us it now security is a whole big topic and I really just did a 5minute overview as we practice it and implement it it will become more obvious but yeah on the whole essentially what we do is we use a special key we encrypt a password we save it to the database when a user registers and we do that so if we ever get hacked the passwords are totally meaningless cuz they’re encrypted irreversibly when a user logs in we take the password they’ve just entered and we encrypt it using the same algorithm and that because it’s deterministic will produce the same encrypted key and if they’re exactly the same we know that the password the user just entered must have been the one they used when they registered their account and therefore they are equal and the user is the correct individual now the way that this works from a programming standpoint is the first step we need to do is to encrypt the part password and so what we do is we say const hashed password we create a variable called const hash password and that is equal to the bcrypt library we’re going to use it to encrypt our password and we use the synchronous and we use the hash sync method and we put in the password and we provide a secondary key which is the salt which here we can see in the prompt the salt length to generate or salt to use defaults to 10 and that’s just going to help us synchronously generate a hash for the given string so in this case we’re going to use the value 8 now we need to use that consistently as well so now we have an encrypted password and I could actually console.log the hashed password just here and remove this other uh line and now we could emulate that request and see what the hash password looks like so let’s go ahead and run that and here you can see this is the hash password that we would save the encrypted password we would associate with a user and this is irreversible so if our database was hacked no one could ever undo these encryptions and figure out what the passwords originally were and so that’s what we securely saved to the database instead of just saving the plain old string that a hacker could totally take advantage of so now that we’ve done that we can come back to our orth routes and we now have a hash password that we can save to a database now now when we’re interacting with a database in production environments typically a database is actually a separate server entity and this case we’re having it all within the same server entity and there’s nothing wrong with that it’s great for development in chapter 4 we will separate them into their own backend entities but because in production basically we’re creating a new communication uh Bridge or system so now we have frontend server and the database I like to throw this code in inside of a TR catch block where we catch the error and that’s just going to allow us to handle any errors we might encounter in this process and that’s super important for having a functional robust backend so in here I actually like to do the catch case first what we’re going to do is we’re just going to console.log the error. message if we get a message and we’re actually going to respond to the user we’re going to respond and send a status of 503 now if you remember 500 LEL codes which are between 500 and 599 mean that the server has broken down somewhere in this process and that’s exactly what would have happened if we fail to save a user to the database so let me just add a line here this is going to be to uh save the new user and hashed password to the DB so we send back the status now one important thing to note is that if we send back this status we can’t then send back another status that’s going to give you an error there can only be one status one response and so we’re going to either send back one if we bug out the code or if we successfully run this Tri block then we’ll send back a 2011 or actually we’re going to send back a token we’ll see how that works in a second so as for the logic that we need to save to the database well what we’re going to do here is actually run some more SQL queries so the first thing we’re going to do is create a variable and it’s going to be called const insert user and that’s equal to DB and we run a prepare method now the prepare method is pretty equivalent to the ex execute method right here exec where we basically just run a SQL query however the prepare method allows us to inject some values into these SQL queries so what that actually looks like is we write a SQL command right here and the SQL command to add an entry to an existing table inside of a database is we say insert into then we specify the DAT the table within the database so that’s going to be the users uh database and then we have some circular parentheses es and in here we have a username and a password those are the two columns that we want to insert into if you remember in when we configured this we had the fields uh username and password those are the two columns within the users table the ID is automatically assigned so we don’t have to worry about that so we’re going to insert into the users table specifically The Columns username and password and then we specify the values and in here that’s just some circular parentheses and for the minute that’s going to be a question mark and a question mark so that’s going to prepare the SQL command that we’re going to run and then what we do is we Define a second variable called the result and this logic is a little bit specific to the sqlite database that’s part of the node ecosystem but now what we do is we take the prepared query and we run it but we call run as a method and we pass in the values that we want to save to the database so in this case it’s the username which we destructured out of the incoming request the body of the request and the hashed password so just to summarize those two lines we first prepare a SQL query where this is just our SQL query we’re to insert data into a table that exists within the database we insert into then we spe specify the table and then we further specify the exact columns to which we want to add information so we want to add information to the username and the password columns and then we specify the values and we basically leave them as blanks until we then run that SQL command and then we provide the actual values which will be injected into these places and that will be sent into the database now one thing I like to do just in here is when we register a new user and consequently create a new user in the database I want to give them a default to-do so I actually want to create a to-do for them that will then be shown on the screen and that’s just going to basically give them you know an entry in their to-do list to prompt them to create some more and understand how the application works so now that we have a user I want to add their first t to do for them now in this case the to-do or the default to-do that I want to add is going to be called const uh default to-do and that’s equal to a string that just says hello exclamation mark and says add your first to-do and we can kind of see how that is just going to I mean technically it’s a to-do and I’m actually going to change that for a smiley face that’s technically a to-do they can complete it when it’s done and that’s just going to prompt them to create some more to-dos and now that we have that line we can create a variable called const insert too and that’s equal to database. prepare and we prepare another SQL line or command so that’s going to be the template literal string and inside here we’re just going to insert into the todos table and then inside the circular parentheses is just here we’re going to specify the columns to which we want to add information so if we come look at the schema for the database we can see right here we have the ID that’s automatic we have the completed status that’s automatically assigned as default to incomplete when we add a new to-do so the two fields we need to specify are the user ID that the to-do entry is going to be associated with and the actual text for the to do so in this case we’re going to enter information into the user ID and the task columns and those are going to have the values and once again that’s just going to be a question mark and a question mark so that’s going to prepare the SQL query and then we’re going to go ahead and run it so we’re going to just say in this case we actually don’t need uh to assign it to a variable we can just uh type insert Tod do. run and now the first entry we have to provide as an argument to this run method is the user ID and the user ID can actually be uh found from within the result of creating the first user so we can in here get the result and we can access the field called last insert row ID so what that does is when we get back the result we just check the uh ID of the last row or entry added to that table in which case it’s going to be the ID associated with the most recently entered uh new user so we get that ID that’s the ID that we want to associate the to-do with and then we just provide the value which is going to be the default to-do and that’s going to go ahead and insert the to-do and then the last thing we need to do now that we have within our database added the user to the user’s column and created their very first to do is we now create a token now the token is super important because once we log in a user they are then in a position to create new to-dos update to-dos delete to-dos but those to-dos are specifically associated with that user and we can’t let them modify everyone’s to-dos just theirs so whenever they run those actions whenever they try to add a new to-do we need to associate a special token or key with that Network request that confirms they are in fact an authenticated user so this is kind of like an API key in a sense we create a token and in this case the way that we do that is we say const token and that’s equal to Json web token. sign and in here we pass an object and the object has a key ID and that’s just the result. last insert row ID as we had just up here uh so we get the ID of the most recently added user and then the second value that we have to provide to the signing method is an environment variable and this is a secret key so what we’re going to do is we’re going to say process.env and that’s going to read the environment variables file and then we’re going to access the JWT secret key and then finally we’re going to have a third argument and that’s going to be an object and that’s just going to have a value expires in and the associated value is going to be uh 24 hours so that means that the special token that a user can attach to their Network request will expire in 24 hours at which time they’ll have to reauthenticate to gain access to a new token now as for this JWT secret we don’t have it yet now this is a secret key that only we know and because it’s a secret key our immediate first thought is to throw it in the environment variables because if people gain access to this key they’re one step closer to decrypting all of the passwords and being able to basically fraudulently act on behalf of a user so in here we’re going to create this environment variable called JWT secret the name needs to match whatever we throw on the end of process.env so that name has to match so we’re going to create this uh key inside of our environment variables and we’re going to set it equal to and in this case it can just be a string it can be any particular string so I’m just going to say your JWT secretor key you could fill it out with anything that’s going to work for us for the second now one other value I want to pass in here is a port and I just want to set that equal to 53 and that means that uh our server is actually going to use that port instead of defaulting to the 5,000 three so now that we have that done we’ve created this token we now have to send it back to the user so we just say res. Json that’s going to send back some Json as a response and in here we provide an object and we use the key value of the token so this syntax right here is going to create the key token and assign the value associated with this new token now what happens when we emulate this request we’ finished with the logic for this particular endpoint we add a new user we assign a default to-do to that user and then we create a special token that we can use later to confirm they are in fact the correct user well let’s go ahead and emulate this request so I go ahead and run that and we can see that we have now added a new user to our database and we get back the special token that looks a little bit like the hashed password we assign to the data base but it’s actually not it’s a unique token and this token contains all sorts of information and essentially what the front end does is if we look at the logic let’s close that and open up the index.html we can see here we get back some data so we receive the Json we basically pass the Json and we assign it to a value called Data now this is all within the front end and then if the data contains a field called token we save that token to the local storage which is basically a client side database it’s how all data is persisted on a front-end only system it’s kind of like a cookie if you ever get asked you know do you want to save your cookies it’s a similar concept it’s saved to local storage so that we can conf consistently access the token even if we refresh the page or reopen it a day later and then if we have that token we then fetch all the Tod do associated with that token now in this case uh if I come across to my application just here and go into the local storage we don’t yet have a token but this is eventually where that token will be saved but if we once again come across to the index.html and now we go down to the fetch todos we can see that when we fetch these to-dos which are going to come to this particular endpoint within our to-do routes we’re going to get all to-dos associated with the user that code comes right here in this fetch to-dos function within the index.html we can see the fetch API is used to send out the network request we send it to the too’s route and we assign some headers with it now in this particular fetch request I don’t specify the method but by default it’s going to be a get request cuz we’re getting information now what we do within the is is we specify for the authorization we add the token and then when this network request is sent out with the token encrypted within the network request we then receive that endpoint right here however eventually it will be intercepted by our authentication middleware which is going to basically check that the token is for a valid user and then we’ll only send them back to does associated with that particular user and that particular token now I recognize that’s probably a mountain of jargon a whole lot of new Concepts if it feels overwhelming absolutely don’t stress when we code all of these systems out and really get a good understanding of what does what and how it all works and how it comes together it will become much much clearer but anyway that is our first authentication endpoint done the login one is fairly equivalent and that’s what we’re going to jump into to now and we should be able to check that they have worked and are working successfully by from within our client emulator and we will eventually just be able to register a new user and then log in a user all right so we just finished up our registration route that’s all working we tested it with our client emulator looks absolutely excellent and what that registration route does is it creates a new user inside of the user’s table with a username and pass password so now once we have a registered user we can allow them to log in and the way this is going to work if we come into the index.html and look at our authenticate function here we have the is registration code we’re obviously not registering anymore we’re logging in so we hit this else case where we log in an account and you can see first it’s to the/ off/ login R route or path we post information as part of that Network request it’s of cont ENT type Json and we transmit a username and a password over as the body of the request now when that hits our endpoint the first thing we’re going to have to do here is destructure out the username and the password and the reason we have to do that is because we need to check our database for an existing user that matches that username and then we need to retrieve the hash password and compare the two of them see if they are valid so the first thing we’re going to do is just like we had inside of the registration route we’re going to destructure out the username and the password where if you haven’t picked up on this yet the username and the email are equivalent now they come from the requestbody which is the body of the request which is the information that is being posted with the network request now whenever we interact with the database we’re going to throw that inside of a TR catch block where we catch the error and we console.log the error. message so if we do have an error we can see what it is and in the case that we get an error we’re going to send a status we’re going to say response. send status and we’re going to send a 503 which indicates that we had an error in the back end internal server error now the tri block is going to contain the logic that is going to attempt to interact with the database and just because that can potentially be a precarious oper operation for example in the instance where our database is shut shut down you know we need to handle that potential error case anyway so it’s time to interact with the database the first thing we need to do is pull up the existing user so what we’re going to do is Define a variable called const and we’re going to get the user get user and that’s equal to database and we’re going to prepare a query that is going to read the database for this user so we’re going to go database. prepare invoke that method and in here we’re going to have a string now the SQL command we need to use to read an entry from the database is we say select and then we use an asterisk key to say we want to read every single column from the database so we read everything from the users’s database and now what we’re going to do is throw in a condition so we’re going to read in all of the data from users where the username is equal to and then we’re going to have the placeholder so this is the SQL command we need to read all of the entries from the user’s database but then actually have a condition that filters out a whole lot of them so now that we have that query prepared we can go ahead and run it so we can assign it to a variable called user and we can just say get user. getet method and we get the user that matches the the username so essentially what this command does is it’s going to inject this username and into this question mark right here and then it’s going to read everything from the users where their username matches the one that we pass in so that’s just an email lookup from a SQL database or a sqlite database so now that we have this theoretical user we need to throw in some conditional Logic for the case where no user is returned so if they try login and they don’t have an account we need to reject them out of this process so what we’re going to do is throw in an if Clause that says if not user then what we’re going to do is we’re going to return out of this function but we also need to respond to the network request telling them that we couldn’t find a user so we’re going to say res. status we’re going to throw in a 404 could not find and then we’re going to send an object that contains the key message and the associated value of user not found so if we go ahead and save that we can actually test that so what I’m going to do is come into our to-do app. rest and I’m going to restart this code so I’m going to contrl C rerun npm runev and that is going to reboot our application and the reason I’m doing that is cuz that is going to empty our database every time we restart our server it’s going to empty empty out our database now if I log in I would expect us to not find a user however since we’ve handled that case we should get an appropriate response and if we send that we can see that we do in fact get back a 404 that contains that message user not found now if I were to instead register a user we get back the token and now that has created an entry in the database and I should then be able to login and we can see that when I hit the login it actually doesn’t respond we’re just stuck waiting and that’s specifically because we haven’t handled the case in which we actually find a user we haven’t responded to it yet so I’m just going to cancel that but that confirms everything is working well for the case where we cannot find a user so if we get past this what’s known as a guard Clause cuz it guards the code or the successful code in the case where we do have a user now what we need to do is check that the password is valid so what we’re going to do is Define a variable called const password is valid and what we’re going to do is use a bcrypt method so we’re going to type bcrypt and we’re going to compare we’re going to use the compare sync method which is a synchronous comparing and we’re going to compare the password which is the one that the user has just entered with the user. password password so user. password is the second argument to this compare synchronous and essentially we can see what the method does right here it synchronously tests a string against a hash so essentially what it’s going to do is as I described earlier it’s going to Hash our password and compare it to the hashed password make sure that they are equivalent so essentially that’s going to return a Boolean where if the password is valid it’s true if it’s not if it’s incorrect correct if the comparison is not true then it’s going to return false now because we’re in the habit of using guard Clauses we’re going to first handle the case where we uh find that the password is incorrect so we’re going to say if not password is valid so if the password isn’t valid then we’re going to return out of the function break out of this code and end the execution and we’re going to respond with a status code of 401 and we’re going to send back an object that has a message key that is associated with the string uh invalid password and that can be lowercase password so now if the password is incorrect we’re going to respond and basically say nice try buddy not getting in today now if we get past this guard cause then we have a successful uh Authentication so let me just add in some comments here so uh if the password does not match return out of the function and up here this G Clause says uh if we cannot find a user associated with that username exit or return out from the function so now we can handle the case where we’ve matched the password we’ve found the user and everything looks good so what do we do well just like we did above we sign a token and we send back the token and I actually think the code is nearly equivalent so all we do is we give them back the unique token which we associate with their account which they can use to authenticate all of their crud actions and all their to-do updates and then whenever they go and make those actions we can just verify that they are in fact the correct user so first we have to get the token we’re going to Define a variable called const token and that’s going to be equal to and we’re going to use the JWT Json web token Library do sign the sign method and this takes an ID like it did just above up here we provide the ID of the user except in this case we access the ID via this little uh through the result. last inserted row ID in this case when we do have a user we can just access the ID field now what I might just do here is actually console.log the user so that we can see what we’re actually looking at when we run this request but anyway that is the first argument we need to pass into the sign method after that once again we do the process. EnV we access the environment variables file and specifically the JWT secret key and then we have one last argument which is uh an object and that contains the expires in key and that just once again expires in 24 hours so that’s going to create this token for us and then the last thing we do is we send that back we res. send or res. Json if you want to send Json and in this case we just send the token back from our endpoint and just like that we have all the logic we need to successfully handle authenticating a user so now we can actually go ahead and test that let’s once again restart our code by control cing out of it and then running the npm Run Dev command and now we’re going to log in a user which won’t exist we get back user not found now I’m going to register a user by running this register end client emulation so now we get back a token that’s super cool and then we should be able to log in a user however I’m going to use the incorrect password so now we log in we do in fact get an invalid password but now if I correct the password we should uh receive the token so that works successfully and we can see we actually consoled the database entry for that user so we can see they have an ID of one we can see their username and we can see the associated hash password now once again this is the token we’re going to use to authenticate all of our uh to-do crud actions so we’ll see how to use that in just a second but before we do any of that we have some crud end points to Now set up so we’re going to come across to the to-do routes and start filling out these end points so there’s four end points in here there’s a get for getting all of the users to do there’s a post a put and a delete now these endpoints are relatively straightforward for example if we want to get all of the uh to-dos associated with a user all we have to do is once again prepare a SQL query so we type const get todos and that is equal to we go to our database which we’ve import imported just up here and we prepare a SQL query now once again just like when we’re reading from all of the users we say select we use the asteris which ensures that we select all columns uh and we select from the to-dos database now because we only want to get to-dos associated with a particular user we just throw in a wear command and we match the user ID the user ID has to be equal to the placeholder that we will fill out shortly so just to summarize how this query works we say select all the columns from the to-do database where only where so it’s actually technically not every entry it’s just only where the user ID matches the value we’re going to pass in and now what we can do is we can say const to do is equal to and that’s just the query and in this case we’re going to use the all method cuz we want all of them and we’re just going to access the request. user ID now at this point you might say but James don’t we have to read these values from the body of the request and that would be one way of doing this however the request in this instant is slightly different because of some logic we’re going to add to the middle Weare so once again the middleware intercepts the end point receiving the network request so it gets there just first and it’s like a security layer so what we’re going to do is actually finish this endpoint assuming that our request does in fact have access to the user ID and that should be a lowercase ID and then we’re going to see how the authentication middleware works so in the case that we do fetch all the to-dos associated or where a user ID matches the one that we’ve just got from the request we can just send back Json containing the to-dos which is an object so that’s this endpoint complete but as I said we need to complete the middle Weare which is going to authenticate a user and make sure that the correct person now all of that is going to happen inside of our or middleware so we’ll save this file and head over there now the way that this middleware is going to work we’ll once again need the uh JWT package so we’re going to import JWT from Json web token now notice how I’ve been signing all of these tokens in giving the user all of these tokens and if you come back to the index.html and look at any of the fetch to-dos we can see that we then send this to-do as a network request over the Internet when a user makes any of the crud actions once they’re actually logged in so they’re logged in they have a token that authorizes them and and then that token is attached to every Network request they create while managing reading updating deleting all of their to-dos now the purpose of the middleware is we intercept that Network request and we read in the token and we verify that the token is correct for that particular user so in here what we’re going to do is Define a function called orth middleware and that is going to to uh receive some arguments it’s going to receive three arguments now the first one is the request the second is the response and the third is a parameter called Next which is new we haven’t seen that yet the request and the response are pretty standard the request is going to be super important because we’re going to need to access the token associated with that request and the response is also important because if our middleware our authentication middleware intercepts this request and finds that the user is not in fact correct then we can reject them using this response so we can basically respond before the endpoint actually receives the information we’ll see what next does very shortly anyway I’m going to go ahead and open up this code now in here the first thing we need to do is Define a variable called token and we’re going to read that from the headers of the incoming request so the way that works is we say const token is equal to the request do headers field so we access the headers of the request which once again if you remember from when we create the network request from the client we have the token inside of the headers associated with the authorization key so I’m going to copy that key because we are going to read authorization from the headers and that is going to give us access to the Token associated with the incoming Network request now if we do not have a token if we try to read the token and there’s nothing there then we can return out of this function and we can respond with res. status we send back a 401 Network request which basically says the error doesn’t have a problem actually your request is problematic and we send back a message we’re actually going to use the Json method to send back this message that’s going to be an object and that’s going to have a key message and an Associated string that says no token provided now that’s a good little gar clause and if we get past that line that guarantees that the token has been provided and now we can verify it so what we do is we use the JWT package and we use a verify method and this takes a bunch of arguments the first one is the token so that’s pretty straightforward we have to verify the token the second if you recall when we signed these tokens we signed them with a key the JWT secret key and once again this is a highly secure key and that’s why it’s in our environment variables. EnV file so we’re going to go ahead and read in from process. env. JWT secret we’re going to gain access to that secret key as the second argument and then the third argument is a call back function so essentially what this function does is we uh verify the token and then we get given some outputs and this function is run and it allows us to basically say in this case do this now this function receives two arguments one is the error in the case where something goes wrong we’re trying to verify and something goes wrong and the last one is a parameter called decoded now we’re going to go ahead and open up this Arrow function and in here if we get an error then we’re just going to return out of the function once again and we’re going to respond with a status of 401 and we’re going to provide some Json that’s going to have an object with the message key an Associated string that says invalid token so we tried to verify them and it didn’t work and so we’re sending back a response saying nice try buddy you’re not the right person or potentially just that their token has expired they need to log in again so that’s in the in case that we get an error now the decoded argument is basically going to give us access to some of the core parameters of the uh verified user and what we’re going to do is we’re going to assign them to the request so the request as much as you might think of it as a network request coming in something we can’t change well technically if we intercept it then we can modify some parameters of it before it actually hits the endpoint and it works just like an object so we’re going to say request. user ID and that is going to be equal to the decoded ID which is the ID that we found from that user and then the last thing we do is we call the next method and that basically says okay now you’re good to head to the end point so we’ve modified the request and then when we call next we say you passed this checkpoint the security checkpoint you can now reach that endpoint where if they were trying to get to-dos we can now read the to-dos from the database and since we’ve added this user ID parameter to the request we can then access it from within this uh this endpoint from this request now the reason we don’t just do this process inside of the endpoint is because with middleware we can write this function once and then slam it in front of every single authentication protected endpoint anyway so now we have this code where we basically verify the token if we find out that they are indeed the correct person then we modify the incoming request to ensure that it also contains the ID of the user since we’ve verified them and then we tell them you may carry on to that particular endpoint so that’s the uh or middleware complete now how do we actually throw it in front of the endpoints well the first thing we need to do is export a default module called or middleware so we have to export it from this file and then if we come over to our server. JavaScript and come down to this particular app. use to-do’s endpoint all we do right here is we literally slam it in front of our to-do routes so in this case we would just throw our or middleware which I’m using the Auto Imports right here we can see it’s suggesting I import it and I’m going to throw it in between the to-do routes so we can kind of think of it as like okay we hit this endp point first we encounter the middleware and then every single to-do route endpoint is blocked by this middleware and that is imported just here and that is now available and if I go ahead and save that that should make sure that all of our to-do routes are protected by our or metalware where we have to confirm the token now two small things I wanted to clarify really quickly first is uh regarding the inside the or middleware specifically what this next does this next just says okay you may now proceed

    to the end point so it’s the final step before saying okay we’re done with the middleware let’s go on to the actual endpoint which is one of these to-do routes and that is you know all of them here so we hit the or middleware we call that next in the case where it’s an actual verified user and then we send it through to the to-do routes having added the ID to the request so that within the to-do routes we can read the ID from the request now as for the decoded ID if you recall when we originally create these tokens when either we register or we log in we actually create the token encoding the ID so the ID is what we associate with the user and so when we encode it into the token we can then decode it and that’s what this decoded is and consequently we can get out the ID and verify the user so those were just two small clarifications I wanted to make now we’re actually at a super cool point in our project because what we’ll do eventually is come down and write the uh client emulations for all of these endpoints but we can actually register a user log in a user and fetch all of our to-dos which for a brand new user should just be that one default entry that we uh added when they register so they get this default to-do everyone who signs up to our app and the front end should be able to fetch all of those things cuz we have created the complimentary backend endpoints to facilitate that interaction so what I’m going to do is once again restart my server completely that’s going to clean out our database and then I can now try to I can now attempt to log in let’s try a random user I go ahead and sign in it says failed to authenticate that makes sense because we don’t actually have that user saved in the database if I right click and come over to the uh in Chrome developer tools by clicking inspect we should be able to take a look at that Network request right here so if I refresh that page and we do that once more we’ll go for test gmail.com I’ll do the favorite password and submit that we can see we send out that Endo we get a 404 not found we can see here’s the payload that got sent out as part of the network request from the client and then the back end received it and then we responded with the uh Json that said use it not found so that all works perfectly however now if I try to sign up I can submit that and we can see it’s actually logged Us in and we can see a few extra Network requests were just run one was this register endpoint and we can see we got back at 200 okay we can see it was to the orregister endpoint and we can see we sent over the username and password and we got back a response containing the token now the way registration is working in this application is that it also upon uccessful registration logs in a user if we take a look at the front end code and come up to the authenticate function basically the registration and the login functionalities both serve to gain access to a token and then once we have this token if the data contains this token we save it we cach it essentially in cookies or local storage and then we load all the to-dos by calling this fetched to-dos method which if we come down sends out a network request saying okay now we have access to this token let’s send it to the/ todos endpoint it’s a get request we get all of the to-dos back the endpoint response with the to-dos it looks them up in the database once our middleware has authenticated the user via this token and then when we get the to-dos back we display them on the screen we render the to-dos and consequently we end up with a dashboard right here and that’s what it looks like it’s nice and responsive mobile responsive looks great uh and we have our first to-do right here now currently once again we can’t add to-dos we can’t edit to-dos we can’t delete to-dos cuz we haven’t finished those end points for example if I TR say done we just get a failed Network request we don’t get anything back nothing happens if we delete we also just get absolutely nothing back it’s not working so we’re going to have to program them in a second uh but we can now log in display a dashboard register a user login a user and our token is working and we can confirm that by a reloading this page if I reload this page we can see it asks us to log in again and now if I go test. gmail.com type in our password and submit we can authenticate once again and now we have this token now as for the token we can come across to our applic right here and we can see that inside of local storage we have our token saved right there so that is all hunky dory uh and that’s working brilliantly so now what I’m going to do is start to code out the rest of these end points so the first one we have to code out now is this endpoint that allows us to create a new to-do now the way that this one is going to work is not too dissimilar to this first endpoint but it is a great opportunity for us to learn some more SQL queries which is kind of fun CU at the end of the day we’re writing these SQL queries to inject the data into our database tables so what I’m going to do is once again log in we submit that that is done we’re now in the dashboard to add all of our information so if we first come into our index.html let let’s find the function for adding a to-do so just here we can see the network request from the client that tells our backend endpoint to add this new to-do to the database so we send out a network request it’s a post method which is typically for the creation of something and we post information we send information across with this network request to the to-do’s endpoint we include the content type we say that it’s Json information that we’re encoding and we authorize this request with the token so that our middleware can confirm we are the correct user and then finally we send over the task which is whatever is input into this field right here when we click this button so our backend can expect to receive the task as part of the body of the request so if we come into our post request right here for creating a new to-do the first thing we can do is is we can define a variable or better said we can destructure out the task from the requestbody once we have access to that task what we can do in here is we can Define the SQL query so we can say const insert Tod do and that is equal to database. prepare and we can now prepare the SQL query so we’re going to use the back Tex and then in here we’re going to say insert into then we’ll specify the table which is the to-dos table and then we’re going to provide the columns for which we have information which is going to be the user idore ID and the task column the user ID is going to specify what user our task should be associated with and that is going to be something that we can now access from within the request because of our middleware authenticating our user so once we specify the columns then we provide the values and those are just going to be two empty question marks as placeholders until we complete the query so now that we’ve prepared the query we can go ahead and insert too. run we can run the query and we can pass in the request. userid as the ID for the user that we want to associate the task with and then we can also pass in the task itself and that is going to insert that to-do into our database inside of the to-dos table now once we’ve inserted that to- do the last thing we need to do is res. Json we need to respond so we’re going to use the red res. Json method and in here we’re going to provide two key value pairs the first one is going to be the ID of the to-do which we can access by going to the insert to-do and we can get the last ID that’s going to give us access to the ID of the most recently added entry and then once we have that we can now also send back the task and we can send back the completed status of zero because it’s false for the minute the zero represents the false bullan State and that is because our to-do is not yet done we’ve literally just added it so we can now save that endpoint now what I want to do at this point is just actually create the client emulations for for both of these actions so the first one so obviously we’ve got three so far one is to get the homepage we’ve got a register a user and log in a user now we need to emulate the uh fetch all to-dos endpoint which is going to be a get to the slash uh to-dos endpoint a get request to the slash to’s route or path better said and I might also make a note that this one is protected and essentially how this is going to work is we very simply have a get request to http localhost uh 53 or whatever Port you’re using SL todos now first what I actually want to do since this is a get request we don’t need to provide any information as part of that request we can just send it but in its current state I haven’t provided the token to associate us with a user so if I just run this request we would expect to be blocked because we can’t our middleware can’t authenticate us so if I go ahead and run that I get back a 401 which says that I’m unauthorized and we get the message saying no token provided now the way that from this rest client we provide an authorization token is very simply by writing authorization and then a semicolon and then in here we put the token now for the minute we don’t have a token so what I’m going to do is register a user that going to create a user with these credentials and then I’m going to copy the token right here and that’s the token we’re going to use for authorizing these uh to-do crud actions so now that I’ve pasted in the token I should be able to send this request except the difference is this time when we send out the request we have encoded the authorization token into that request so that our middleware our or middleware can intercept that and interpret it and consequently authorize us so now when I send that out we can see that I do in fact get back a to-do entry it has an ID it’s the very first to do we can see that it’s associated with a user with the ID number one so that’s our first added user and here we can see we have the task and here we can see we have a completed status which is zero which is the false bullan status so that is an incomplete uh to-do now what I want to do is Define an in or a client emulation that creates a new to-do and that is a post to the slash too’s endpoint and that is also protected now that is going to be a post to the HTTP localhost 5003 SL toos now this one has a bit more information first up we once again need to authorize the user so we’re going to copy and paste that token now one thing to note is that if you’re restarting your user that token will be invalid as our database refreshes every time we reboot our server now that might seem counterintuitive but in the fourth project we will learn how we can persist that information as we create a third party database entry but for now the moral of the story is that this token is only relevant for either one login session or one registration session while that user is persisted so we use the same to token now what we need is also some Json that contains a task so we’re going to add the task field right here and also have an Associated string that says finish coding the projects now as always because we’re including Json we have to add the content type header to this request by specifying the content-type and that’s going to be application sljs so now when I save that we should be able to create this new entry because we have also created that endpoint so I’m going to go ahead and send that request and we can see that we get back our client endpoint response with the task and the completed status of zero now one thing I noticed is that we should have gotten back an ID as part of that field and I think what we’re missing just here is we have to create a variable here that’s called const result and instead of just running this command we actually have to assign the output to a variable called result and then this should actually be result do last insert row ID so now what I’m going to do is once again go ahead and run that and let’s try that once more so if I come back to the client emulator we’re going to have to run all of that again cuz our server has been restarting so we will well actually let’s try log in yeah user not found so let’s register a user we get the key we’re going to have to replace the key in all of these emulations uh so these authorization tokens are going to have to change to the new token and now I should be able to test them both so let’s get all the to-dos we can see we have one to-do just there that’s the default entry when we register a new user now we create it too and here we can see we get back the IED this time so that has worked and it’s important that we get back the ID because in these emulations when we start specifying what to-do we actually want to modify or delete we’ll do that by specifying the ID of the to-do to you know perform these actions to now if I go ahead and get them again I would expect to have two IDE two tasks inside of my database and indeed I do I have the default one and I also have the secondary one right here that we just added so that is super cool now we can go on to the put entry now for the put entry what I’m actually going to do is uh start off by creating the client emulation for that so we can understand how these Dynamic query parameters actually work so we’ve now added two to-dos so we can go ahead and modify one of the added to-dos so in here we’re going to create a new client emulation and that’s going to be called update a Todo and this is to the slash uh how how did I specify it it’s post to/ too so this one is put if I can spell that correctly put to slash todos and then it’s slash an ID which is a dynamic parameter so we throw in front the uh pound key and this is also a protected route uh and when I say ID this is just a demonstration there are 100 different uh update or modification actions you could make for example you could change what the Tod do in actually says but in this case it’s just modifying the completed field so when we click the completed button or done button just here that counts as a modification so that’s what we’re going to use to demonstrate this kind of endpoint so that is a put to the HTTP localhost uh 53 too’s endpoint however if we come to the actual endpoint we can see that now where these ones were just slash so we can’t just do SL toos it now has to be/ toos slash an ID entry so since we you know saw earlier when we sent out these requests we have an ID of one for the first to-do and an ID of two for the second request cuz they automatically increment so I’m going to go ahead and modify the second Todo entry so in here what we’re going to do is specify the second entry so we’re going to add add the slash2 on the end which is the slash ID that’s the ID of the to-do that we want to modify now we’ll still need the content type of application SL Json cuz we’re sending data and we’re also going to need the authorization token so we’ll copy that from just there and now what I’m going to do is specify the Json data and tell it that we want to change the completed field to a value of one currently it’s zero we want it to be one so that should go ahead and complete that to-do and what you could actually do since we’re running all these modifications is you could also specify the task and change the data associated with that task but we’re not going to do that we’re just going to modify its completed status so if I save that I should now be able to run that and update that entry so I go ahead and run that and we can see that nothing happens and that’s cuz we haven’t actually coded out that endpoint but the moral of the story is you can see just here how these Dynamic paths actually work so just here we can now from within our to-do routes figure out what this ID is associated with so the first thing we’re going to do in here is a we’re going to access the completed status from the requestbody so we’re going to destructure out completed from the body of the request now the second thing we’re going to need to do is access the ID now the way that we get the ID from the URL is we say const and we destructure out the ID from the request. params so that’s the parameters of the request of which the ID is now one now you can also get uh parameters from the queries of the request and and that’s just specific to the URLs and we’re actually not going to worry about that in this course but it’s just a demonstration of the different ways that we can send information via a network request we can either send it per the body or we can throw it into the actual URL what the last example would look like is uh just to show you quickly uh let’s say here we had a question mark that said uh task is equal to and this is the is the updated text or something what we could do is if we went request. queries we could access the task field where the queries come after a question mark a common example is page page is equal to four that’s a query uh that’s associated with the request and then you can access the page number and you can consequently get the value associated with that but we’re not going to worry about that well actually why don’t I just show you page is equal to four now if I go ahead and run that once again that’s going to have the exact same output uh which in this case is to weight so that determines that this URL still hits the exact same endpoint however now what we would do is inside of the to-do routes to access the page I would say const uh and destructure the page out from the request. query so now we have collectively uh demonstrated the three different ways that should you want to you can send information or parameters across via Network request from the body via the parameters or as a query entry once again this last one is not really relevant to this course it’s just good to be aware of and it’s not really going to change anything you can just throw these uh queries onto the end of the rest of the URL so we have access to the ad and we have to access to the new completed status so what we can do in here is we can now say const updated to do is equal to and prepare our SQL query so that’s database. prepare and the query here now we already have an entry inside the database so we’re not going to use the insert to create a new one we’re going to update an existing one and the way that that works is we say update to-dos we update the to-dos table and we set the completed field equal to question mark Mark now if we wanted to update the task as well what you would do is you would just say task is equal to question mark and you would comma separate them so if you wanted to you know have two different columns being modified you just throw them after the set key and you just comma separate them this is obviously the common name and this is the new value with a question mark is a placeholder for the data that we will add in a second but in this case we’re just going to modify the completed step status so we’re going to remove that and now we only want to set these new values where the ID is equal to a placeholder which is going to be this ID just up here the ID associated with that to do so now that we have that uh SQL command prepared we can say const result is equal to well actually we don’t need the result in this case cuz we don’t have to send back an ID we can just say updated to do do run and in this case we pass in the new completed status which we destructured and we also pass in the second question mark which is the ID and that’s the ID that we gained from the parameters which is part of the URL once we’ve successfully updated that we can just send back res. Json and we can send back an object with the key message and the associated string uh too completed so now if I go ahead and save that we should be able to run all of these uh emulations so first we have to register a user that gives us the token we’ll copy the token and we’ll paste it inside of the uh crud actions so first what we’re going to do is just get all of them so that’s our default entry it’s currently got a completed status of zero now we’re going to change the authorization token for adding a new entry we’re going to to add this new entry right here that is now added we get back its ID of two it is also incomplete and now we have one for updating and that is going to update the to-do with the ID of two that’s the dynamic parameter and this is the query on the end that as I said earlier anything after the question mark isn’t going to change the actual end point we hit it’s just specifying some further unimportant information in this case that’s not relevant it was just a demonstrate a point of the three different ways we can encode information into a network request via the body or via a parameter or as a query anyway so this hits the update end point have we updated the token let’s just confirm contrl v no now it is updated so we can go ahead and update this and this should change the completed status of the to-do with the ID of two to now complete that is now completed we get that success for response it’s a 200 level status code so that is perfect and now if I get them we can see that the first entry is still incomplete but the second entry is now complete and even cooler if I refresh the page and log in what are the credentials that we’re logging in with Gilgamesh gmail.com and we type in the password this should still work cuz they’re both using the same server and database we can see that we actually have two entries just here and because one is actually complete we can no longer click that complete button and it is inside of the complete column and we now have only one open to do and now if I go ahead and click done on this particular entry we can see that that Network request is sent across just here we get back at 200 level status code and our application changes so it is now two complete entries and zero open entries so everything is working perfectly and so now that that’s all done we have but left one more end point and that is the delete endpoint now what I’m going to do for the delete endpoint is once again create the emulation first now in this case what I’m going to do is actually just copy and paste this update one cuz it’s very similar and this one is just going to be delete a to-do it’s going to be a delete method and I think that should just be lowercase for consistency and in this case cuz we’re modifying a particular Tod do in this case deleting it uh we need the secondary Dynamic ID parameter now we don’t need the query in this case we’re just going to have the ID uh we still have the same authorization token and now we actually don’t need to send any information but we do have to update the method to delete and that is the emulation all done now one thing I would like to point out uh and this is just you know FYI for your information is that if you’re working in a big organization typically what we’re doing right now which is known as a hard delete is almost not recommended uh because it’s it’s permanently erased you can’t necessarily get it back very easily so what a lot of companies will do let’s say you’re managing lots of Google docs for example if we came over to where we create this database typically they will create an additional field inside the relevant table called Soft delete and that’s once again just a Boolean value where when a user deletes it you actually don’t remove the entry from the database you just change the soft delete value to true and that way it can be restored at a later date so it’s kind of just like a fake delete that’s just something interesting to be aware of we’re not necessarily going to handle that in this case we’re going to go for a Perma delete uh but I just thought I’d mention that any who so now that we have this emulation going let’s go ahead and fill out this last endpoint and wrap this project up so that we can dive into the more advanced version so this endpoint is once again relatively straightforward it’s going to delete a to-do and the first thing we need is to access the ID so we’re going to destructure the ID by saying const ID is equal to the request. prams because it’s a parameter this is a dynamic parameter so that’s going to give us access to the ID then we’re just going to prepare the SQL query so that’s going to be const delete Todo and that is equal to database. prepare now the SQL command to delete an entry is as follows we are going to delete from then we specify the table which is the to-do’s table and then all we have to do is say provide the condition essentially so we say delete from the to-dos where the ID is equal to a placeholder that we’ll fill out and a Boolean operator so both of these have to evaluate to true for it to be deleted and the user ID is equal to question mark now the reason we’re having a Boolean condition here is because we want to match both the to-do ID in addition with the user ID so that’s just basically a double secur to ensure that we’re only deleting a to-do that is associated with the correct user so now that we have prepared the query we can go ahead and run it by saying delete Todo do run and then we just pass in the ID as the first field and then the user ID as the second field and we need to get access to the user ID so we can just go ahead and destructure out the user ID uh from the request that would be one way to do it or we could literally Define a variable called user ID uh and set that equal to request. user ID so that’s going to gain us access to the user ID these values are going to be injected into this query and consequently run and that should complete our endpoint so now if I go ahead and save that it’s going to refresh our database and let’s run through this from the very beginning I’m also going to refresh our app and the other thing I’m also going to do is just uh Delete the token in here and refresh the app uh so now that we also have a blank application so first up let’s test the emulator well first what we’re going to do is we’re going to register a user that works perfectly and now that we have registered a user I can go ahead and log in the user to confirm that that’s working so now we log in a user we can see the user just down here there’s the hash password and we can copy this token now that we have the token we can run all of these authentication protected endpoints so first I’m just going to change the token and all of them and then we’ll go and have a fiddle so first what we want to do is we want to fetch all the to-dos we just get the default one that’s perfect now we’re going to go ahead and create a to-do that adds a new to do and I’m actually going to create that twice so now we have an ID of three and if I get them all again we can see that I do in fact have three Todo entries now I’m going to go ahead and run the put which is a modification and that essentially enacts to complete a to-do so if I run that that should complete the I the to-do with ID of two so this middle one so we run that and the to-do is now complete and if I go ahead and get them all again we can see that the to-do with the ID of two is in fact complete this one state represents true it’s kind of like binary and then what we can now do with our last endpoint is delete the entry with the two since we’ve completed it and that didn’t work and that’s because we forgot to respond I actually think in the context of our database that will have run because technically we will have hit the endpoint and run the logic so if I go ahead and fetch all the to-dos we’ll see that the element or the entry is missing but yeah we just need to respond inside of uh the to-do endpoint so that’s the last line that I totally forgot so in here we’ll just say res. send uh and we’ll send a message that says too deleted very simple and just like that we have officially completed project number two and chapter number three we now have a fully functional application if I once again uh come over to the dashboard and try sign in with gilgames run our password that fails so I’m going to sign up I can now submit my sign up we add that default entry to the database uh I can say that we’ve done it even though technically we haven’t that changes the tab that it’s available in I can add a new to-do let’s say uh go to the gym add that that gets put in here I can refresh the page those are fetched back for us I can say I’ve done that now that is in the complete tab we have no open to do so I can add one it says hello that is now added and I could go ahead and delete that other entry and then if I refresh the page everything is persisted and our project is complete so that is absolutely brilliant massive congratulations uh once again if you do want to support the channel be sure to St the repo love that support and with that project complete is now time to dive into chapter 4 our final project where we’re going to take this code base to the absolute Moon all righty welcome to chapter 4 of this full course where we’re going to take our back-end programming skills to the absolute Moon by building out the ultimate backend application now in this particular project we’re not going to start the same as we have started the rest of the projects where historically we’ve run that mpm in- y command and we’ve built our project up from there installing all of the modules from the npm ecosystem and creating all the files and folders instead this chapter 4 project number three is actually going to be an evolution of chapter 3 chapter 3 is like the beginner version of developing a complete backend application and chapter 4 is going to be like what you would find in a company or massive Tech organization Enterprise level absolute God tear backend infrastructure as for what exactly is being evolved well there’s two things actually three things I guess in particular that are going to change and upgrade from our previous project number one is the database in the previous project we used sqlite which is a brilliant database but if you’re going to build a big production level application then you want to go with a more reputable SQL database such as MySQL or in this case postgress SQL postgress SQL is my all-time favorite SQL database and in this project we are going to put it to good use the second core difference between chapter 3 and chapter 4 is that in this project we are no longer going to be writing out these custom SQL queries instead we’re going to use what is known as an OM or an or or an object relational mapper and what this does is it’s like a middleman between our postgress database and our JavaScript where we can now interact with our database our SQL database as if it were a JavaScript entity and that is thanks to our o the middle man and in this case we’re going to be using an RM called Prisma it’s very popular we’re going to learn how we can integrate it into our project with postgress and we’re also going to learn about all the other advantages that come with using an RM because there are many and last but absolutely not least is that we’re going to dockerize our entire project now in chapter 3 we actually had our database and our server as essentially the same entity and this project is going to be different they’re going to be two separate environments that means that our server is going to have to communicate to an external database and both of these are going to be their own independent Docker environments now this is a much better practice because if your server breaks down it doesn’t mean that your database has to completely restart itself and it also means that our database is going to be able to persist data that much more effectively so at the end of the day these are some absolutely massive changes and there’s also going to be a whole lot of other stuff that we will learn as a product of making these evolutionary changes so it should be loads of fun now as for how we’re going to kick this project off it’s not going to be like the previous chapters either where we have previously run npm inet Dy and then installed all the packages from the npm ecosystem and then built up our file directory from there in chapter 4 what we’re actually going to do is create a duplicate of chapter 3 by right clicking Hing copy and then we’re just going to paste that folder directory and we’re going to end up with a duplicate that we can then rename and we can turn it into chapter 4 so now we have our code base that we can go and rip to Pieces keep the core logic keep the server and make any necessary changes so that we can create this evolved backend project now the first thing I’m going to do inside of here is come into our package.json because I’m just going to change the name of our project and I’m also going to change the description just here so this is instead going to be a dockerized full stack application that uses no jst backend uh a postgress SQL database a Prisma RM and JWT authentication so those are going to be the core changes made in this project now as for what we’re going to start off with we’re going to install the necessary packages that we need for these new techies that we’ll be using in this project specifically we’re going to need to npm install number one is Prisma number two we can space separate different ones is Prisma SL client and number three is a package called PG and we’ll learn what that does later but essentially it’s just a client for postgress so if we hit enter that’s going to install those packages and I realize I’m a muppet we actually need to uh first change directory into our new project so I’m just going to CD into chapter4 and then run that command once again and that will install them all and now we can see inside of our package.json we have our Prisma client we have PG and we have Prisma now the second thing I’m going to do and this is where we’re going to start off making these modifications is we’re first going to create this Prisma client and the way that we kick that off is we start by typing in a command npx Prisma in net and if we hit enter on that that is going to create a Prisma folder inside of our chapter 4 uh project directory now inside of this Prisma folder directory we can see there’s a file called schema. Prisma now if you’re unfamiliar with what a schema does it basically is a folder that specifies the structure of our database so if we think about chapter 3 when we created this database. JS file we specified what we wanted our tables to look like inside of our SQL database well the schema does it in a similar way where instead of using a SQL command we create it as if it were a slightly complicated object and that’s because it’s going to allow us to interact with it as if it were some form of object and that’s just going to ensure that our code stays much much cleaner so we’re going to open up the schema. Prisma and there’s about 15 lines in here and we can just leave them all in there now there might be a couple of lines in this file that are a bit confusing but we’ll only add about 14 lines if you want to learn more you can check out the docs at this particular link and it will explain everything you need to do but you know you’ll also learn by doing in this particular case so when we create this conversion between our SQL database and JavaScript we essentially need to create a model for our JavaScript to allow to interpret these SQL tables so in this case we’re going to need two models so we say model and then we Define the name of the model which is the user and this is essentially going to be uh the structure of the model it’s kind of like predefining what the tables are actually going to look like so we’re going to create this user object and in here what we’re going to do is just like we did before we’re going to create the columns that are going to exist inside of that table so the first one is going to be an ID as we had before then I’m going to tab across and specify that it’s of integer type typ and then I’m going to tab again and in this case we’re going to have some extra parameters so the at ID is going to have an at default and that’s going to autoincrement and we’re going to call that so this just here basically says that you know this is going to be a default parameter which means that we don’t need to specify it when we create a new user and we want it to autoincrement as we add new users now the second field in here is going to be the username and that is going to be of type string and in here we’re just going to ensure that that is unique cuz we can’t have two users with the same name underneath that we have a password column and that is going to be also a string and there’s nothing special there and last we have the to-dos and this creates the relation between the two tables and this is just going to be a to-do as an array so that’s going to be our user model that we’re going to use to basically interact with our postgress SQL database using a JavaScript syntax now we’re going to Define our second model which is going to be the to-do this is what our to-do is going to look like and once again we’re almost done with this file but if you want to learn more about all of the information is in this docs for this project once we create this schema then we just initialize our database using the schema and we’re good to go so for the to-do the first parameter is going to be an pretty similar this one is going to have of type integer and it’s also going to be at ID at default uh autoincrement so that’s pretty straight forward uh the second field for that to do as per the chapter 3 project was the task and that is of type string uh underneath that we have a completed status and that is of type bull in and in here that’s going to have a default at default of false because when we add a new to-do it probably hasn’t been completed yet underneath that we’re going to have a user ID so that’s going to associate the task with a particular user that’s going to be an integer field and finally we’re going to associate this table with the users table so we’re going to have the user and that’s of type user and in here that’s going to be a relation so we’re going to type at relation and we’re going to have the fields and in here we’re going to have a user ID inside of uh the square parentheses and then we’re going to create the references and that’s going to be exactly the same just like that and this is the schema we need for our entire database we can see up here it’s already configured for a postgress SQL database and we have the client so this right here is the code we need to configure our postgress SQL database for a user or the table specifically and here we have another table a predefined template for our to-do uh table so we can go ahead and save that schema now one of the reasons why an OM is also absolutely brilliant is because of a concept known as migrations with our previous project if we were to deploy this live to the internet we create these tables suddenly if we’re in a production environment and we need to change what our database looks like that’s going to be incredibly complicated how do we go back through and make these modifications to all of the entries inside of our database you know like if you have 100 users that are using a primitive form of your database and then 100 that are using a later version essentially version control of your database becomes incredibly complicated when you have loads of users relying on it on a daily basis using something like an omm allows you to easily introduce the concept of migration so essentially what it is is it’s just a record of all the modifications that have been made to the database and when you run your migrations every instance of your database is updated to reflect these changes so it’s always the most recent version and it’s also you know supports these Legacy entries so all of the previous entities have been updated to reflect these changes so essentially what’s going to happen is eventually when we create our postgress environment our postgress SQL database we will run our very first migration and it will format our database to match the schema but we’ll see how that works in a second I know that can be a little bit confusing now the other file we’ll need for our database is a Prisma client and I’m going to create that inside of the source directory so we’re going to create a new file called the prison client.js now this file is pretty equivalent to the database without all this funny business down below we’re just going to create an entity a Prisma entity through which we can interface with our postgress SQL database so in here what we’re going to do is import the Prisma client from at Prisma client pretty straightforward then we initialize it we say const Prisma is equal to new Prisma client and we invoke that we instantiate that class and then finally we export default Prisma so that we can access this Prisma entity from anywhere inside of our project that’s that file totally complete and now that we have it we can go ahead and see how radically improved writing and interacting with our database can actually B so the database interactions that we’re going to be modifying are within these orth routes and these to-do routes and I’m going to start off with the or routes so just up here we can see this is the code that we used to do to create a new entity inside of our uh user table and then likewise with the to-dos tables where we prepare the SQL query and then we execute them well once again with Prisma it’s a little bit different so what I’m going to do is start off by deleting this insert user query and instead I’m going to say const user is equal to and I’m going to await because now that our database is a third party entry the communication between the server and the database is an asynchronous process so we have to make sure that our in points are asynchronous so now what we do is we await Prisma do user we access the user model that we have created and all we do here is we say we call the create method and we pass in an object the object has some a data field and that is an object itself and in here we provide the username and the password which is the hashed password and just like that we have created a user using a JavaScript syntax so that is super easy now the second we have to do is insert a to do I can get rid of all these SQL entries and now I can just await Prisma do too. create pass in an object as an argument and in here we have a data field that’s also an object and we just have the task which is the default too and we have a user ID field which is the user. ID it’s super simple this user right here is just what gets returned it’s essentially just that model object that we created inside of the schema and that’s all we need to do well actually there’s one more thing now we just take this user to do and replace this code just down here and we have now updated this file to instead use the OM Prisma instead of having to manually write out all of these SQL queries so that is the registration done let’s see what it looks like for the login well for the login we can get rid of these two lines right here and we can say const user is equal to we await since we’re having to await we need to make this endpoint asynchronous so we just throw an async in front of that function and then we just await Prisma the user entity and we find a unique entry that takes an object and in here we specify a where Clause so it’s kind of like the SQL logic where we say where ID is equal to ID or in this case where username uh is associated is matches the username that we have entered just here and that is literally all the code we need to find our unique user so I can now save that file and that is complete now we’re not quite ready to boot up our project just yet because we haven’t actually instantiated our postgress Docker environment just yet but we will get to that very shortly first we’re just going to update some of these to-do endpoints so that we can finish up with our Prisma omm configuration so first one first let’s get all the to-dos so we’re just going to remove this code right here and we’re just going to say const todos is equal to we’re going to await we’re going to import Prisma which we also had to make sure we did inside of this file and I actually didn’t do that so that’s me being naughty let’s make sure we import Prisma from our Prisma client save that make sure it’s imported inside of our to-do routes as well and then we can await Prisma we access the to-do table and we find many cuz we’re getting a lot of them and in here we provide an object and we just say where and we want to return the entries where the user ID matches the request. user ID now once again this needs to be changed to an asynchronous endpoint and that is all we need to do to access all of the to-dos where the user ID matches the ID present in the request super straightforward once again you can find the documentation for all of these Methods at that link inside of the Prisma client but we’re going to demonstrate how most of it works inside of these endpoints for creating a new to-do you might be starting to get the hang of this now all we’re going to do is throw in an async key right just in front of that endpoint function and now I’m just going to uh say const Todo is equal to and await Prisma do too. create and in here that’s an object and it takes a data field that’s also an object and we just provide the task and the user ID which is the request. user ID super simple and then we just return the to-do so we’re going to actually just send back the to-do we don’t even have to create all these fields it’s just manually assigned to this variable and we will get an object that represents this new to-do for the put entry once again super straightforward we can get rid of all of this logic and I’m also just going to get rid of this query as well and in here we just say const updated to do is equal to this one also needs to be made asynchronous so we throw the async key in there I’m also just going to throw that in front of the delete one while we’re down here so I don’t forget to do it and then we just say equal to we await Prisma do too. update and that just takes an object and we say where that’s also an object and we say where the ID is equal to and we’re going to pass integer so we’re going to convert it to a numeric value and we’re going to access the ID that’s the field right there and then after that we’re also going to match the user ID with the request. userid field so that’s going to make sure we update to-dos only where the to-do ID matches and also we have the correct user and then underneath that we just provide the new data which is just going to be the new completed field super straightforward and I think in this case the completed field is currently going to come through as as a numeric value so what I’m going to do is throw a double exclamation mark in front of it and that is going to convert it to a Boolean amount and then once again we can now just send back the updated to-do done and done super simple so this command just to summarize we update the to-do where all of the IDS match to confirm it’s the correct to-do we want to update and we provide the new data and we force our completed field to become a Boolean value by throwing the double exclamation in front of it that’s a little secret hack it’ll convert anything to its uh truthy or falsy state and finally we have the delete field once again pretty straightforward we get access to the user ID and in here we just uh await Prisma do too and we use the delete method that takes an object and we just specify where and that is exactly the same as this where Clause up here so I’m just going to pass that in and we are done that’s literally all it takes to use the Prisma RM super straightforward and I actually think I can uh do that and that we’ll use that value instead so it’s just so much tidier than having all these SQL commands all throughout your files and just like that we have configured all of our end points to use the Prisma omm and also to be relevant for postgress which is super important so now that we’ve done that we are now ready to dockerize our environments and actually get our postgress SQL database up and running and then the last thing we’ll do is we’ll see how we can create a composed. yaml file which essentially just configures everything so we can boot it all up in one command all right it is now time to get our hands dirty with Docker and container Miz some of our uh infrastructure so the first thing we’re going to need to do is actually boot up the docker desktop on our device if you recall we installed it earlier and the link to install Docker is available in the description down below so you can just come over to that link and hit download Docker desktop and select your operating system now when you have it open it should look something like this particular screen here where all of our environment are referred to as containers we contain our application it’s like its own mini environment where we can configure how to set it up and consequently what code to run now everything else is going to be done from the terminal uh from the command line so we can just move that to the side we don’t necessarily need it open but but we have to have the application running and there will be some advantages to having the client open later now the way that we go about containerizing our environments is by creating a doer file file now the docka file is basically just an instruction sheet on how we can create this environment so that it has everything it needs to run our little you know infrastructure backend infrastructure be that the database or our node.js server so what I’m going to do inside of chapter 4 is create a new file and it’s just going to be Docker file just like that doesn’t even have a file extension now inside here as I said a second ago it’s pretty much an instruction sheet and the first instruction we need obviously we’re running a node.js application is so we need to set up this environment to have access to nodejs so what we’re going to do I’m going to leave a comment just to walk us through these steps I’m going to say use an official node.js runtime as a parent image now I remember when I first came across this term image I was just thinking oh it’s like a picture and that’s kind of true but in this context an image is actually more like a snapshot it’s a snapshot of a separate instruction sheet so when we eventually create this Docker file and build our container what is happening is we’re creating a snapshot of that environment and then whenever we run our containers we can just run that snapshot and get us right back to where we were and we can build off pre-existing images in this case we’re going to build off the node.js official image and that’s just going to take a snapshot of the node environment that we specify and add it to our new environment that we’re creating via this Docker file so we’re just going to use the command from and we’re going to say node version 22- Alpine so that is the official node.js image we need to throw into our new containerized environment now this Docker file just here is specifically for our nojz application we’ll see how we can get our postgress SQL environment up and running very shortly so the second line we need is to Now set the working directory so we’re creating this new environment we need to specify a folder for our project so we’re going to set the working directory in the container and we do that using the work dur command and we’re just going to say slash app that’s where our working directory is going to be step three now that we’ve got that no JS Bas image and we’ve got our working directory we need to copy the files from our local project into this new environment because basically it’s like you’re setting up a new computer you need to copy all your stuff across so we’re just going to copy the package.json and the package-lock do Json files to the container and the command that we use to copy stuff from our local device into our Docker container is the copy command and the files we’re going to copy first is going to be the package.json files so we’re going to specify that we want to copy any file that has a package in it so then we’re going to use the uh little asteris so that it selects both the package-lock and the package.json and then we just want them to be the um Json file so that is from the source and then the destination we want to copy it to is the period which is going to be the current working directory which is the SL app so it’s going to copy these two files from our local device and slam them into the app of our Docker environment now that we have access to the package.json we need to install all of the necessary npm packages or dependencies that we need for our project and since we have access to the nodejs and consequently npm ecosystem we can do that very easily so we’re going to install the dependencies now traditionally we’ve typed mpm install and then we’ve specified the name of the package however if we just want to install every dependency inside of our project we can do that using the npm install command and we don’t have to specify any packages and what that will do is it will just read our dependencies list and install them all and since we have access to that file inside of our Docker environment we can just use the Run command and we can run the mpm install command inside of our Docker environment and that will install all of the dependencies now that we have the dependencies installed we’re good to copy the rest of our application across so we’re going to copy the rest of the of the application code and that is once again a copy from The Source destination which is our current file which is chapter 4 to the destination which is the uh SLA directory the period is the current file and this is the current working directory inside of our uh Docker environment and the destination so that’s going to copy all the remaining source code across now the reason we separate these commands is because the way that a Docker image is built is from the top down and if we change some of our source code when we next go to build that image it will rebuild our container from any files that have been changed if all of this stuff is exactly the same which it likely is we’re probably not going to be installing any more packages and that means all of our dependencies will remain consistent Docker is clever and so can cach all of this build information and it can just rebuild the image from the changed line so technically we could just copy everything in one go but that means that when we make changes to our source code we would then consequently have to recopy this line and reinstall all the dependencies which we can avoid if we just copy first the package.json then install the dependencies and then next time we change our source code we can just rebuild our image from this line down you don’t have to do this but if you just want to make the process of building your containers slightly more efficient this is a good way to do it so this line copies our entire source code across to the container now that we have copied all of our source code across what we need to do is expose the port that the app runs on now what this means is that when we create this Docker container and we run our application inside of it it’s essentially walled off from the rest of the world and what we need to do is open up its ports and we map an external port to an internal port and we’ll see how we can do that later but the point is we need to expose the port that we run our application on for consistency I’m going to do 53 and we can see just here this uh EXP expose command exposes the port that the container should listen on Define Network ports for this container to listen uh on a runtime so once again just to summarize this line we need to tell our environment to open up this port to incoming Network requests from whatever Source if we didn’t have this line it would be an impermeable barrier that we couldn’t send Network requests into so it’s just like opening up a wormhole between our real environment and this docker environment and then once we’ve exposed this port we are now good to boot up our application inside of this Docker container so we’re going to define the command to run your application and the way that we do that is with the command uh command CMD and in here we have an array of the strings or words needed to boot up our application now the way that we typically run a file is we say node and then in a separate command we specify where node can find the executable file which in this case is the/ source server.js file and you’ll note that that is fairly equivalent to what we see inside of our script we say node and then we specify the file but we obviously have all of this jargon in the middle now because in chapter 4 we’re using postgress and we’re not using any experimental feat feates inside of node.js we can remove all of these uh all of these different lines so that’s super handy so we can have a very simple startup script we don’t need any of these experimental features and consequently this is the command that is going to boot up our application and I actually think these are meant to be the double quotation mark not the single quotation mark so I’m just going to change that very quickly just like that so that is now happy and that is going to boot up our application inside of this container and it’s going to listen to incoming Network requests on Port 53 so that is officially the instruction sheet that our container needs to get our application up and running now this is super cool because anyone on any operating system can suddenly start up our application inside of this little environment another good example is that if you wanted to run postgress normally what you would have to do is install postgress on your device and then you can up get postgress up and running in this case at no point have we installed postgress because our postgress environment is going to run inside of its own little Docker container and so we can just actually tell Docker to create an environment with postgress installed and we can just refrain from installing it on our device so it just makes it super easy to you know deploy your code to a different environment for someone else to download your GitHub repo runner on their device uh and at the end of the day for for us just to not have any software installed on our computer aside from Docker and we can still boot up these amazing applications so Docker is super handy and it is ubiquitous which is a good word so now that we have this instruction sheet what we have to do next is actually build this container we’ve just created the instruction sheet to build this container however one thing we need to do first since we’re not running these uh experimental Flags we need to make sure that we’re not involving sqlite node sqlite anywhere inside of our project currently we still have the import line for our original sqlite database in our routes files and that means that when we boot up this container it’s going to try and execute this database as well as having a postgress database now the problem with this is that since we haven’t enabled at these experimental Flags that’s going to break our container so we don’t have to delete these files we just need to delete the imports from the or routes and the to-do routes to that database file and then it’s just going to sit there not doing anything so that is absolutely fine so we’ll just remove those two lines from our code base and now we’re almost ready to build our container the one last thing we need to do before we actually build our container is finalize our Prisma setup now the way that we’re going to do that is is from our terminal and we need to run a command that is going to generate a config file for our Prisma client now the reason we have to generate this config file is because it’s specific to our schema which is our database structure essentially and every time we change or modify this schema we need to rerun this command and the command can be found within the readme.md file if you’re looking for it later there’s a whole instruction sheet just here on how to get this up and running but essentially what we’re going to do from inside of the chapter 4 directory now that we’re finished with the schema is we’re just going to run npx Prisma generate and hit enter on that command so now that has generated that Prisma client and it saved it inside of our node modules that is all done we are almost ready to build our containers currently if we went head and build our containers we would build our Docker file for this nojz application which is brilliant however the problem is that doesn’t help us with our postgress SQL database now to configure a postgress database inside of a Docker container you essentially just need to run this first command and because it’s just one line we can actually do that from what’s known as a composed. yaml file now where a Docker file is essentially an instruction sheet for creating one Docker container when you have an application that uses potentially you know numerous or even tens of different containers or environments you need to define a configuration sheet to boot up all of these Docker environments so what we’re going to do is create a new file and that’s going to be called docker-compose yaml y ML and I’m going to hit enter on that and so where the docker file is the setup instruction sheet for a singular Docker container the docker composed. yaml is a configuration sheet for our conglomerate of docka files or individual containers it’s kind of like a glorified specs sheet now there’s a few different lines in here and I like to think of it as kind of like a bullet point specification list so we just have a bullet point and some tab indentation of all of the different specs we need to get all of our containers up and running in one Fell Swoop so in here the first uh parameter we have to specify is the version which is going to be version three that line doesn’t really mean much underneath that we have a line that means a whole lot more it’s called Services now inside of services underneath that we’re going to tab across we’re going to indent it and this is where we Define the configuration for all of our different containers the first one is going to be the app container and that is going to be the nodejs docker file that we just created so in here I’m going to use a semicolon enter and then tab across once again it’s like indentation uh instead of using bullet points now this app needs what’s known as a build line now we use the build line when we have a Docker file to build that container

    and the path to that Docker file is just the period which is the current directory so it’s going to look in the same directory as this yaml file and it’s going to find the docker file and that’s the instruction sheet it’s going to use to build that container now underneath that at the same indentation we’re going to have a container name and that’s just going to be too- app underneath that we’re going to have an environment parameter now the environment parameter is for specifying the environment variables where in the previous projects we use a EnV file in this case we can do it directly from the specs file which is super handy now the first parameter we’re going to need in here is a database and this is all upper case URL now if you remember inside of this schema Prisma we had to provide a URL and this was the default code created when we created this file so this was already there and it’s looking inside of the environment variables for a database URL now in chapter 3 our server and our database were one unified entity however in this project we have one container with our database and a separate entity that has our server and so we need to provide our server with an address address to locate our database container and that is the database URL so in this case the database URL is a little bit complicated and I’m actually just going to copy it across so once again if you head over to the GitHub code you’ll be able to find this line uh just look for the docker composed. yl file and copy this line across and this is the address we need to find our database and if you do check out the GitHub be sure to St the project love that support so this will give our node server an address through which it can communicate with the database now the second environment variable We need oh I added a little um quotation mark there the second environment variable we need for this app is the JWT secret and that’s an environment variable we’re familiar with from chapter 3 and in here I’m just going to provide a random string your J wtor secret here now this string can be any string it could be a jargon string a whole bunch of random mumbo jumbo or it could be something specific to you but once again environment variables are secure they are protected and it’s something that only you should have information to so whatever string you choose to put here whatever selection of characters just make sure that it’s only available to you equally you know you could just use this uh for development as secure on your device so it’s not going to be a huge issue the third environment variable is called the nodecore EnV so that’s the node environment now typically there’s two environments sometimes there’s three one is development so when you boot up your application in development that’s going to be the development environment another example would be a production environment and a third one might be staging which is typically somewhere in between it’s the envir prior to deploying your code to production now the reason we like to specify this as an environment variable is because sometimes we have code that runs specifically in development other times we’ll have code that only runs in production and so if we specify inside of the environment variables that just allows us to have the same code base but change one line and that’s going to basically specify what environment that we’re in so in this case it’s going to be the development environment finally we’re going to specify the port environment variable which if you recall inside of our server.js we read the port for our app to listen on from the environment variables under the port key we obviously have a backup of 50003 and in this case the port for me is just going to be the same 50003 so that is our environment variables complete now underneath the next parameter we have to specify for our app is the ports now this is called Port mapping where what we do is we match an external port to an internal port and just like exposing we basically set up a configuration for an external network request to meet a port and we match that to an internal Port so in this case I like to keep them the same and that’s going to be a string and we’re just going to match 5,000 in your case or 5,000 and3 in my case to Port 53 so this is the external port on our container and this is the internal Port that we’re going to match it to or map it to underneath the ports the next parameter is going to be a depends on field now obviously our server depends on our database now we haven’t configured the database service just yet but we will do that very shortly but all we do is we just tell it that it depends on the database so one is dependent on the other and that is going to interconnect the two of them and then the last field we need is a volumes field now the volumes field essentially what it does is it creates a database or a storage or a history record of our server so if we didn’t have a volume every time we booted up our container it would be a blank slate if we do have a volume that is a place for us to save the previous state of our container so that when we shut it down and we boot it up again we can just read back from where we were so in this case the storage for basically where we’re at is going to be in this directory right here that is the volume and that is just going to persist any configuration any data any information that is available inside of our container and this would only be erased in the case where we would actually delete the container and rebuild it from scratch so it’s important to have a volume so that your app can remember essentially and that is our first app service complete now the second service which has to be at the same indentation as the app is going to be called the database now the database doesn’t have a build file because we haven’t specified a Docker file for it and as I said earlier instead what we’re going to do is just build it directly from an image now the image we need for our postgress database is the postgress version 13- Alpine and obviously this image is one tab indented from the database which is one tab indented from the file left hand side the indentation is super critical for this file so if you’re uncertain be sure once again to compare it to mine in the GitHub repo now that we have the image which is basically all the setup we need for creating that environment with postgress inside it we’re going to give the container a name so the container name is going to be the postgress database and under that we’re going to have some environment variables now these ones are going to be a little bit different essentially the first one is going to be a postgress user and this is all uppercase and the username is going to be postgress so we’re essentially just defining the login information if you wanted to be a hacker and modify the database behind the scenes so it’s just all of these security credentials for the database the postgress password password is going to also be postgress and finally the postgress uh database name is going to be called to-do app and this to-do app right here has to match this to-do app at the end of the uh database URL so they need to match perfectly now with that done we can specify the ports and the port mapping I want for our database standard practice or convention is is to match 5432 to Port 5432 so that’s just going to map Port 5432 on the outside or the external port for our container to the same port on the inside of our container and once again if we come up to our URL we can see that the port we are using for our database URL is 5432 and as I said that’s just convention and we are going to stick with it now the last thing we need and it’s arguably more important in the case of our database is to specify the volumes which is once again going to just create a data persistence for this container if we didn’t have this once again every time we rebuilt our container or reran it it would be starting from scratch and that’s not very convenient when you’re working with a database we need an environment that’s going to persist data until we literally delete that container off the face of the earth now the volumes URL where this data is going to be saved is a little bit more comp licated in this case it’s just going to be postgress ddata semicolon SLV SL lib SL postgress SQL slata that’s where all of the information is going to be persisted so that when we reboot up our container we can pick up right where we left off finally we’re going to have one more field and this is going to be at the far left hand side so no indentation and that’s just going to be called volumes and that’s going to be postgress Das data just like that and with that our specs file our composed. yaml file is complete so now that we’ve configured this specs file which basically is an instruction sheet on how Docker can boot up every container needed for our application we can actually go ahead and build these containers and finish our project now there’s a few steps required to build these containers and consequently get them up and running one such example is when we have our postgress container working we need to then go into it and make sure that the tables are created with inside that database and we’ll see how that works shortly all of the commands that I’m about to run are available within the get started section of the remy. md4 chapter 4 so you can find all the commands we’re going to be using just there it is a bit of a step-by-step process but once it’s done it’s complete and you now have This brilliant application so we’ve completed the docker compos yaml let’s go ahead and build all of these containers now with Docker open in the background the First Command we’re going to go ahead and write is called Docker space compose space build now what this is going to do is it’s going to build our containers from that composed. yaml file and here we can see it’s starting off by running all of those Docker file commands first for our app and then it also does it for our postgress database now with these containers built we are now ready to boot them up based off the images that are created or the snapshots of these completed environments and we can go ahead and run these virtual environments however before we go and run them both together we now need to make sure our database is updated to match the schema. Prisma file or essentially the tables necessary for our database now the command we’re going to use for that is a little bit complicated it’s Docker compose but instead of running build we are going to run our app and inside of the app we’re going to execute the command npx Prisma migrate Dev d-name and net now what this command is going to do is it’s going to migrate or run a migration for our database and that is going to our schema file and it’s going to create the first version history for essentially any modifications made from our database which in this case is going from a completely blank database to having the necessary tables we need for our application so I’m going to hit enter on that command and we can see what has happened is it has run our postgress container executed this command and it has created the to-do table and the user table so that is excellent we can see that it’s also created a migrations folder inside of our Prisma file this is absolutely normal and it’s not something that you want to Middle with uh it’s just the record history of any modifications made to our database so that is all done now that we have finished that line our database is set up to you know have all the tables we need for our project we can go ahead and boot up our two Docker containers and the command we use for that is Docker compose and then up now you can also specify a/d flag and what that’s going to do is it’s just going to boot them up in the background and give us access to our terminal again but I’m not going to use that in this case so if I go ahead and hit enter we can see now we are running two Services we have a container called postgress DB and another one called to-do Das app and we can see that our to-do app has actually executed that console.log that we have when we tell our app to listen on Port 5003 that is now running and even cooler if we open up Docker desktop we can see here we have chapter 4 we can click on that we have our two containers running and we have a nice log that we need for our whole project so that is absolutely brilliant you could look at the logs for each of them specifically by clicking on each one but here we have just a global log for our containers and with that done that is actually our app running inside of these two containers and so we should be able to I’m just going to clear out any um tokens that we have saved just here so that we can start from scratch so let’s refresh this page we have absolutely nothing local storage I’m going to delete that token so we have our blank application we can see that it is being served up on Port 5003 which is super cool because I’m not running this I’m not running npm run Dev you know like our application is running inside of this container and it’s serving up our application I could go ahead and try to log in and that fails to authenticate which means that our backend endpoints are working what I could also do is I could come into our client emulator right here out too app. rest and I could run this register command I could emulate this client request and we can see that that works and that registers us and it gives us back a token I could try log in again I haven’t actually registered a user inside of the application but that client emulation has created that entry inside of the database so now when I submit I’m actually logged in and we can see that we even have it to-do which means that our backend inside of this container wrote to our postgress database I could go in here and add a new to-do and I could say that this first one is complete and now I can refresh the page and we can see that this data is persisted inside of our database so that is super cool and it gets even cooler what I could do now is I could go ahead and delete this token once again refresh the page that’s going to log us out and I could create a new account and this is going to be test gmail.com here’s a basic password and I can log in and we now have a new user created and what we’re going to do now that we have made all these uh entries to the database is we’re going to see how we can log directly into the database which is a place where we can modify it directly using SQL queries so if I come up here we have all of this uh loging going on inside of our Docker container and what I’m going to do is create a new terminal instance and that’s going to keep this one running in the background but it’s also going to give us a terminal that we can run some new commands in now to log in directly to our database is a slightly complex command and it is once again available inside of the readme.md file but essentially what we do is we write Docker execute exec Das it and we specify the name of the database which is postgress – DB we specify the user which is postgress if you recall that’s what we uh specified in the environment variables and then we specify dasd and we call uh call the to-do app container now if I run this command it’s going to log us into our database uh and I realized I made a mistake just there I missed out one command just after the DB we need to have a psql which is the postgress SQL command and that is important to put in front of all these flags so if I go ahead and run that in that logs Us in directly to our database where we can go ahead and run SQL commands now the first SQL command I’m going to introduce you to is the back SL DT now if I hit enter on that that’s going to show us all the tables available inside of our database so just here we can see we have three tables one is a history of our migrations which are the modifications made to our database or the changes made over time kind of like a version history we also have a to-do table which has all of our to-dos and we have a users table and what I could do now is run the select query to read all the entries inside of a table I could say select everything from the to-do table and what we’re going to do here is just wrap to-do inside of the double quotation marks and that’s just going to match the casing which is going to be super important and when you run these commands you always have to finish them off with a semicolon so if I hit enter on that we can see here we have a table that shows all the data inside of our database we can see we have three tasks we can see the first one is go to the gym and we can see that’s true and we can see that’s currently incomplete it’s completed status is false we can see that we have hello add your first too and and that’s true if you remember I clicked complete when we added our first to-do go to the gym and we can see that both of these to-dos are associated with our first created user and then we can see we have another to-do and that’s associated with our second user and then if I wanted to exit out of this database all I do is I write the Quick Command and that gives me back access to my terminal and you could log in you could actually run a whole lot of crud actions using SQL commands directly on the database and all of these changes would be reflected in the front end but ultimately that is our backend application complete we’ve seen how we can build a server we can set it up to listen to incoming requests to its Port we can add authentication and database interactions we can serve up a front-end application that can create network requests between the front end and the back end so it’s ultimately a full stack application we’ve added middle wear to add authentic ation protection to a whole lot of our crud to-do endpoints we’ve created Docker containers for two different environments one is our server and one is our postr SQL database and we’ve seen how we can boot them up and run them as a collective application for development and this is such a cool backend project to have because a lot of the backend infrastructure you would find at almost any company is just going to be a slightly more sophisticated or equivalent version of what we have just coded here in this F course so I’m super proud of you well done for persisting to the very end congratulations you should pet yourself on the back learning back in development is you know takes a lot of time and practice but now you have an absolutely amazing codebase 2 reference as you build out some absolutely amazing backend applications in future thank you guys so much for sticking with me throughout this course I hope you’ve had a thoroughly good time and if you have enjoyed the course don’t forget to smash the like And subscribe buttons I’ll catch you guys later peace learning to code if so be sure to check out the learn to code road map or dive straight in with these videos that’s a good one

    By Amjad Izhar
    Contact: amjad.izhar@gmail.com
    https://amjadizhar.blog