<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Pujan Bhattarai's Blog]]></title><description><![CDATA[Pujan Bhattarai's Blog]]></description><link>https://pujanbhattarai0.com.np</link><generator>RSS for Node</generator><lastBuildDate>Wed, 15 Apr 2026 18:31:48 GMT</lastBuildDate><atom:link href="https://pujanbhattarai0.com.np/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Deploying a Two-Tier Application Architecture on AWS]]></title><description><![CDATA[Overview
A two-tier application architecture is one of the most common patterns used to build modern web applications. It separates the application into two main layers:

Application Layer – Runs the frontend and backend logic, usually on a web serve...]]></description><link>https://pujanbhattarai0.com.np/deploying-a-two-tier-application-architecture-on-aws</link><guid isPermaLink="true">https://pujanbhattarai0.com.np/deploying-a-two-tier-application-architecture-on-aws</guid><category><![CDATA[Devops]]></category><category><![CDATA[ec2]]></category><category><![CDATA[rds]]></category><category><![CDATA[two-tier application]]></category><category><![CDATA[WordPress]]></category><dc:creator><![CDATA[Pujan Bhattarai]]></dc:creator><pubDate>Thu, 18 Sep 2025 13:38:52 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1758203317222/3f5dd3fc-0b43-4996-a6c4-18d79029e62b.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-overview">Overview</h2>
<p>A <strong>two-tier application architecture</strong> is one of the most common patterns used to build modern web applications. It separates the application into two main layers:</p>
<ol>
<li><p><strong>Application Layer</strong> – Runs the frontend and backend logic, usually on a web server or application server.</p>
</li>
<li><p><strong>Database Layer</strong> – Stores and manages data using a database server.</p>
</li>
</ol>
<p>This separation makes the application easier to manage, scale, and secure compared to a single-tier setup. In this blog, we’ll walk through how to deploy a two-tier application on AWS.</p>
<h2 id="heading-objectives">Objectives</h2>
<p>Our goal is to design and deploy a <strong>two-tier application</strong> on AWS consisting of:</p>
<ul>
<li><p>An <strong>application layer</strong> running on Amazon EC2.</p>
</li>
<li><p>A <strong>database layer</strong> running on Amazon RDS.</p>
</li>
</ul>
<p>We will use AWS networking services to securely connect the two layers.</p>
<h2 id="heading-prerequisites">Prerequisites</h2>
<p>Before starting, make sure you have:</p>
<ul>
<li><p>An active <strong>AWS account</strong>.</p>
</li>
<li><p>Basic knowledge of <strong>Linux commands</strong>.</p>
</li>
<li><p>Understanding of how <strong>EC2 and RDS</strong> work.</p>
</li>
</ul>
<h2 id="heading-high-level-architecture"><strong>High-Level Architecture</strong></h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758193075819/c322bebe-18a2-4b9f-94f0-02180d352845.jpeg" alt class="image--center mx-auto" /></p>
<h2 id="heading-task-1-create-a-vpc-and-networking">Task 1 — Create a VPC and networking</h2>
<ol>
<li><p><strong>Open AWS Management Console</strong> and choose the nearest region (e.g., <strong>ap-south-1 (Mumbai)</strong>).</p>
</li>
<li><p><strong>Create a VPC</strong></p>
<ul>
<li><p>Services → VPC → Create VPC</p>
</li>
<li><p>Name: <code>VPC-EpicReads</code></p>
</li>
<li><p>IPv4 CIDR: <code>10.0.0.0/16</code></p>
</li>
<li><p>Tenancy: Default</p>
</li>
<li><p>Add tags (Name = <code>VPC-EpicReads</code>) → Create</p>
</li>
</ul>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758196372837/3e2e1c34-92fb-4df4-bf4c-553e1034ce24.png" alt class="image--center mx-auto" /></p>
<ol start="3">
<li><p><strong>Create subnets</strong> (across 2 Availability Zones for separation)</p>
<ul>
<li><p>Create Public Subnet 1: <code>10.0.1.0/24</code> (AZ: us‑east‑1a / ap‑south‑1a)</p>
</li>
<li><p>Create Public Subnet 2: <code>10.0.4.0/24</code> (AZ: us‑east‑1b / ap‑south‑1b). Subnet CIDRs within a VPC must not overlap, so this cannot reuse <code>10.0.1.0/24</code>.</p>
</li>
<li><p>Create Private Subnet 1: <code>10.0.2.0/24</code> (AZ: us‑east‑1a / ap‑south‑1a)</p>
</li>
<li><p>Create Private Subnet 2: <code>10.0.3.0/24</code> (AZ: us‑east‑1b / ap‑south‑1b)</p>
</li>
<li><p>Tag the subnets with clear names (e.g., <code>PublicSubnet1-EpicReads</code>, <code>PrivateSubnet1-EpicReads</code>).</p>
</li>
</ul>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758196563443/210453fc-9fa3-4c49-b0f5-8670717eb9a8.png" alt class="image--center mx-auto" /></p>
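<p>As a sanity check that the chosen <code>/24</code> blocks do not overlap, you can expand each CIDR into the address range it covers with a few lines of Bash. This is only an illustrative sketch (plain string handling, no AWS calls); any distinct <code>/24</code> blocks inside <code>10.0.0.0/16</code> will do, e.g. <code>10.0.4.0/24</code> for a fourth subnet.</p>

```shell
#!/usr/bin/env bash
# Expand a /24 CIDR into the first and last address it covers.
range_of_24() {
  local cidr=$1
  local base=${cidr%/*}      # strip the /24 suffix, e.g. 10.0.1.0
  local prefix=${base%.*}    # first three octets, e.g. 10.0.1
  echo "$cidr covers $prefix.0 - $prefix.255 (256 addresses)"
}

# Each subnet must use a distinct block of the VPC's 10.0.0.0/16.
for cidr in 10.0.1.0/24 10.0.2.0/24 10.0.3.0/24 10.0.4.0/24; do
  range_of_24 "$cidr"
done
```

<p>If any two subnets would print the same range, AWS rejects the second one with a CIDR-conflict error.</p>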
<ol start="4">
<li><p><strong>Create and attach an Internet Gateway (IGW)</strong></p>
<ul>
<li><p>VPC → Internet Gateways → Create IGW → Name it <code>IGW-EpicReads</code> → Create</p>
</li>
<li><p>Actions → Attach to VPC → select <code>VPC-EpicReads</code></p>
</li>
</ul>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758196738520/00f0ba24-bbd7-4d9a-90d4-4b7ce3486bde.png" alt class="image--center mx-auto" /></p>
<ol start="5">
<li><p><strong>Create a Public Route Table and route 0.0.0.0/0 → IGW</strong></p>
<ul>
<li><p>VPC → Route Tables → Create Route Table (<code>RouteTable-Public-EpicReads</code>) and pick <code>VPC-EpicReads</code></p>
</li>
<li><p>Edit routes: add <code>0.0.0.0/0</code> → target = <code>igw-xxxx</code> (your IGW)</p>
</li>
<li><p>Subnet Associations → Associate <code>PublicSubnet1</code> and <code>PublicSubnet2</code></p>
</li>
</ul>
</li>
<li><p><strong>(Optional) Create a NAT Gateway</strong> if you need outbound internet from private subnets (e.g., for patching or package downloads). Place NAT in a public subnet and create a private route table that points 0.0.0.0/0 to the NAT.</p>
</li>
<li><p><strong>Network ACLs (optional but recommended for subnet‑level controls)</strong></p>
<ul>
<li><p>VPC → Network ACLs → Create NACL and attach to the VPC</p>
</li>
<li><p>Edit inbound/outbound rules as needed (allow HTTP/HTTPS/SSH where applicable).</p>
</li>
</ul>
</li>
<li><p><strong>Security Groups (instance‑level firewall)</strong></p>
<ul>
<li><p>EC2 → Security Groups → Create <code>SG-EpicReads</code> (for application)</p>
<ul>
<li><p>Inbound: SSH (TCP 22) from your IP, HTTP (TCP 80) from 0.0.0.0/0</p>
</li>
<li><p>Outbound: allow all (or limit to RDS SG on TCP 3306)</p>
</li>
</ul>
</li>
<li><p>Create EpicReads-DBSG (for RDS)</p>
<ul>
<li><p>Inbound: MySQL/Aurora (TCP 3306), <strong>source:</strong> <code>SG-EpicReads</code> (referencing the app security group allows only the application instances to reach the DB)</p>
</li>
<li><p>Outbound: default (allow all) or restrict to app SG.</p>
</li>
</ul>
</li>
</ul>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758197064675/7f358858-253e-4a1f-9e0c-27c05c800412.png" alt class="image--center mx-auto" /></p>
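<p>The console clicks in Task 1 can also be scripted. Below is a hedged AWS CLI sketch: each call goes through a <code>run</code> helper that only prints the command so you can review it first (swap the helper body for <code>"$@"</code> to execute for real). The <code>vpc-xxxx</code>/<code>igw-xxxx</code>/<code>rtb-xxxx</code> IDs are placeholders for the IDs returned by the earlier calls.</p>

```shell
#!/usr/bin/env bash
set -euo pipefail

# Dry-run helper: prints each command instead of executing it.
run() { echo "+ $*"; }

run aws ec2 create-vpc --cidr-block 10.0.0.0/16 \
  --tag-specifications 'ResourceType=vpc,Tags=[{Key=Name,Value=VPC-EpicReads}]'
run aws ec2 create-subnet --vpc-id vpc-xxxx --cidr-block 10.0.1.0/24 \
  --availability-zone ap-south-1a
run aws ec2 create-internet-gateway
run aws ec2 attach-internet-gateway --internet-gateway-id igw-xxxx --vpc-id vpc-xxxx
run aws ec2 create-route --route-table-id rtb-xxxx \
  --destination-cidr-block 0.0.0.0/0 --gateway-id igw-xxxx
```

<p>Repeat the <code>create-subnet</code> call for the remaining three subnets, changing the CIDR and Availability Zone each time.</p>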
<h2 id="heading-task-2-launch-ec2-instance-application-server">Task 2 — Launch EC2 instance (Application server)</h2>
<ol>
<li><p><strong>Launch Instance</strong></p>
<ul>
<li><p>EC2 → Instances → Launch Instances</p>
</li>
<li><p>Name: EpicReads-WebServer</p>
</li>
<li><p>Choose AMI: Amazon Linux 2</p>
</li>
<li><p>Instance type: <code>t2.micro</code> (free tier) or choose as required</p>
</li>
<li><p>Key pair: select or create a key pair (download <code>.pem</code>)</p>
</li>
<li><p>Network: select <code>VPC-EpicReads</code></p>
</li>
<li><p>Subnet: <code>PublicSubnet1-EpicReads</code></p>
</li>
<li><p>Auto-assign Public IP: <strong>Enable</strong></p>
</li>
<li><p>Security group: <code>SG-EpicReads</code></p>
</li>
<li><p>Storage: default is fine for demo</p>
</li>
<li><p>Launch instance</p>
</li>
</ul>
</li>
<li><p><strong>SSH into the EC2 instance (from your workstation)</strong></p>
<ul>
<li><p>Ensure <code>.pem</code> has correct permissions: <code>chmod 400 key-pair.pem</code></p>
</li>
<li><p>Example SSH command (Amazon Linux AMIs use the <code>ec2-user</code> username):</p>
<pre><code class="lang-plaintext">  ssh -i key-pair.pem ec2-user@&lt;public-ip&gt;
</code></pre>
</li>
</ul>
</li>
</ol>
<p>    <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758197539407/9a9d6bfd-0f49-46df-a0ea-af0232cf8d73.png" alt class="image--center mx-auto" /></p>
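<p>SSH refuses private keys that are readable by other users, which is why the <code>chmod 400</code> step matters. A quick illustrative check, using a stand-in file in place of your real <code>key-pair.pem</code>:</p>

```shell
# Create a stand-in key file for illustration, then lock it down.
touch key-pair.pem
chmod 400 key-pair.pem

# stat -c '%a' prints the octal mode (GNU coreutils); expect 400,
# i.e. readable by the owner only.
stat -c '%a' key-pair.pem
```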
<ol start="3">
<li><strong>Update packages and install the web server and PHP (Amazon Linux 2)</strong></li>
</ol>
<pre><code class="lang-plaintext">sudo yum update -y
</code></pre>
<p>The command above updates the installed packages. Next, install the Apache web server:</p>
<pre><code class="lang-plaintext">sudo yum install -y httpd
</code></pre>
<p>Start the Apache web server and enable it to start on boot:</p>
<pre><code class="lang-plaintext">sudo systemctl start httpd
sudo systemctl enable httpd
</code></pre>
<p>Install PHP and extensions required by WordPress.</p>
<pre><code class="lang-plaintext">sudo yum install -y php php-mysqlnd php-xml php-fpm
</code></pre>
<pre><code class="lang-plaintext">sudo systemctl restart httpd
</code></pre>
<ol start="4">
<li><p><strong>Verify Apache is working</strong></p>
<p> Open a browser and navigate to <code>http://&lt;public-ip&gt;</code> → you should see the Apache default page.</p>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758198091657/1fb6ed0b-dc0b-40e8-bc82-b83af7936119.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-task-3-deploy-wordpress">Task 3 — Deploy WordPress</h2>
<blockquote>
<p>These commands assume you are on the EC2 instance and in <code>/var/www/html</code>.</p>
</blockquote>
<ol>
<li><p>Download WordPress:</p>
<p> Change into the <code>/var/www/html/</code> directory:</p>
<pre><code class="lang-plaintext"> cd /var/www/html/
</code></pre>
<p> Download the <strong>latest WordPress package and extract it</strong>:</p>
<pre><code class="lang-plaintext"> sudo wget https://wordpress.org/latest.tar.gz
 sudo tar -xzvf latest.tar.gz
</code></pre>
<p> Move into the extracted <code>wordpress</code> directory and copy the sample configuration to <code>wp-config.php</code>:</p>
<pre><code class="lang-plaintext"> cd wordpress
 sudo cp wp-config-sample.php wp-config.php
</code></pre>
<p> Set ownership and permissions on the WordPress files:</p>
<pre><code class="lang-plaintext"> sudo chown -R apache:apache /var/www/html/wordpress
 sudo chmod -R 755 /var/www/html/wordpress
</code></pre>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758199384294/c639dd9e-f99a-4246-bf1f-094d0c2ee069.png" alt class="image--center mx-auto" /></p>
</li>
</ol>
<h2 id="heading-task-4-create-the-database-rds-mysql">Task 4 — Create the Database (RDS MySQL)</h2>
<ol>
<li><p><strong>Create a DB Subnet Group</strong></p>
<ul>
<li><p>RDS → Subnet groups → Create DB subnet group</p>
</li>
<li><p>Name: <code>epicreads-subnetgroup</code></p>
</li>
<li><p>VPC: <code>VPC-EpicReads</code></p>
</li>
<li><p>Add the <strong>private subnets</strong> (<code>PrivateSubnet1</code>, <code>PrivateSubnet2</code>)</p>
</li>
</ul>
</li>
<li><p><strong>Create RDS instance (MySQL)</strong></p>
<ul>
<li><p>RDS → Databases → Create database</p>
</li>
<li><p>Standard create → Engine: <strong>MySQL</strong></p>
</li>
<li><p>Template: Free tier (or as required)</p>
</li>
<li><p>DB instance identifier: <code>wordpress</code></p>
</li>
<li><p>Master username: choose a username (store it securely)</p>
</li>
<li><p>Password: choose a strong password (store it securely)</p>
</li>
<li><p>Connectivity: choose <code>VPC-EpicReads</code> and DB subnet group <code>epicreads-subnetgroup</code></p>
</li>
<li><p>Public accessibility: <strong>No</strong> (keep the DB in the private subnets)</p>
</li>
<li><p>VPC security group: attach <code>EpicReads-DBSG</code></p>
</li>
<li><p>Additional configuration: initial DB name <code>wordpress</code> (optional)</p>
</li>
<li><p>Create DB</p>
</li>
</ul>
</li>
<li><p><strong>Edit DB security group inbound rules</strong></p>
<ul>
<li><p>EC2 → Security Groups → select <code>EpicReads-DBSG</code></p>
</li>
<li><p>Inbound rule: MySQL/Aurora (TCP 3306), Source: <strong>SG-EpicReads</strong> (so only the app security group can reach the DB)</p>
</li>
</ul>
</li>
</ol>
<p>        <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758200108185/13a7fa16-14b9-4d88-8c0a-faec7ab307f2.png" alt class="image--center mx-auto" /></p>
<ol start="4">
<li><p><strong>Capture the RDS endpoint</strong> from the RDS Console. You will use this in the WordPress config.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758200255036/bf1cd660-1fbd-45b2-9bd2-4a3e8ffeae2e.png" alt class="image--center mx-auto" /></p>
</li>
</ol>
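<p>Before wiring WordPress to the endpoint, it is worth confirming that the EC2 instance can actually reach it on port 3306; a hang here usually means the DB security group or subnet routing is wrong. A small Bash sketch using the shell's <code>/dev/tcp</code> pseudo-device (the hostname shown is a placeholder for your endpoint):</p>

```shell
# Check whether a TCP port is reachable within 3 seconds.
check_port() {
  local host=$1 port=$2
  if timeout 3 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
    echo "reachable: $host:$port"
  else
    echo "unreachable: $host:$port"
  fi
}

check_port "your-rds-endpoint.ap-south-1.rds.amazonaws.com" 3306
```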
<h2 id="heading-task-5-connect-wordpress-to-rds">Task 5 — Connect WordPress to RDS</h2>
<ol>
<li><p>On your EC2 instance, set an environment variable (for testing):</p>
<pre><code class="lang-plaintext"> export MYSQL_HOST=&lt;your-rds-endpoint&gt;
</code></pre>
</li>
<li><p>Connect to the WordPress database on RDS to confirm the credentials work (without <code>--host</code>, mysql would try to connect to the local machine):</p>
<pre><code class="lang-plaintext"> mysql --host=$MYSQL_HOST --user=&lt;your-username&gt; --password=&lt;your-password&gt; wordpress
</code></pre>
</li>
<li><p>Access the wp-config.php</p>
<pre><code class="lang-plaintext"> sudo vi wp-config.php
</code></pre>
</li>
<li><p>Edit the DB_NAME, DB_USER, DB_PASSWORD, DB_HOST.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758200999287/92750cca-168f-4700-be9e-b542093a0d65.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Go to the WordPress Secret Key Generator at <a target="_blank" href="https://api.wordpress.org/secret-key/1.1/salt/">https://api.wordpress.org/secret-key/1.1/salt/</a> and replace the placeholder keys in <code>wp-config.php</code> with the generated values.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758201137405/3852d085-4591-460e-b384-14151348d220.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Allow the 'W3TC' plugin to write its configuration data into the DB by adding:</p>
<pre><code class="lang-plaintext"> define( 'W3TC_CONFIG_DATABASE', true );
</code></pre>
</li>
<li><p>Make sure the PHP XML extension is installed (a no-op if it was already installed in Task 2):</p>
<pre><code class="lang-plaintext"> sudo yum install php-xml -y
</code></pre>
</li>
<li><p>Copy all files from the <code>wordpress</code> folder into <code>/var/www/html</code> and change the owner:</p>
<pre><code class="lang-plaintext"> cd ..
 sudo cp -r wordpress/* /var/www/html
 sudo chown -R apache:apache /var/www/html
</code></pre>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758201688170/9f65d3b1-359a-457b-b623-2f3f052949b4.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Finally, start and enable Apache, and restart PHP-FPM so the new configuration is picked up:</p>
<pre><code class="lang-plaintext"> sudo systemctl start httpd.service
 sudo systemctl enable httpd.service
 sudo systemctl restart php-fpm
</code></pre>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758201891684/28618c4d-bf9f-4fc9-bdbd-4248da0b6217.png" alt class="image--center mx-auto" /></p>
</li>
</ol>
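<p>The <code>wp-config.php</code> edits from steps 3 and 4 can also be scripted with <code>sed</code> instead of <code>vi</code>. The sketch below runs against a stand-in file so it is self-contained; on the server you would run the same <code>sed</code> command against the real <code>wp-config.php</code>. The username, password, and endpoint are placeholders.</p>

```shell
# Stand-in for the relevant wp-config.php lines (illustration only).
cat > wp-config.php <<'EOF'
define( 'DB_NAME', 'database_name_here' );
define( 'DB_USER', 'username_here' );
define( 'DB_PASSWORD', 'password_here' );
define( 'DB_HOST', 'localhost' );
EOF

# Patch the placeholders in place with your real values.
sed -i \
  -e "s/database_name_here/wordpress/" \
  -e "s/username_here/epicreads_admin/" \
  -e "s/password_here/change-me/" \
  -e "s/localhost/your-rds-endpoint.ap-south-1.rds.amazonaws.com/" \
  wp-config.php

grep "DB_" wp-config.php
```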
<h2 id="heading-task-6-finalize-wordpress-setup-in-browser">Task 6 — Finalize WordPress setup in browser</h2>
<ol>
<li><p>In your browser, open: <code>http://&lt;ec2-public-ip&gt;/wp-admin/</code>.</p>
</li>
<li><p>Choose language → Click continue.</p>
</li>
<li><p>Enter site title, admin username, password, and email → Install WordPress.</p>
</li>
<li><p>Login with the admin credentials and verify you can create posts and that they are stored in the RDS database.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758202402923/dba1ff05-a198-4917-a7bb-616d8b75d959.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Create a test post.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758202229771/ee7cc293-4583-4885-b571-ea4f2b00dfb2.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Access the site via its public IP address:</p>
<pre><code class="lang-plaintext"> http://&lt;public-ip&gt;
</code></pre>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758202522258/768d759d-9a51-4d9b-8877-f0f290aef924.png" alt class="image--center mx-auto" /></p>
</li>
</ol>
<h2 id="heading-conclusion">Conclusion</h2>
<p>Deploying a two-tier application architecture on AWS provides a structured way to separate the application layer and the database layer. By setting up VPC, subnets, internet gateway, route tables, security groups, EC2 instances for the app, and RDS (or database server) for data storage, we can create a secure and scalable environment. This separation of concerns ensures better performance, easier management, and a strong foundation for running applications like WordPress on the cloud.</p>
]]></content:encoded></item><item><title><![CDATA[Hosting a Static Website on Amazon S3 – Step by Step Guide]]></title><description><![CDATA[In this guide, I’ll show you how to host a static website on Amazon S3. A static website is a website with fixed content like HTML, CSS, and images. There’s no backend server or database needed, so it’s simple, fast, and cost-effective.
What is Amazo...]]></description><link>https://pujanbhattarai0.com.np/hosting-a-static-website-on-amazon-s3-step-by-step-guide</link><guid isPermaLink="true">https://pujanbhattarai0.com.np/hosting-a-static-website-on-amazon-s3-step-by-step-guide</guid><category><![CDATA[Devops]]></category><category><![CDATA[AWS]]></category><category><![CDATA[S3-bucket]]></category><category><![CDATA[Static Website]]></category><dc:creator><![CDATA[Pujan Bhattarai]]></dc:creator><pubDate>Thu, 18 Sep 2025 07:31:21 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1758180487954/921afdbf-5562-41df-b3e1-1c3380ae35a4.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In this guide, I’ll show you how to host a static website on Amazon S3. A static website is a website with fixed content like HTML, CSS, and images. There’s no backend server or database needed, so it’s simple, fast, and cost-effective.</p>
<h2 id="heading-what-is-amazon-s3">What is Amazon S3?</h2>
<p>Amazon S3 (Simple Storage Service) is a cloud-based storage service provided by AWS that allows you to store and retrieve any amount of data at any time. It is highly durable, secure, and scalable, making it ideal for storing files such as images, videos, documents, or entire websites. S3 provides features like access control, versioning, and cost-efficient storage, allowing you to manage your data reliably without worrying about underlying infrastructure or server maintenance.</p>
<p>Hosting a static website on S3 is popular because it’s simple, serverless, and cost-effective. A static website consists of fixed content like HTML, CSS, and JavaScript files, which don’t require a backend server or database. By uploading these files to an S3 bucket and enabling static website hosting, S3 can serve your site directly to visitors. This approach provides high availability, automatic scalability, and a public URL for easy access, making it a perfect solution for portfolios, blogs, landing pages, and other simple web applications.</p>
<h2 id="heading-objective"><strong>Objective</strong></h2>
<p>The goal is to host a static website by:</p>
<ul>
<li><p>Creating an S3 bucket</p>
</li>
<li><p>Uploading website files</p>
</li>
<li><p>Enabling static website hosting</p>
</li>
<li><p>Setting permissions so anyone can access your site</p>
</li>
</ul>
<p>A static website is one with fixed content like HTML, CSS, and images. There’s no backend or database involved, so it’s fast, secure, and cost-effective.</p>
<h2 id="heading-what-you-need"><strong>What You Need</strong></h2>
<ul>
<li><p>An <strong>AWS account</strong> with administrative access</p>
</li>
<li><p>Basic <strong>HTML/CSS/JS</strong> files for your website</p>
</li>
<li><p><strong>AWS CLI</strong> (optional, if you want to use commands instead of the AWS console)</p>
</li>
</ul>
<h2 id="heading-step-1-log-in-and-create-an-s3-bucket"><strong>Step 1: Log in and Create an S3 Bucket</strong></h2>
<ol>
<li><p>Go to the <strong>AWS Management Console</strong> and log in using your IAM credentials.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758177154787/2777c528-cd80-42b9-8439-d980f62f410b.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Choose a <strong>region</strong> closest to you. This helps your website load faster.</p>
</li>
<li><p>In the search bar at the top, type <strong>S3</strong> and click the <strong>Create bucket</strong> button.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758177218900/5fb9bd7c-9168-4faa-89c4-ddc550472929.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Give your bucket a <strong>unique name</strong>. (Bucket names are global, so no two buckets anywhere in the world can have the same name.)</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758177378397/8ab762f0-e168-4799-87c6-f1501fe79298.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Leave all other settings as default.</p>
</li>
<li><p>Click <strong>Create bucket</strong>.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758177396880/82993594-9880-4ac0-bbf9-c114cf2dc291.png" alt class="image--center mx-auto" /></p>
</li>
</ol>
<p>Now your empty S3 bucket is ready to store your website files.</p>
<h2 id="heading-step-2-upload-your-website-files"><strong>Step 2: Upload Your Website Files</strong></h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758177880278/3f6fbaaa-17bc-4a06-815c-84ee814d1fb2.png" alt class="image--center mx-auto" /></p>
<p>Your bucket is currently empty, so we need to add our website files. One way to upload our project to the bucket is by using the AWS Management Console, and another way is by using the AWS CLI. In this tutorial, we will use the AWS CLI to upload our files.</p>
<p>Steps:</p>
<ol>
<li><p>Open your terminal or command prompt.</p>
</li>
<li><p>Navigate to your project folder:</p>
<pre><code class="lang-plaintext"> cd \Downloads\2117_infinite_loop
</code></pre>
<p> List your files to make sure you are in the right folder:</p>
<pre><code class="lang-plaintext"> ls
</code></pre>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758177981793/7afa2452-eeea-4fb8-87c6-107a584c07ba.png" alt /></p>
</li>
<li><p>Upload files to S3 using:</p>
<pre><code class="lang-plaintext"> aws s3 sync . s3://pravinmishrademo01/
</code></pre>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758178467866/5356c7d0-c68d-455a-9055-b8972288d308.png" alt class="image--center mx-auto" /></p>
<p> After uploading, you should see all your files in the S3 bucket.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758178508248/4f1a7a51-e51f-42be-9750-adbc6b121899.png" alt class="image--center mx-auto" /></p>
</li>
</ol>
<h2 id="heading-step-3-enable-static-website-hosting"><strong>Step 3: Enable Static Website Hosting</strong></h2>
<ol>
<li><p>Click on your bucket → Go to <strong>Properties</strong> tab.</p>
</li>
<li><p>Scroll down to <strong>Static website hosting</strong> and click <strong>Edit</strong>.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758178602697/4585297a-8e1c-4b20-b87c-d4545db1e442.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Choose <strong>Enable</strong>.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758178613016/20e6b796-9751-4435-8f29-5a31e5068c4a.png" alt class="image--center mx-auto" /></p>
</li>
</ol>
<p>This allows S3 to serve your website files as a real website.</p>
<h2 id="heading-step-4-set-index-and-error-documents"><strong>Step 4: Set Index and Error Documents</strong></h2>
<ol>
<li><p><strong>Index document:</strong> This is your homepage (usually <code>index.html</code>). When someone visits your website URL, this page will load automatically.</p>
</li>
<li><p><strong>Error document:</strong> This is shown when someone visits a page that doesn’t exist (like <code>404.html</code>). It helps guide visitors back to your site.</p>
</li>
<li><p>Enter <code>index.html</code> for the index document and leave the error document as default (or add <code>404.html</code> if you have one).</p>
</li>
<li><p>Click <strong>Save changes</strong>.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758178862735/8fbcb834-29ea-4de2-8285-a0ded71674bb.png" alt class="image--center mx-auto" /></p>
</li>
</ol>
<p>S3 will now give you a <strong>website URL</strong> that you can use to visit your site.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758178884201/e59c13aa-9e17-4ea5-996d-56104b965807.png" alt class="image--center mx-auto" /></p>
<p>When you try to visit the site using the bucket website endpoint, it gives an error because we have not set the bucket policy for website access and have not disabled Block Public Access.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758179333828/9903a8fc-207b-4ac8-ab75-b547a96ff593.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-step-5-make-your-website-public"><strong>Step 5: Make Your Website Public</strong></h2>
<p>By default, S3 blocks public access. Your site won’t be visible until you allow it.</p>
<ol>
<li><p>Go to the <strong>Permissions</strong> tab in your bucket.</p>
</li>
<li><p>Click <strong>Block Public Access</strong> → <strong>Edit</strong> → Disable it → Save changes.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758179528240/c21cdab1-e790-4789-89a6-2efb1374ffc3.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Add a <strong>Bucket Policy</strong> to allow everyone to read your files:</p>
<ul>
<li><p>Go to <strong>Permissions → Bucket Policy → Edit</strong>.</p>
</li>
<li><p>Add this policy (the <code>Resource</code> below uses the example bucket <code>pravinmishrademo01</code>; replace it with your own bucket name):</p>
<pre><code class="lang-plaintext">  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": [
                  "arn:aws:s3:::pravinmishrademo01/*"
         ]
      }
    ]
  }
</code></pre>
</li>
</ul>
</li>
<li><p>Save the policy.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758179547500/727d5bd0-fbd6-4771-a047-ff6aff6de1c0.png" alt class="image--center mx-auto" /></p>
</li>
</ol>
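<p>If the policy JSON is malformed, S3 rejects it with a fairly generic error, so validating it locally first can save a round trip. A sketch using Python's built-in <code>json.tool</code> from the shell (<code>your-bucket-name</code> is the placeholder to replace):</p>

```shell
# Write the policy to a file, using a placeholder bucket name.
cat > bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
EOF

# json.tool exits non-zero on a parse error.
python3 -m json.tool bucket-policy.json > /dev/null && echo "policy JSON is valid"
```

<p>The validated file can then also be applied from the CLI with <code>aws s3api put-bucket-policy --bucket your-bucket-name --policy file://bucket-policy.json</code>.</p>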
<p>Now your website is <strong>live and publicly accessible</strong> using the URL from static website hosting.</p>
<h2 id="heading-step-6-test-your-website"><strong>Step 6: Test Your Website</strong></h2>
<ul>
<li><p>Copy the URL from the <strong>Static Website Hosting</strong> section in S3.</p>
</li>
<li><p>Open it in a browser.</p>
</li>
<li><p>Your static website should load perfectly.</p>
</li>
</ul>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1758179627430/549194e2-62d4-4d80-b106-1928850efdd1.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-conclusion"><strong>Conclusion</strong></h2>
<p>Hosting a static website on AWS S3 is easy and beginner-friendly. You don’t need a server or database. By following these steps, you can make your website live and accessible to anyone. This is perfect for portfolios, blogs, landing pages, or documentation websites.</p>
]]></content:encoded></item><item><title><![CDATA[Journey from Traditional Software Development to DevOps and Agile Practices]]></title><description><![CDATA[When we started working on our very first software project, we followed a traditional Software Development Life Cycle (SDLC) approach. It was linear, structured, and often called the Waterfall model. Each phase—requirement gathering, design, developm...]]></description><link>https://pujanbhattarai0.com.np/journey-from-traditional-software-development-to-devops-and-agile-practices</link><guid isPermaLink="true">https://pujanbhattarai0.com.np/journey-from-traditional-software-development-to-devops-and-agile-practices</guid><category><![CDATA[Devops]]></category><category><![CDATA[agile]]></category><category><![CDATA[Scrum]]></category><category><![CDATA[SDLC]]></category><category><![CDATA[lifecycle]]></category><dc:creator><![CDATA[Pujan Bhattarai]]></dc:creator><pubDate>Fri, 05 Sep 2025 16:35:13 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1757090468970/7b3d2733-90dd-4b07-a004-877b28d84b71.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>When we started working on our very first software project, we followed a <strong>traditional Software Development Life Cycle (SDLC)</strong> approach. It was linear, structured, and often called the <em>Waterfall model</em>. Each phase—requirement gathering, design, development, testing, and deployment—was done in sequence.</p>
<p>At that time, it felt straightforward. But as projects grew bigger and more dynamic, I noticed a few challenges:</p>
<ul>
<li><p>Any small change in requirements was difficult to manage.</p>
</li>
<li><p>Feedback often came late, usually after the product was nearly finished.</p>
</li>
<li><p>Collaboration between developers, testers, and operations was minimal.</p>
</li>
<li><p>Delivery cycles were slow, making it harder to respond to fast-changing user needs.</p>
</li>
</ul>
<p>This is where I began exploring <strong>modern approaches to SDLC</strong>—focusing on <strong>Agile methodologies, Scrum frameworks, Jira as a project management tool, and DevOps practices.</strong></p>
<h2 id="heading-from-traditional-sdlc-to-agile-and-devops"><strong>From Traditional SDLC to Agile and DevOps</strong></h2>
<p>Unlike the traditional model, <strong>Agile</strong> emphasizes adaptability, customer collaboration, and incremental delivery. Instead of waiting months for a final product, Agile teams deliver working software in short iterations, often called <em>sprints</em>.</p>
<p>This approach made me realize that:</p>
<ul>
<li><p>Software should be delivered in <strong>small, valuable increments</strong>.</p>
</li>
<li><p><strong>Continuous feedback</strong> ensures we build the right thing.</p>
</li>
<li><p>Teams should work <strong>collaboratively across development, testing, and operations</strong>.</p>
</li>
</ul>
<p>And when Agile meets automation and cloud practices, that’s where <strong>DevOps</strong> comes into the picture.</p>
<h2 id="heading-what-is-agile-and-why-is-it-important"><strong>What is Agile and Why is it Important?</strong></h2>
<p><strong>Agile</strong> is a mindset and methodology that focuses on delivering value to the customer quickly and effectively. Instead of rigid documentation and long delivery timelines, Agile promotes:</p>
<ul>
<li><p><strong>Iterative development</strong> (small releases)</p>
</li>
<li><p><strong>Continuous feedback</strong> from stakeholders</p>
</li>
<li><p><strong>Flexibility</strong> to adapt to changes</p>
</li>
<li><p><strong>Cross-functional collaboration</strong></p>
</li>
</ul>
<p>Agile is important because modern software projects demand <strong>speed + quality</strong>. Customers expect rapid updates, bug fixes, and new features—Agile enables teams to meet those expectations without burning out.</p>
<h2 id="heading-what-is-scrum"><strong>What is Scrum?</strong></h2>
<p>Within Agile, <strong>Scrum</strong> is one of the most popular frameworks. It organizes work into <strong>sprints</strong> (usually 1–2 weeks long), with clearly defined roles and ceremonies.</p>
<ul>
<li><p><strong>Roles</strong>: Product Owner, Scrum Master, and Development Team</p>
</li>
<li><p><strong>Ceremonies</strong>: Sprint Planning, Daily Standups, Sprint Review, Sprint Retrospective</p>
</li>
<li><p><strong>Artifacts</strong>: Product Backlog, Sprint Backlog, and Increment</p>
</li>
</ul>
<p>Scrum helps keep teams <strong>focused</strong>, ensures <strong>transparency</strong> of progress, and fosters a cycle of <strong>continuous improvement</strong>.</p>
<h2 id="heading-what-is-jira-and-how-do-we-use-it"><strong>What is Jira and How Do We Use It?</strong></h2>
<p>When I started applying Agile in my projects, I used <strong>Jira</strong>, a popular project management tool by Atlassian.</p>
<p>Jira allows teams to:</p>
<ul>
<li><p>Create <strong>Epics, Stories, and Tasks</strong> to structure work.</p>
</li>
<li><p>Plan and run <strong>sprints</strong> with defined goals.</p>
</li>
<li><p>Use <strong>Kanban or Scrum boards</strong> to visualize progress.</p>
</li>
<li><p>Track metrics like <strong>burndown charts</strong> for transparency.</p>
</li>
</ul>
<p>In my own practice projects, I created Epics like <em>“Improve UI discoverability &amp; trust”</em> and then broke them down into stories such as <em>“Hero tagline clarity”</em> or <em>“Primary CTA color.”</em> Using Jira made the entire sprint cycle feel organized and measurable.</p>
<h2 id="heading-how-i-used-jira-in-my-projects"><strong>How I Used Jira in My Projects</strong></h2>
<p>For managing my Agile sprints, I used <strong>Jira</strong>, which is a powerful tool for planning, tracking, and delivering software efficiently. Here’s a simple way beginners can get started:</p>
<p><strong>Step 1: Create a Project</strong></p>
<ul>
<li><p>Choose <strong>Software → Scrum → Team-managed</strong>.</p>
</li>
<li><p>Give it a name (e.g., <em>Gotto Job – Solo Pujan</em>).</p>
</li>
</ul>
<p><strong>Step 2: Add Epics &amp; Stories</strong></p>
<ul>
<li><p>Create <strong>Epics</strong> for big goals (e.g., <em>Improve UI discoverability</em>).</p>
</li>
<li><p>Add <strong>Stories</strong> under Epics with clear titles, descriptions, and acceptance criteria.</p>
</li>
<li><p>Estimate each story with <strong>story points</strong> (1, 2, 3…).</p>
</li>
</ul>
<p><strong>Step 3: Plan a Sprint</strong></p>
<ul>
<li><p>Drag 3–4 stories into a sprint.</p>
</li>
<li><p>Set a <strong>Sprint Goal</strong> (e.g., <em>Ship 2–3 visible UI improvements</em>).</p>
</li>
</ul>
<p><strong>Step 4: Subtasks &amp; Progress Tracking</strong></p>
<ul>
<li><p>Break stories into subtasks: <strong>Build → Verify → Deploy → Screenshot</strong>.</p>
</li>
<li><p>Move tasks across the board as you work: <em>To Do → In Progress → Done</em>.</p>
</li>
</ul>
<p><strong>Step 5: Review &amp; Retro</strong></p>
<ul>
<li><p>Demo the increment.</p>
</li>
<li><p>Write <strong>Retro notes</strong>: What went well, what to improve, Scrum pillar observed, Scrum value practiced.</p>
</li>
<li><p>Check <strong>burndown chart</strong> for sprint progress.</p>
</li>
</ul>
<p>Using Jira helped me <strong>organize my work, visualize progress, and deliver small increments</strong> efficiently in my recent projects like <em>Gotto Job</em> and <em>Mini Finance footer deployment on EC2</em>.</p>
<h2 id="heading-the-devops-life-cycle"><strong>The DevOps Life Cycle</strong></h2>
<p>While Agile focuses on <em>how we manage work</em>, <strong>DevOps</strong> focuses on <em>how we build and deliver software efficiently</em>.</p>
<p>The <strong>DevOps Life Cycle</strong> is usually represented as an infinity loop, covering continuous phases:</p>
<ol>
<li><p><strong>Plan</strong> – Define requirements and track progress (Agile + Jira).</p>
</li>
<li><p><strong>Code</strong> – Write clean, maintainable code (Git, GitHub, VS Code).</p>
</li>
<li><p><strong>Build</strong> – Compile and package the software (CI pipelines).</p>
</li>
<li><p><strong>Test</strong> – Automate testing for quality assurance.</p>
</li>
<li><p><strong>Release</strong> – Deploy software to staging/production.</p>
</li>
<li><p><strong>Deploy</strong> – Push updates using tools like Docker, Kubernetes, or AWS EC2.</p>
</li>
<li><p><strong>Operate</strong> – Monitor system performance and uptime.</p>
</li>
<li><p><strong>Monitor</strong> – Collect feedback and performance metrics to start the cycle again.</p>
</li>
</ol>
<p>This cycle emphasizes <strong>automation, collaboration, and continuous delivery</strong>, ensuring faster and more reliable deployments.</p>
<h2 id="heading-what-i-did-in-my-present-projects"><strong>What I Did in My Present Projects</strong></h2>
<p>In my recent practice projects, I combined <strong>Agile, Jira, and DevOps</strong> concepts to simulate real-world workflows:</p>
<ul>
<li><p>Used <strong>Jira Scrum boards</strong> to create Epics and Stories.</p>
</li>
<li><p>Planned and executed <strong>mini-sprints</strong> with goals like <em>“Deploy footer with version and date on EC2.”</em></p>
</li>
<li><p>Broke down tasks into <strong>subtasks</strong> (Build, Verify, Deploy, Screenshot).</p>
</li>
<li><p>Deployed my project on <strong>AWS EC2</strong> with Nginx as the web server.</p>
</li>
<li><p>Practiced retrospectives by reflecting on <strong>what went well, what could improve, and Scrum values observed.</strong></p>
</li>
</ul>
<p>This hands-on approach gave me clarity on how modern teams actually build, ship, and improve software continuously.</p>
<h2 id="heading-final-thoughts"><strong>Final Thoughts</strong></h2>
<p>Looking back, moving from <strong>traditional SDLC</strong> to <strong>Agile + DevOps practices</strong> has been a huge leap in my learning journey. Agile brought adaptability, Scrum gave structure, Jira made work visible, and DevOps ensured smooth delivery.</p>
<p>Even in small practice projects, applying these frameworks has helped me think like a <strong>real-world engineer</strong>, preparing me for bigger roles in Cloud, DevOps, and Agile teams.</p>
]]></content:encoded></item><item><title><![CDATA[A Complete Guide to Git & GitHub: Hands-On Learning]]></title><description><![CDATA[Version control is one of the most essential skills for developers, and Git along with GitHub is the industry standard for managing code efficiently. In this blog, I will share what I learned about Git and GitHub, the differences between them, reposi...]]></description><link>https://pujanbhattarai0.com.np/a-complete-guide-to-git-and-github-hands-on-learning</link><guid isPermaLink="true">https://pujanbhattarai0.com.np/a-complete-guide-to-git-and-github-hands-on-learning</guid><category><![CDATA[GitHub]]></category><category><![CDATA[Git]]></category><category><![CDATA[Git Commands]]></category><category><![CDATA[Learning Journey]]></category><category><![CDATA[Devops]]></category><dc:creator><![CDATA[Pujan Bhattarai]]></dc:creator><pubDate>Fri, 29 Aug 2025 08:17:36 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1756455699906/c02020ce-164b-4292-9c34-166524e3175c.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Version control is one of the most essential skills for developers, and Git along with GitHub is the industry standard for managing code efficiently. In this blog, I will share what I learned about Git and GitHub, the differences between them, repositories, key commands, authentication methods, and how they are crucial in DevOps.</p>
<h2 id="heading-what-is-git">What is Git?</h2>
<p>Git is a distributed version control system that helps developers track changes in their code. It allows multiple people to work on a project simultaneously without overwriting each other’s work. Git stores snapshots of your project at different points in time, making it easy to roll back changes or review history.</p>
<h2 id="heading-what-is-github">What is GitHub?</h2>
<p>GitHub is a cloud-based platform that hosts Git repositories. It enables collaboration by allowing developers to share their code, submit pull requests, review changes, and contribute to open-source projects. While Git manages the code locally, GitHub provides a remote backup and collaboration tools.</p>
<h2 id="heading-difference-between-git-and-github">Difference Between Git and GitHub</h2>
<div class="hn-table">
<table>
<thead>
<tr>
<td><strong>Feature</strong></td><td><strong>Git</strong></td><td><strong>GitHub</strong></td></tr>
</thead>
<tbody>
<tr>
<td>Type</td><td>Version control system</td><td>Cloud-based hosting service</td></tr>
<tr>
<td>Functionality</td><td>Tracks changes locally</td><td>Enables collaboration &amp; remote repos</td></tr>
<tr>
<td>Installation</td><td>Installed locally on your machine</td><td>No installation required; web-based</td></tr>
<tr>
<td>Use Case</td><td>Managing project history</td><td>Sharing code, collaboration, CI/CD</td></tr>
</tbody>
</tbody>
</table>
</div><h2 id="heading-repositories"><strong>Repositories</strong></h2>
<p>A <strong>repository (repo)</strong> is a place where your project files and their history are stored. Git supports different types of repositories:</p>
<ul>
<li><p><strong>Local Repository</strong>: Exists on your local machine.</p>
</li>
<li><p><strong>Remote Repository</strong>: Hosted on platforms like GitHub, GitLab, or Bitbucket.</p>
</li>
</ul>
<h2 id="heading-hands-on-learning-installing-git-amp-configuration"><strong>Hands-On Learning: Installing Git &amp; Configuration</strong></h2>
<ol>
<li><p><strong>Installing Git</strong>:<br /> Download Git from <a target="_blank" href="https://git-scm.com/">git-scm.com</a> and follow the installation steps.</p>
</li>
<li><p><strong>Configuring Git</strong>:</p>
</li>
</ol>
<pre><code class="lang-plaintext">git config --global user.name "Your Name"
git config --global user.email "your.email@example.com"
</code></pre>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756452260200/e9b92a23-a9f7-40e4-aa10-2afc2c5a96f0.png" alt class="image--center mx-auto" /></p>
<p>This sets your identity for commit history.</p>
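<p>A quick way to confirm the identity took effect is to read the values back. This is a throwaway sketch — the name and email are placeholders, and <code>HOME</code> is redirected so your real config stays untouched:</p>

```shell
# Demo: set a Git identity in a throwaway HOME, then read it back
export HOME=$(mktemp -d)                        # keep real config untouched
git config --global user.name "Demo User"       # placeholder name
git config --global user.email "demo@example.com"
git config --global user.name                   # prints: Demo User
git config --global --list                      # shows both values
```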
<h2 id="heading-initializing-a-repository"><strong>Initializing a Repository</strong></h2>
<p>To start tracking a project with Git:</p>
<pre><code class="lang-plaintext">git init
</code></pre>
<p>This creates a <code>.git</code> folder in your project directory, enabling version control.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756452586954/b594f479-a17a-447e-9bee-7f4431a24bdd.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-key-git-commands-that-i-used"><strong>Key Git Commands That I Used</strong></h2>
<ol>
<li><p><code>git status</code> – Checks the current state of files in the working directory.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756453017897/00e0148e-e24e-443b-a6f1-984ab7d1a0fc.png" alt class="image--center mx-auto" /></p>
</li>
<li><p><code>git add .</code> – Stages all changes to be committed.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756453073714/e919f134-008d-49ab-bd08-e34d3a810e33.png" alt class="image--center mx-auto" /></p>
</li>
<li><p><code>git commit -m "message"</code> – Saves staged changes with a descriptive message.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756453122301/6013b0a1-27ed-4398-bcda-5cbab246bcff.png" alt class="image--center mx-auto" /></p>
</li>
<li><p><code>git log</code> – Displays commit history in the repository.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756453192859/25c77f6d-8786-4f91-88f4-671270eda148.png" alt class="image--center mx-auto" /></p>
</li>
<li><p><code>git push</code> – Sends local commits to a remote repository.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756453690308/36638e4d-77d4-406e-abcb-7305306e4799.png" alt class="image--center mx-auto" /></p>
</li>
<li><p><code>git checkout</code> – <strong>Switches between branches</strong> or <strong>restores files</strong> in your working directory to a specific state.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756453337537/344749b6-6a24-4ec9-86ca-5617d09af915.png" alt class="image--center mx-auto" /></p>
</li>
<li><p><code>git merge</code> – Integrates changes from one branch into another.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756453619060/95055c09-39ce-4fb4-915a-3b811c6f396e.png" alt /></p>
</li>
</ol>
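<p>The commands above fit together into one small loop. Here is a throwaway end-to-end run you can try locally (the repo, file name, and identity are made up for the demo):</p>

```shell
# Demo: a minimal Git workflow in a disposable repository
cd "$(mktemp -d)"
git init -q
git config user.name "Demo User"      # local identity just for this repo
git config user.email "demo@example.com"
echo "hello" > README.md
git status --short                    # shows README.md as untracked (??)
git add .
git commit -q -m "Initial commit"
git log --oneline                     # one entry: Initial commit
```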
<h2 id="heading-authentication-in-github"><strong>Authentication in GitHub</strong></h2>
<p>To interact with GitHub securely, authentication is required:</p>
<ul>
<li><p><strong>HTTPS</strong>: Uses your GitHub username and personal access token for authentication.</p>
</li>
<li><p><strong>SSH</strong>: Uses SSH keys to authenticate without entering credentials repeatedly:</p>
</li>
</ul>
<pre><code class="lang-plaintext">ssh-keygen -t rsa -b 4096 -C "pujanbhattarai361@gmail.com"
ls -la ~/.ssh/
cat ~/.ssh/id_rsa.pub
</code></pre>
<p>SSH is recommended for convenience and security.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756454634790/8ae90671-2dc9-4d53-b4d7-fa9120419ec3.png" alt class="image--center mx-auto" /></p>
<p>Now add the SSH key to GitHub: go to GitHub Settings → SSH and GPG keys → New SSH key. Then test the connection:</p>
<pre><code class="lang-plaintext">ssh -T git@github.com
</code></pre>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1756454729882/cbe214c2-5cd2-4926-8feb-a5fffba94c5f.png" alt class="image--center mx-auto" /></p>
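<p>Once the key is verified, an existing clone can be switched from HTTPS to SSH so pushes stop asking for a token. The repository URL below is a placeholder, not a real project:</p>

```shell
# Demo: point an existing repo's remote at the SSH URL (repo name is hypothetical)
cd "$(mktemp -d)"
git init -q
git remote add origin https://github.com/example-user/example-repo.git
git remote set-url origin git@github.com:example-user/example-repo.git
git remote -v                         # fetch/push now use the SSH form
```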
<h2 id="heading-gitignore"><strong>.gitignore</strong></h2>
<p>The <code>.gitignore</code> file tells Git which files or directories to ignore. For example, sensitive data, log files, or IDE settings should not be tracked:</p>
<pre><code class="lang-plaintext">node_modules/
*.log
.env
</code></pre>
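<p>You can check that the patterns work before committing anything — ignored files simply stop appearing in <code>git status</code>. A throwaway sketch:</p>

```shell
# Demo: ignored paths disappear from git status
cd "$(mktemp -d)"
git init -q
printf 'node_modules/\n*.log\n.env\n' > .gitignore
mkdir node_modules
touch node_modules/pkg.js debug.log .env app.js
git status --porcelain          # lists only .gitignore and app.js
git check-ignore -v debug.log   # explains which rule matched
```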
<h2 id="heading-git-with-vs-code"><strong>Git with VS Code</strong></h2>
<p>Visual Studio Code integrates Git natively:</p>
<ul>
<li><p>View changes in the <strong>Source Control</strong> panel.</p>
</li>
<li><p>Stage, commit, pull, and push directly from the editor.</p>
</li>
<li><p>Provides visual diff tools to compare changes.</p>
</li>
</ul>
<p>Using Git with VS Code makes version control more intuitive, especially for beginners.</p>
<h2 id="heading-importance-of-git-amp-github-in-devops"><strong>Importance of Git &amp; GitHub in DevOps</strong></h2>
<p>Git and GitHub are fundamental in <strong>DevOps</strong> for several reasons:</p>
<ul>
<li><p><strong>Collaboration</strong>: Teams can work on multiple branches simultaneously.</p>
</li>
<li><p><strong>Continuous Integration/Deployment (CI/CD)</strong>: GitHub repositories integrate with pipelines to automate testing and deployment.</p>
</li>
<li><p><strong>Traceability</strong>: Every change is tracked with commits, making debugging easier.</p>
</li>
<li><p><strong>Backup</strong>: Remote repositories serve as a cloud backup for code.</p>
</li>
</ul>
<p>Mastering Git and GitHub is a crucial step toward a career in DevOps or cloud-native development.</p>
<h2 id="heading-conclusion"><strong>Conclusion</strong></h2>
<p>Learning Git and GitHub hands-on has been a game-changer for me. From initializing repositories, committing changes, pushing and pulling updates, to creating branches and collaborating via pull requests, I now understand how version control and cloud collaboration work together to streamline software development.</p>
<p>Git and GitHub are more than just tools—they are <strong>essential skills</strong> for every modern developer and DevOps engineer.</p>
]]></content:encoded></item><item><title><![CDATA[Linux Commands Every DevOps Beginner Should Know]]></title><description><![CDATA[Linux is the backbone of modern IT infrastructure, powering most servers, cloud platforms, and DevOps workflows. Over 96% of the top 1 million web servers worldwide are Linux-powered, highlighting its dominance in the server ecosystem. For DevOps eng...]]></description><link>https://pujanbhattarai0.com.np/linux-commands-every-devops-beginner-should-know</link><guid isPermaLink="true">https://pujanbhattarai0.com.np/linux-commands-every-devops-beginner-should-know</guid><category><![CDATA[Linux]]></category><category><![CDATA[linux for beginners]]></category><category><![CDATA[command line]]></category><category><![CDATA[Devops]]></category><dc:creator><![CDATA[Pujan Bhattarai]]></dc:creator><pubDate>Fri, 22 Aug 2025 05:53:22 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1755841668068/7d10d666-d514-4a7c-9783-fa45a3f4fb5c.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Linux is the backbone of modern IT infrastructure, powering most servers, cloud platforms, and DevOps workflows. Over 96% of the top 1 million web servers worldwide are Linux-powered, highlighting its dominance in the server ecosystem. For DevOps engineers, mastering Linux is not optional—it’s essential. This blog is based on what I recently learned in <strong>Pravin Mishra’s Live DevOps for Beginners cohort</strong>, covering the fundamentals of Linux and essential commands for daily DevOps tasks.</p>
<hr />
<h3 id="heading-what-is-linux"><strong>What is Linux?</strong></h3>
<p>Linux is an open-source, Unix-like operating system used across desktops, servers, embedded devices, and cloud platforms. Unlike proprietary systems, Linux can be freely modified, distributed, and customized, giving engineers complete control over their environment. Its flexibility, stability, and security make it the preferred OS for production servers, cloud infrastructure, and DevOps automation.</p>
<p>Linux isn’t just an operating system—it’s an ecosystem. From running lightweight web servers to powering containers in Kubernetes clusters, Linux forms the foundation of modern IT operations. Its open-source nature also means there is a vast community, plenty of documentation, and extensive support for developers and engineers worldwide.</p>
<hr />
<h3 id="heading-why-linux-is-important"><strong>Why Linux is Important</strong></h3>
<p>Linux is central to DevOps and cloud engineering for several reasons:</p>
<ol>
<li><p><strong>Stability and Reliability:</strong> Linux servers can run for years without rebooting. For critical applications, this reliability is crucial.</p>
</li>
<li><p><strong>Security:</strong> Its permission and user management systems, combined with open-source transparency, ensure strong security.</p>
</li>
<li><p><strong>Flexibility and Customization:</strong> You can build minimal systems for containers or fully-featured servers for enterprise workloads.</p>
</li>
<li><p><strong>Server and Cloud Dominance:</strong> Most of the world’s cloud platforms and over 96% of the top 1 million web servers run Linux.</p>
</li>
</ol>
<p>In short, Linux is not only the operating system of choice but also a skill every DevOps engineer must master to efficiently manage, monitor, and deploy applications at scale.</p>
<hr />
<h2 id="heading-essential-linux-commands"><strong>Essential Linux Commands</strong></h2>
<h3 id="heading-file-management">File Management</h3>
<p>File management is one of the most fundamental skills in Linux. As a DevOps engineer, you will constantly navigate the filesystem, organize scripts, and manage logs.</p>
<p>Some key commands:</p>
<p><code>ls</code> – Lists files and directories. Options like <code>ls -l</code> show permissions, owners, and sizes, while <code>ls -a</code> shows hidden files.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755835846396/4a93c4ad-e921-426e-82f2-0c6e3932b2e4.png" alt class="image--center mx-auto" /></p>
<p><code>cd</code> – Changes directories. Navigating efficiently saves time when managing servers.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755835795071/6d4caa36-a5d3-4911-a99c-f22b853d722b.png" alt class="image--center mx-auto" /></p>
<p><code>mkdir</code> – Creates directories. Use <code>mkdir -p</code> to create nested directories in one command.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755835920027/8be56e64-bd0f-4a8c-bdfa-920267022bd3.png" alt class="image--center mx-auto" /></p>
<p><code>cat</code> – Displays the entire content of a file.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755841892629/dd5b4b1b-8d47-4478-8563-5ebf588c6d41.png" alt class="image--center mx-auto" /></p>
<p><code>rm</code> – Deletes files and directories. <code>rm -r</code> recursively removes directories; caution is needed to prevent accidental deletion.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755835985938/ee315562-4d73-4ba3-82db-27e1c88b612c.png" alt class="image--center mx-auto" /></p>
<p><strong>Practical Use in DevOps:</strong> Organizing scripts for automation, backing up configuration files before deployment, and managing logs for troubleshooting.</p>
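<p>The commands above can be practiced safely in a scratch directory — every name here is made up for the demo:</p>

```shell
# Demo: everyday file-management commands in a throwaway directory
cd "$(mktemp -d)"
mkdir -p project/logs project/scripts   # -p creates nested directories at once
echo "backup me" > project/app.conf
ls -la project                          # long listing, including hidden entries
cat project/app.conf                    # prints: backup me
rm -r project/logs                      # recursive delete -- double-check the path first
ls project                              # logs/ is gone; scripts/ and app.conf remain
```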
<h3 id="heading-text-editors">Text Editors</h3>
<p>Editing configuration files, writing scripts, and debugging are daily tasks for DevOps engineers. Linux provides multiple editors:</p>
<p><strong>Nano</strong> – Beginner-friendly terminal-based editor. Easy to learn, ideal for quick edits.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755836368645/52beb4ce-08e7-4383-a6e0-c6344faebbfa.png" alt class="image--center mx-auto" /></p>
<p><strong>Vim</strong> – Advanced editor with powerful features like macros, multi-file editing, and syntax highlighting. Essential for server administration.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755836423749/bd64d00e-cf50-4dec-bee8-e00e11b5ef4e.png" alt class="image--center mx-auto" /></p>
<p><strong>Practical Use:</strong> Text editors are critical for configuring applications, managing deployment scripts, and debugging automation pipelines.</p>
<h3 id="heading-file-permissions">File Permissions</h3>
<p>Linux file permissions control who can <strong>read, write, or execute files</strong>, maintaining system security and integrity.</p>
<p><code>ls -l</code> – Shows permissions, ownership, and file size.</p>
<p><code>chmod</code> – Changes file permissions. Example: <code>chmod 755</code> <a target="_blank" href="http://script.sh"><code>script.sh</code></a> makes it executable for everyone.</p>
<p><code>chown</code> – Changes file ownership. Example: <code>sudo chown user:group file.txt</code>.</p>
<p><img src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXfUVz8aA3m_gu2M6A69dbOPKcUjG7ZjJTs9d4pd6dEInYbbQCDyPtHxCpNh7Ff7bhFsa-dCVHccGGX33Ng9jwbykWx-mq2LLw896PYXBJCjdmncQoMh2MvaW-uGLwa4HyTviyBgMQ?key=siJEzuj68CoGbv_PBXnHww" alt /></p>
<p><strong>Practical Use in DevOps:</strong> Ensuring scripts and configuration files have the correct permissions prevents unauthorized access, accidental modification, and security vulnerabilities on production servers.</p>
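<p>Here is a small sketch of those permission commands on a throwaway script. The file name is made up, and the <code>chown</code> step is commented out because it needs root and an existing user/group:</p>

```shell
# Demo: make a script executable and inspect its permission bits
cd "$(mktemp -d)"
printf '#!/bin/sh\necho ok\n' > script.sh
chmod 755 script.sh          # rwxr-xr-x: owner full; group/others read+execute
ls -l script.sh              # first column shows -rwxr-xr-x
./script.sh                  # prints: ok
# sudo chown user:group script.sh   # ownership change (requires root)
```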
<h3 id="heading-process-monitoring-amp-control">Process Monitoring &amp; Control</h3>
<p>Processes are programs running on your system. Monitoring and controlling them ensures server stability and efficient resource usage.</p>
<p><code>ps -e</code> – Lists all running processes on the system along with their process IDs (PIDs). It’s useful for monitoring active processes, checking what’s currently running, and troubleshooting system performance issues.</p>
<p><img src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXefgKYynNBRDR4XUW2aVwOQSa08liSJC2h5T1c8sroJbjWJRFJjx_b_luX9hvo8fGYMEco6qOGPIr9w66kO-hIQg2LfdRZQ4yUMnTjyP0hajfP7dcEkeFcXOENKGCjW6Tdk45m-dA?key=siJEzuj68CoGbv_PBXnHww" alt /></p>
<p><code>ps aux | grep nginx</code> – Lists all running nginx processes with details like PID, CPU, and memory usage.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755837009173/48db9f66-37c3-449a-8e29-c99e54ee2b68.png" alt class="image--center mx-auto" /></p>
<p><code>pstree</code> – Displays running processes in a tree-like structure, showing parent–child relationships between them. It’s useful for understanding how processes are organized, spotting which processes were started by others, and troubleshooting process dependencies.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755837205358/827f8e4e-e4ea-48e2-ab0a-783717b952d4.png" alt class="image--center mx-auto" /></p>
<p><code>kill &lt;PID&gt;</code> – Gracefully stops a process (sends SIGTERM).</p>
<p><code>kill -9 &lt;PID&gt;</code> – Forcefully terminates an unresponsive process (sends SIGKILL).</p>
<p><strong>Practical Use:</strong> Identifying resource-hogging processes, restarting services, and debugging issues during deployment or automation.</p>
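<p>These commands can be rehearsed on a harmless background process before you ever touch a production service:</p>

```shell
# Demo: start a dummy process, inspect it, then stop it gracefully
sleep 300 &                     # long-running stand-in for a real service
pid=$!
ps -p "$pid"                    # confirm it is running
kill "$pid"                     # SIGTERM: ask it to stop
wait "$pid" 2>/dev/null || true # reap it; ignore the signal exit status
kill -0 "$pid" 2>/dev/null || echo "process $pid stopped"
```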
<h3 id="heading-networking-commands">Networking Commands</h3>
<p><code>ifconfig</code> – The interface configurator is a command-line tool used in Linux and Unix systems to display and configure network interfaces. It shows details like IP addresses, MAC addresses, and network status, though it has been largely replaced by the <code>ip</code> command on modern systems.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755839315890/5ef473d2-2ae2-4933-915c-acec78b071db.png" alt class="image--center mx-auto" /></p>
<p><code>ping</code> – Tests network connectivity between your system and another host (like a server or website) by sending small packets of data and measuring the response time. It’s useful for troubleshooting network issues, checking if a server is reachable, and measuring latency.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755839483099/666c4891-b97a-4d6b-8cf8-ffa3786fafd9.png" alt class="image--center mx-auto" /></p>
<p><code>dig</code> – The Domain Information Groper queries DNS servers for information about domain names, such as IP addresses, mail servers, and name servers.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755839681882/d0a381ca-c242-40e7-b8d7-241a1a76e7dc.png" alt class="image--center mx-auto" /></p>
<p><code>wget</code> – Downloads files from the internet directly to your Linux system via HTTP, HTTPS, or FTP. In DevOps, it’s useful for fetching scripts, packages, backups, or configuration files from remote servers for automation, deployment, or testing purposes.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755839792667/7f50b9ef-571f-496b-8cbe-3271057852f0.png" alt class="image--center mx-auto" /></p>
<p><strong>Practical Use:</strong> Testing network connectivity between servers, monitoring open ports for security, and automating API checks during deployments.</p>
<h3 id="heading-system-information"><strong>System Information</strong></h3>
<p><code>uname -a</code> – Displays detailed information about your system, including the kernel name, hostname, kernel version, machine hardware name, processor type, and operating system. It’s useful for quickly checking system details for troubleshooting, monitoring, or compatibility purposes.</p>
<p><code>uptime</code> – Shows how long the system has been running, the current time, the number of logged-in users, and the system load averages for the past 1, 5, and 15 minutes.</p>
<p><code>who</code> – Displays a list of users currently logged into the system, along with their login terminals, login times, and sometimes their originating IP addresses.</p>
<p><code>free -h</code> – Shows the system’s memory usage, including total, used, free, and cached memory, in a human-readable format.</p>
<p><code>df -h</code> – Shows the disk space usage of all mounted filesystems in a human-readable format (e.g., MB or GB). It’s useful for monitoring storage capacity, identifying full partitions, and managing disk resources effectively.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755840908889/e7aff732-a705-4f36-b3fa-f6fb611e6fac.png" alt class="image--center mx-auto" /></p>
<p><strong>Practical Use:</strong> DevOps engineers rely on these commands to detect bottlenecks, optimize performance, and maintain healthy servers.</p>
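<p>Together, these make a quick one-screen health check you can paste into any shell (output varies per machine, so none is shown here):</p>

```shell
# Demo: a quick system health snapshot
uname -a          # kernel and architecture
uptime            # load averages and logged-in user count
who || true       # current sessions (may be empty in containers)
free -h || true   # memory usage ('free' is Linux-specific)
df -h /           # disk usage of the root filesystem
```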
<h3 id="heading-conclusion"><strong>Conclusion:</strong></h3>
<p>Linux is the foundation of DevOps, cloud computing, and modern IT operations. From file management to process monitoring, networking, and disk management, mastering Linux commands equips engineers to automate tasks, troubleshoot issues, and manage infrastructure efficiently. Practicing these commands regularly will not only improve your confidence but also prepare you to handle real-world DevOps scenarios.</p>
]]></content:encoded></item><item><title><![CDATA[My DevOps Journey: Deploying a React App on AWS EC2 with Nginx]]></title><description><![CDATA[As part of my DevOps learning journey, I recently deployed a React application on an Ubuntu EC2 instance using Nginx as the web server. This project gave me hands-on experience with AWS cloud, Linux commands, Nginx configuration, and deploying fronte...]]></description><link>https://pujanbhattarai0.com.np/my-devops-journey-deploying-a-react-app-on-aws-ec2-with-nginx</link><guid isPermaLink="true">https://pujanbhattarai0.com.np/my-devops-journey-deploying-a-react-app-on-aws-ec2-with-nginx</guid><category><![CDATA[Devops]]></category><category><![CDATA[deployment]]></category><category><![CDATA[nginx]]></category><dc:creator><![CDATA[Pujan Bhattarai]]></dc:creator><pubDate>Thu, 21 Aug 2025 15:47:58 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1755791119891/7dc806ed-2dd3-4269-9cbc-2ba033814ee7.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>As part of my DevOps learning journey, I recently deployed a <strong>React application</strong> on an <strong>Ubuntu EC2 instance</strong> using <strong>Nginx</strong> as the web server. This project gave me hands-on experience with AWS cloud, Linux commands, Nginx configuration, and deploying frontend apps in production.</p>
<p>Here’s a detailed breakdown of what I did and what I learned.</p>
<hr />
<h2 id="heading-step-1-launching-an-ec2-instance-on-aws"><strong>Step 1: Launching an EC2 Instance on AWS</strong></h2>
<p>The first step was to create a <strong>virtual machine</strong> on AWS (called an EC2 instance).</p>
<ol>
<li><p>Logged into our AWS account and went to <strong>EC2 → Launch Instance</strong>.</p>
</li>
<li><p>Selected <strong>Ubuntu AMI (Amazon Machine Image)</strong> – since Ubuntu is stable and widely used for deployments.</p>
</li>
<li><p>Chose a <strong>Free Tier eligible instance type</strong> (t2.micro). This keeps costs at $0 while still giving us enough power to run small apps.</p>
</li>
<li><p>Created a new <strong>key pair (.pem file)</strong> which is necessary for securely logging in to the instance.</p>
</li>
<li><p>Configured the <strong>security group</strong> to:</p>
<ul>
<li><p>Allow <strong>SSH (port 22)</strong> → So we can connect from my local machine.</p>
</li>
<li><p>Allow <strong>HTTP (port 80)</strong> → So users can access our web app in a browser.</p>
</li>
</ul>
</li>
<li><p>Finally, we clicked <strong>Launch Instance</strong>.</p>
</li>
</ol>
<p>At this point, AWS gave us a <strong>public IP address</strong> for our instance.</p>
<p>To connect, we opened a terminal, navigated to the folder containing the .pem key, and ran:</p>
<pre><code class="lang-plaintext">ssh -i "key-pair.pem" ubuntu@&lt;public-ip&gt;
</code></pre>
<p>This gave us terminal access to our remote Ubuntu server.</p>
<hr />
<h2 id="heading-step-2-installing-nodejs-and-npm"><strong>Step 2: Installing Node.js and npm</strong></h2>
<p>React apps require <strong>Node.js and npm</strong> (Node Package Manager).</p>
<ol>
<li>First, we updated the server’s package lists:</li>
</ol>
<pre><code class="lang-plaintext">sudo apt update
</code></pre>
<ol start="2">
<li>Then installed Node.js and npm:</li>
</ol>
<pre><code class="lang-plaintext">sudo apt install -y nodejs npm
</code></pre>
<ol start="3">
<li>Verified installation:</li>
</ol>
<pre><code class="lang-plaintext">node -v &amp;&amp; npm -v
</code></pre>
<p>This confirmed that both Node.js and npm were available to build my React project.</p>
<hr />
<h2 id="heading-step-3-installing-and-starting-nginx"><strong>Step 3: Installing and Starting Nginx</strong></h2>
<p>Nginx is a powerful <strong>web server</strong> that can serve static files (like React build files) efficiently.</p>
<p>We installed and started it with:</p>
<pre><code class="lang-plaintext">sudo apt install -y nginx
sudo systemctl start nginx
sudo systemctl enable nginx
</code></pre>
<ul>
<li><p>start → Starts Nginx immediately.</p>
</li>
<li><p>enable → Makes sure Nginx starts automatically when the server restarts.</p>
</li>
</ul>
<p>To confirm, I opened my instance’s public IP in a browser, and I saw the <strong>default Nginx welcome page</strong>.</p>
<hr />
<h2 id="heading-step-4-cloning-my-react-app-from-github"><strong>Step 4: Cloning My React App from GitHub</strong></h2>
<p>Next, we needed an application to deploy.</p>
<ol>
<li>I forked a React app on GitHub and cloned it onto my server:</li>
</ol>
<pre><code>git clone https://github.com/23Pujan/my-react-app.git
cd my-react-app</code></pre>
<ol start="2">
<li>To personalize it, I opened the App.js file in the src folder:</li>
</ol>
<pre><code>cd src
vi App.js</code></pre>
<p>Inside, I updated the code to display:</p>
<pre><code>&lt;h2&gt;Deployed by: &lt;strong&gt;Pujan Bhattarai&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;Date: &lt;strong&gt;21/08/2025&lt;/strong&gt;&lt;/p&gt;</code></pre>
<p>This step made my deployment <strong>unique to me</strong>.</p>
<hr />
<h2 id="heading-step-5-building-the-react-app-for-production"><strong>Step 5: Building the React App for Production</strong></h2>
<p>A React app runs on a development server by default, but in production we need <strong>optimized static files</strong>.</p>
<p>So I ran:</p>
<pre><code>npm install
npm run build</code></pre>
<ul>
<li><p>npm install → Downloads all dependencies listed in package.json.</p>
</li>
<li><p>npm run build → Creates a production-ready version of the app in the build/ folder.</p>
</li>
</ul>
<p>The build/ folder contains <strong>HTML, CSS, and JavaScript files</strong> that Nginx can serve directly.</p>
<hr />
<h2 id="heading-step-6-deploying-react-app-with-nginx"><strong>Step 6: Deploying React App with Nginx</strong></h2>
<p>Now I needed to replace Nginx’s default files with the React build.</p>
<ol>
<li><p>Remove the default Nginx files:</p>
<pre><code>sudo rm -rf /var/www/html/*</code></pre>
</li>
<li><p>Copy the React build files into Nginx’s web directory:</p>
<pre><code>sudo cp -r build/* /var/www/html/</code></pre>
</li>
<li><p>Set correct ownership and permissions so Nginx can serve them:</p>
<pre><code>sudo chown -R www-data:www-data /var/www/html
sudo chmod -R 755 /var/www/html</code></pre>
</li>
</ol>
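<p>Steps 5 and 6 can be bundled into a small redeploy script so that pushing a future update becomes a single command. This is just a sketch under my own assumptions: the file name deploy.sh and the app path ~/my-react-app are choices I made for illustration, not part of the original setup.</p>

```shell
# Write a hypothetical redeploy helper (deploy.sh); the app path is assumed.
cat > deploy.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail                 # stop on the first failed command

cd "$HOME/my-react-app"           # assumed clone location
npm install                       # refresh dependencies
npm run build                     # produce optimized static files

sudo rm -rf /var/www/html/*       # clear the old deployment
sudo cp -r build/* /var/www/html/ # publish the new build
sudo chown -R www-data:www-data /var/www/html
sudo chmod -R 755 /var/www/html
EOF
chmod +x deploy.sh

# Sanity-check the script's syntax without executing it.
bash -n deploy.sh && echo "deploy.sh syntax OK"
```

<p>After the first deployment, updating the site is then just a matter of pulling the latest code and running ./deploy.sh on the server.</p>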
<hr />
<h2 id="heading-step-7-configuring-nginx-for-react"><strong>Step 7: Configuring Nginx for React</strong></h2>
<p>React is a <strong>Single Page Application (SPA)</strong>, meaning all routes should point to index.html. Without configuring this, refreshing the page could result in a <strong>404 error</strong>.</p>
<p>So I replaced Nginx’s default site configuration:</p>
<pre><code>echo 'server {
    listen 80;
    server_name _;
    root /var/www/html;
    index index.html;

    location / {
        try_files $uri /index.html;
    }

    error_page 404 /index.html;
}' | sudo tee /etc/nginx/sites-available/default &gt; /dev/null</code></pre>
<p>The try_files directive serves the requested file if it exists and falls back to index.html otherwise, which lets React handle routing on the client side.</p>
<p>Then I checked the configuration for syntax errors and restarted Nginx to apply the changes:</p>
<pre><code>sudo nginx -t
sudo systemctl restart nginx</code></pre>
<p>Now, Nginx was ready to serve my React app properly.</p>
<hr />
<h2 id="heading-step-8-testing-my-deployment"><strong>Step 8: Testing My Deployment</strong></h2>
<p>Finally, I retrieved my server’s public IP by running:</p>
<pre><code>curl ifconfig.me</code></pre>
<p>When I opened http://&lt;public-ip&gt; in my browser, my React app appeared with my custom message:</p>
<p><strong>“Deployed by: Pujan Bhattarai – Date: 21/08/2025”</strong></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1755789359561/83647344-742d-4cab-961a-d41d16f10b9c.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-key-takeaways">Key Takeaways</h2>
<p>This project gave me hands-on DevOps skills:</p>
<ul>
<li><p>How to <strong>set up and connect to an EC2 instance</strong>.</p>
</li>
<li><p>Installing <strong>Node.js, npm, and Nginx</strong> on a Linux server.</p>
</li>
<li><p>Building and serving a React app in a <strong>production-ready environment</strong>.</p>
</li>
<li><p>Configuring Nginx to support <strong>single page applications</strong>.</p>
</li>
</ul>
<p>Most importantly, I experienced the full process of taking an app from <strong>local development → cloud deployment → production hosting</strong>.</p>
<h2 id="heading-credits">Credits</h2>
<p><em>This hands-on exercise was guided by the practices shared in Pravin Mishra’s</em> <strong><em>FREE DevOps for Beginners Cohort</em></strong>. For anyone beginning their DevOps journey, it’s an excellent resource to explore and experiment with practical, end-to-end deployments.</p>
]]></content:encoded></item><item><title><![CDATA[Visualize data using Amazon QuickSight]]></title><description><![CDATA[Introduction
The ability to transform raw data into useful insights is critical in today's data-driven world. The secret to unlocking the potential in your data is data visualization. Amazon QuickSight is a cloud-based business intelligence (BI) serv...]]></description><link>https://pujanbhattarai0.com.np/visualize-data-using-amazon-quicksight</link><guid isPermaLink="true">https://pujanbhattarai0.com.np/visualize-data-using-amazon-quicksight</guid><category><![CDATA[Cloud]]></category><category><![CDATA[#data visualisation]]></category><dc:creator><![CDATA[Pujan Bhattarai]]></dc:creator><pubDate>Tue, 28 Jan 2025 11:47:34 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1738064660077/68554ce5-31b0-4081-9264-1fef9b74f2f7.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-introduction">Introduction</h2>
<p>The ability to transform raw data into useful insights is critical in today's data-driven world. The secret to unlocking the potential in your data is data visualization. Amazon QuickSight is a cloud-based business intelligence (BI) service provided by Amazon Web Services (AWS) that specializes in data visualization. It enables organizations to create interactive dashboards, analyze data, and generate insights through visualizations. QuickSight is designed to be scalable, user-friendly, and cost-effective, making it suitable for businesses of all sizes.</p>
<p>In this blog, we’ll explore the powerful integration of Amazon QuickSight and Amazon S3, showcasing how this dynamic pair can transform your data into impactful insights and compelling visualizations.</p>
<h2 id="heading-prerequisites"><strong>Prerequisites</strong></h2>
<ol>
<li><p>Have an AWS account. If you don’t have one, sign up <a target="_blank" href="https://aws.amazon.com/">here</a> for free and enjoy the benefits of the 12-month free tier.</p>
</li>
<li><p>Download the dataset from Kaggle. In this demo, I use the Netflix dataset, along with a manifest.json file that defines the data source and data import configuration in QuickSight.</p>
</li>
</ol>
<h2 id="heading-high-level-architecture">High-Level Architecture</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738042719044/682b9688-f490-4175-8f90-8803fa2291d4.jpeg" alt class="image--center mx-auto" /></p>
<h2 id="heading-store-the-dataset-in-s3-bucket">Store the Dataset in S3 Bucket</h2>
<ol>
<li><p>Open the AWS Management Console and search for S3.</p>
</li>
<li><p>Click <strong>Create bucket</strong>.</p>
</li>
<li><p>Give the bucket a unique name, leave all the default settings as they are, and click <strong>Create bucket</strong>.</p>
</li>
<li><p>Click on the bucket that looks like this:</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738043381531/40e77609-4042-4b2d-a675-3e4106cbde5d.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Upload the dataset that we downloaded earlier.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738043640378/a11ae1e1-2244-47c3-aaf1-b077af9b7c4a.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Now we have a bucket containing our netflix_titles.csv and manifest.json files. Copy the netflix_titles.csv S3 URI.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738044117226/30976f60-6243-441a-9339-5cb05b97be32.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Open the manifest.json file, replace the file location URI with the S3 URI you copied earlier, and re-upload manifest.json to your S3 bucket.</p>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738044307269/cbd28015-36c7-49f9-9beb-3ade44edf2ac.png" alt class="image--center mx-auto" /></p>
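<p>For reference, a QuickSight S3 manifest is a small JSON file that tells QuickSight which objects to import and in what format. A minimal example for this setup could look like the following (the bucket name my-quicksight-bucket is a placeholder for your own):</p>

```json
{
    "fileLocations": [
        {
            "URIs": [
                "s3://my-quicksight-bucket/netflix_titles.csv"
            ]
        }
    ],
    "globalUploadSettings": {
        "format": "CSV"
    }
}
```

<p>The URI must match the exact S3 URI of your CSV file, which is why we copy it from the bucket in the previous step.</p>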
<h2 id="heading-create-amazon-quicksight-account">Create Amazon Quicksight Account</h2>
<ol>
<li><p>Search for Amazon QuickSight in AWS management Console.</p>
</li>
<li><p>Select Sign up for QuickSight.</p>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738045829344/6c2e2158-2f3b-4306-a4fa-c9ac104c2d40.png" alt class="image--center mx-auto" /></p>
<ol start="3">
<li>Enter the email address, scroll down, and select Amazon S3. Then, click on the bucket that was created previously. Lastly, don’t forget to untick <strong>Add Pixel-Perfect Reports</strong>.</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738046175988/4335915c-7a2c-4e79-8fdf-ddc3ca934e32.png" alt class="image--center mx-auto" /></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738046311716/916eb096-9dbf-4b1d-9343-c3a52c2dd485.png" alt class="image--center mx-auto" /></p>
<ol start="4">
<li>Finally, our Amazon QuickSight account has been created. Let’s play around with it.</li>
</ol>
<h2 id="heading-connect-s3-bucket-to-amazon-quicksight">Connect S3 bucket to Amazon QuickSight</h2>
<ol>
<li><p>From the left hand navigation bar, select <strong>Datasets</strong>, then <strong>New dataset</strong>.</p>
</li>
<li><p>Select <strong>S3</strong>.</p>
</li>
<li><p>For the first field (data source name), enter netflix-data.</p>
</li>
<li><p>Ooo, there's also a second field called <strong>manifest.json URL</strong> - does manifest.json sound familiar?</p>
</li>
<li><p>Open a <strong>new tab</strong> with the AWS Management Console and head back to your S3 bucket.</p>
</li>
<li><p>Select the checkbox next to <strong>manifest.json</strong>, then select <strong>Copy S3 URI</strong>.</p>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738046959728/eb3ac122-f882-4705-a32b-c3f2d5f661fd.png" alt class="image--center mx-auto" /></p>
<ol start="7">
<li>Enter the S3 URI of our <strong>manifest.json</strong> file.</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738047021727/e380e066-6a4a-4f09-9947-45bb7d93e4ae.png" alt class="image--center mx-auto" /></p>
<ol start="8">
<li>Select Visualize and <strong>Interactive sheet</strong> to start creating visualizations.</li>
</ol>
<h2 id="heading-create-our-first-quicksight-visualisation">Create Our First QuickSight Visualisation</h2>
<p>Now we get to the creative part! With QuickSight, you can sort, filter, and customize your data to create visualizations. You can also experiment with different types of graphs like bar charts, pie charts, line graphs, etc.</p>
<p>We can see on the left hand panel that the dataset's fields are already imported.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738048103401/9d5a8df9-5923-423a-a941-1cba46e0a65e.png" alt class="image--right mx-auto mr-0" /></p>
<ol>
<li>There you can see different types of charts for creating visualizations. Drag <strong>release_year</strong> into the Y-Axis heading. Woo! Now you can see a breakdown of the years in which these Netflix-featured TV shows and movies were released.</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738048752525/be6322fa-204b-42f3-8c8a-305450f5bdbe.png" alt class="image--center mx-auto" /></p>
<ol start="2">
<li>Let's save this in a dashboard. Click on the frame surrounding our lovely donut chart, and click on the white boxes at the edges to resize it.</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738060046709/65b24e4b-470a-478d-bcce-961660a15c69.png" alt class="image--center mx-auto" /></p>
<ol start="3">
<li><p>Now let's create a new visual, select <strong>+ ADD</strong> under the <strong>Visuals</strong> heading on our middle navigation bar, and we'll see another blank frame pop out.</p>
</li>
<li><p>Drag the <strong>release_year</strong> label into the <strong>Y Axis</strong> heading.</p>
</li>
<li><p>Next, drag the <strong>type</strong> label into the <strong>Group/Color</strong> heading.</p>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738049187894/93ef1d43-ee99-42df-b3b4-4e519ad22c35.png" alt class="image--center mx-auto" /></p>
<ol start="6">
<li>This chart shows the breakdown of TV shows/movies for each release year. Change the graph type to <strong>Horizontal stacked 100% bar chart.</strong></li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738049505566/e0f0d41b-bc9f-466e-893c-dcd79c1332f4.png" alt class="image--center mx-auto" /></p>
<p>Now, let’s demonstrate the same breakdown in table format.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738050570130/15241bea-346b-4392-b052-853def7f253d.png" alt class="image--center mx-auto" /></p>
<p>Finally, we can see our dashboard looks like this.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738050651629/3d7ed2e0-340a-4b70-9b17-2ca56baa5f80.png" alt class="image--center mx-auto" /></p>
<p>Double-click on the titles of all the charts you see in front of you, and the 'Edit Title' panel will pop up. You can give each chart a name which is clear and easy to understand.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1738050997650/fb4b1f23-7fa7-43b2-b86b-e892329f8f05.png" alt class="image--center mx-auto" /></p>
<p>As a final step, in the top right-hand corner, select <strong>Publish</strong>. This publishes your dashboard so you can share it with your team.</p>
<h2 id="heading-conclusion">Conclusion</h2>
<p>In this project, we explored Amazon QuickSight’s ability to transform raw data into actionable insights through intuitive, real-time visualizations. This hands-on exploration showed how QuickSight can strengthen data-driven decision-making, while its flexibility, scalability, and cost-effectiveness make it a powerful tool for organizations looking to harness the value of their data.</p>
<h2 id="heading-resources">Resources</h2>
<p>Amazon S3: <a target="_blank" href="https://aws.amazon.com/s3/">Amazon S3 - Cloud Object Storage - AWS</a></p>
<p>Amazon QuickSight: <a target="_blank" href="https://aws.amazon.com/quicksight/?amazon-quicksight-whats-new.sort-by=item.additionalFields.postDateTime&amp;amazon-quicksight-whats-new.sort-order=desc">Business Intelligence Tools - Amazon QuickSight - AWS</a></p>
]]></content:encoded></item></channel></rss>