Full Trust European Hosting

BLOG about Full Trust Hosting and Its Technology - Dedicated to European Windows Hosting Customers

Node.js Hosting - HostForLIFE :: Node.js API Rate Limiting Explained: Token Bucket & Leaky Bucket Techniques

clock August 25, 2025 09:25 by author Peter

Rate limiting guards against abuse and evens out spikes by restricting the number of requests a client may make in a given amount of time. Without it, a buggy client or a noisy neighbor could overload your server, drive up costs, and degrade the experience for everyone. In Node.js, rate limiting is usually implemented as Express middleware, and you pick an algorithm based on your traffic patterns.

Why Rate Limit? (Simple Words)

  • Fairness: Prevent one user from hogging resources.
  • Stability: Avoid sudden traffic spikes that crash servers.
  • Security: Mitigate brute‑force login attempts and scraping.
  • Cost Control: Keep bandwidth and compute costs predictable.

Core Ideas You’ll Use

  • Identity (the key): How you group requests (e.g., by IP, API key, user ID).
  • Allowance: How many requests are allowed per window or per second.
  • Storage: Where you remember counts/tokens (in‑memory for a single instance; Redis for a cluster).
  • Backoff/Signals: How the client should slow down (HTTP 429 + headers like Retry-After).

Algorithm Overview (When to Use What)

  • Fixed Window Counter: Simple. “100 requests every 60s.” Can burst at window edges.
  • Sliding Window (Log or Rolling): Smoother than fixed. More accurate but heavier.
  • Token Bucket: Allows short bursts but enforces an average rate. Great for user‑facing APIs.
  • Leaky Bucket (Queue/Drip): Smooth, constant outflow; good when you must strictly pace downstream systems.


Baseline: Fixed Window Counter (In‑Memory)
Good as a learning step or for single‑process dev environments.
// middleware/fixedWindowLimiter.js
const WINDOW_MS = 60_000; // 60 seconds
const MAX_REQUESTS = 100; // per window per key

const store = new Map(); // key -> { count, windowStart }

function getKey(req) {
  return req.ip; // or req.headers['x-api-key'], req.user.id, etc.
}

module.exports = function fixedWindowLimiter(req, res, next) {
  const key = getKey(req);
  const now = Date.now();
  const entry = store.get(key) || { count: 0, windowStart: now };

  if (now - entry.windowStart >= WINDOW_MS) {
    entry.count = 0;
    entry.windowStart = now;
  }

  entry.count += 1;
  store.set(key, entry);

  const remaining = Math.max(0, MAX_REQUESTS - entry.count);
  res.setHeader('X-RateLimit-Limit', MAX_REQUESTS);
  res.setHeader('X-RateLimit-Remaining', remaining);
  res.setHeader('X-RateLimit-Reset', Math.ceil((entry.windowStart + WINDOW_MS) / 1000));

  if (entry.count > MAX_REQUESTS) {
    res.setHeader('Retry-After', Math.ceil((entry.windowStart + WINDOW_MS - now) / 1000));
    return res.status(429).json({ error: 'Too Many Requests' });
  }

  next();
};

Token Bucket (Burst‑friendly Average Rate)
How it works: You have a bucket that slowly refills with tokens (e.g., 5 tokens/second) up to a max capacity (burst). Each request consumes a token. No tokens? The request is limited.
// middleware/tokenBucketLimiter.js
const RATE_PER_SEC = 5;      // refill speed
const BURST_CAPACITY = 20;   // max tokens

const buckets = new Map();   // key -> { tokens, lastRefill }

function getKey(req) { return req.ip; }

module.exports = function tokenBucketLimiter(req, res, next) {
  const key = getKey(req);
  const now = Date.now();
  let bucket = buckets.get(key);
  if (!bucket) {
    bucket = { tokens: BURST_CAPACITY, lastRefill: now };
    buckets.set(key, bucket);
  }

  // Refill based on elapsed time
  const elapsedSec = (now - bucket.lastRefill) / 1000;
  bucket.tokens = Math.min(BURST_CAPACITY, bucket.tokens + elapsedSec * RATE_PER_SEC);
  bucket.lastRefill = now;

  if (bucket.tokens >= 1) {
    bucket.tokens -= 1; // consume for this request
    res.setHeader('X-RateLimit-Policy', `${RATE_PER_SEC}/sec; burst=${BURST_CAPACITY}`);
    res.setHeader('X-RateLimit-Tokens', Math.floor(bucket.tokens));
    return next();
  }

  const needed = 1 - bucket.tokens;
  const waitSeconds = needed / RATE_PER_SEC;
  res.setHeader('Retry-After', Math.ceil(waitSeconds));
  return res.status(429).json({ error: 'Too Many Requests' });
};

When to use: You want to permit quick bursts (nice UX) but keep a sustained average.

Leaky Bucket (Constant Outflow) 

How it works: Requests enter a queue (the bucket). They “leak” at a fixed rate. If the bucket is full, you reject or drop new requests.
// middleware/leakyBucketLimiter.js
const LEAK_RATE_PER_SEC = 5;    // how many requests per second can pass
const BUCKET_CAPACITY = 50;     // max queued requests

const buckets = new Map();      // key -> { queue, lastLeak }

function getKey(req) { return req.ip; }

module.exports = function leakyBucketLimiter(req, res, next) {
  const key = getKey(req);
  const now = Date.now();
  let bucket = buckets.get(key);
  if (!bucket) {
    bucket = { queue: 0, lastLeak: now };
    buckets.set(key, bucket);
  }

  // Leak based on elapsed time
  const elapsedSec = (now - bucket.lastLeak) / 1000;
  const leaked = Math.floor(elapsedSec * LEAK_RATE_PER_SEC);
  if (leaked > 0) {
    bucket.queue = Math.max(0, bucket.queue - leaked);
    bucket.lastLeak = now;
  }

  if (bucket.queue >= BUCKET_CAPACITY) {
    res.setHeader('Retry-After', 1);
    return res.status(429).json({ error: 'Too Many Requests (bucket full)' });
  }

  bucket.queue += 1; // enqueue this request
  // In practice, you would defer processing; for middleware demo we let it pass immediately
  next();
};

When to use: You must strictly pace downstream dependencies (e.g., payment gateway rate caps).

Wiring It Up in Express
// server.js
const express = require('express');
const fixedWindowLimiter = require('./middleware/fixedWindowLimiter');
const tokenBucketLimiter = require('./middleware/tokenBucketLimiter');
// const leakyBucketLimiter = require('./middleware/leakyBucketLimiter');

const app = express();

// Example: apply global limiter
app.use(tokenBucketLimiter);

// Or apply per‑route
app.get('/public', fixedWindowLimiter, (req, res) => res.send('ok'));
app.get('/payments', /* leakyBucketLimiter, */ (req, res) => res.send('paid'));

app.listen(3000, () => console.log('API on :3000'));


Production‑Ready Storage with Redis

In clustered or serverless environments, in‑memory maps don’t work across instances. Use a shared store like Redis to coordinate limits.
// middleware/redisTokenBucket.js
const IORedis = require('ioredis');
const redis = new IORedis(process.env.REDIS_URL);

const RATE_PER_SEC = 10;
const BURST_CAPACITY = 40;

function keyFor(clientKey) { return `rl:tb:${clientKey}`; }

module.exports = async function redisTokenBucket(req, res, next) {
  try {
    const clientKey = req.ip; // replace with API key or user id in real apps
    const now = Date.now();
    const k = keyFor(clientKey);

    // Read bucket state
    const data = await redis.hmget(k, 'tokens', 'lastRefill');
    let tokens = parseFloat(data[0]);
    let lastRefill = parseInt(data[1], 10);

    if (Number.isNaN(tokens)) tokens = BURST_CAPACITY;
    if (Number.isNaN(lastRefill)) lastRefill = now;

    const elapsedSec = (now - lastRefill) / 1000;
    tokens = Math.min(BURST_CAPACITY, tokens + elapsedSec * RATE_PER_SEC);

    if (tokens >= 1) {
      tokens -= 1;
      await redis.hmset(k, 'tokens', tokens, 'lastRefill', now);
      await redis.expire(k, Math.ceil(BURST_CAPACITY / RATE_PER_SEC) + 60);
      res.setHeader('X-RateLimit-Policy', `${RATE_PER_SEC}/sec; burst=${BURST_CAPACITY}`);
      res.setHeader('X-RateLimit-Tokens', Math.floor(tokens));
      return next();
    }

    const needed = 1 - tokens;
    const waitSeconds = needed / RATE_PER_SEC;
    res.setHeader('Retry-After', Math.ceil(waitSeconds));
    return res.status(429).json({ error: 'Too Many Requests' });
  } catch (err) {
    // Fail‑open or fail‑closed? Choose policy. Here we fail‑open so API stays usable.
    console.error('Rate limiter error', err);
    next();
  }
};
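
One caveat: the read-modify-write above is not atomic, so two app instances can interleave between HMGET and HMSET and both consume the same token. A hedged sketch of one way to close that gap is to move the refill-and-consume step into a small Lua script that Redis runs atomically (this assumes Redis 4+ for multi-field HSET; the file name and constants are illustrative, not part of the code above):
// middleware/redisTokenBucketAtomic.js: a sketch, not the article's original middleware
const IORedis = require('ioredis');
const redis = new IORedis(process.env.REDIS_URL);

const RATE_PER_SEC = 10;
const BURST_CAPACITY = 40;

// Refill, try to consume one token, and persist state in a single atomic EVAL
const TOKEN_BUCKET_LUA = `
local tokens = tonumber(redis.call('HGET', KEYS[1], 'tokens'))
local last   = tonumber(redis.call('HGET', KEYS[1], 'lastRefill'))
local rate, capacity, now = tonumber(ARGV[1]), tonumber(ARGV[2]), tonumber(ARGV[3])
if tokens == nil then tokens = capacity end
if last == nil then last = now end
tokens = math.min(capacity, tokens + ((now - last) / 1000) * rate)
local allowed = 0
if tokens >= 1 then tokens = tokens - 1; allowed = 1 end
redis.call('HSET', KEYS[1], 'tokens', tokens, 'lastRefill', now)
redis.call('EXPIRE', KEYS[1], math.ceil(capacity / rate) + 60)
return allowed
`;

module.exports = async function redisTokenBucketAtomic(req, res, next) {
  try {
    const allowed = await redis.eval(
      TOKEN_BUCKET_LUA, 1, `rl:tb:${req.ip}`,
      RATE_PER_SEC, BURST_CAPACITY, Date.now()
    );
    if (allowed === 1) return next();
    res.setHeader('Retry-After', 1);
    return res.status(429).json({ error: 'Too Many Requests' });
  } catch (err) {
    console.error('Rate limiter error', err);
    next(); // fail-open, matching the policy above
  }
};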


Testing Your Limiter (Quick Ideas)

  • Unit tests: Simulate timestamps and assert counters/tokens (see the sketch after this list).
  • Load tests: Use autocannon or k6 to verify 429 rates, latencies, and headers.
  • Chaos tests: Kill Redis or introduce latency—does your API fail open or closed?
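
Here is a minimal unit-test sketch for the token bucket middleware above, assuming Jest as the test runner; it fakes the clock with jest.spyOn(Date, 'now') so refills are deterministic (the fake res object and file path are illustrative):
// test/tokenBucketLimiter.test.js: a sketch, assumes Jest and the middleware shown above
const tokenBucketLimiter = require('../middleware/tokenBucketLimiter');

// Tiny fake req/res/next harness so the middleware can run without Express
function run(limiter, req) {
  const headers = {};
  let statusCode = 200;
  const res = {
    setHeader: (name, value) => { headers[name] = value; },
    status: (code) => { statusCode = code; return res; },
    json: () => res,
  };
  let allowed = false;
  limiter(req, res, () => { allowed = true; });
  return { allowed, statusCode, headers };
}

test('allows a burst, rejects the excess, then refills over time', () => {
  const nowSpy = jest.spyOn(Date, 'now').mockReturnValue(0);
  const req = { ip: '1.2.3.4' };

  // BURST_CAPACITY (20) requests pass at t=0
  for (let i = 0; i < 20; i++) {
    expect(run(tokenBucketLimiter, req).allowed).toBe(true);
  }
  // The 21st is rejected with 429
  expect(run(tokenBucketLimiter, req).statusCode).toBe(429);

  // One simulated second later, ~RATE_PER_SEC tokens have refilled
  nowSpy.mockReturnValue(1000);
  expect(run(tokenBucketLimiter, req).allowed).toBe(true);

  nowSpy.mockRestore();
});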


Helpful HTTP Headers
Return clear metadata so clients can self‑throttle:

  • X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset
  • Retry-After on 429
  • (Optional, standardized) RateLimit-Limit, RateLimit-Remaining, RateLimit-Reset

Best Practices & Tips

  • Choose the key wisely: Prefer API key/user ID over raw IP (NATs/proxies share IPs).
  • Protect sensitive routes more: e.g., logins: 5/min per user + per IP (see the sketch after this list).
  • Combine with caching & auth: Rate limit after auth to identify the true principal.
  • Use Redis for scale: In‑memory only works on a single instance.
  • Expose headers & docs: Tell clients how to back off.
  • Observe: Log 429s, export metrics (Prometheus) and set alerts.
  • Legal & UX: Don’t silently drop; return 429 with guidance.
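
As a hedged sketch of stacking stricter limits on a login route, the fixed-window idea can be wrapped in a small keyed factory; the loginHandler, route, and numbers below are illustrative, not part of the article's code:
// limiters/loginLimiters.js: a keyed fixed-window factory (sketch only)
function createFixedWindowLimiter({ windowMs, max, keyFn }) {
  const store = new Map(); // key -> { count, windowStart }
  return (req, res, next) => {
    const key = keyFn(req);
    const now = Date.now();
    const entry = store.get(key) || { count: 0, windowStart: now };
    if (now - entry.windowStart >= windowMs) {
      entry.count = 0;
      entry.windowStart = now;
    }
    entry.count += 1;
    store.set(key, entry);
    if (entry.count > max) {
      res.setHeader('Retry-After', Math.ceil((entry.windowStart + windowMs - now) / 1000));
      return res.status(429).json({ error: 'Too Many Requests' });
    }
    next();
  };
}

// 5 attempts per minute per IP, and 5 per minute per submitted username
const loginByIp = createFixedWindowLimiter({ windowMs: 60_000, max: 5, keyFn: (req) => req.ip });
const loginByUser = createFixedWindowLimiter({
  windowMs: 60_000,
  max: 5,
  keyFn: (req) => (req.body && req.body.username) || req.ip,
});

// loginHandler is your existing handler; express.json() is needed so req.body.username exists
app.post('/login', express.json(), loginByIp, loginByUser, loginHandler);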

Choosing an Algorithm (Cheat Sheet)

  • Public API with bursts OK: Token Bucket
  • Strict pacing to external vendor: Leaky Bucket
  • Simple per‑minute cap: Fixed/Sliding Window
  • High accuracy under spiky traffic: Sliding Window (rolling)

Summary
Rate limiting is essential for reliable Node.js APIs. Start by defining who you limit (key), how much (policy), and where you store state (Redis for multi‑instance). Pick an algorithm that matches your needs: fixed/sliding windows for simplicity, a token bucket for burst‑friendly average rates, or a leaky bucket for steady pacing. Implement as Express middleware, return helpful headers, test under load, and monitor 429s. With these patterns, your API stays fast, fair, and resilient—even during traffic spikes.



Node.js Hosting - HostForLIFE :: What Are Node.js's Typical Use Cases?

clock August 20, 2025 08:11 by author Peter

Why Is Node.js Popular?
Node.js is fast, event-driven, and non-blocking, which means it can handle many tasks at the same time without slowing down. This makes it a popular choice for developers who need scalable and efficient applications.

 


Building APIs
Node.js is commonly used to build RESTful or GraphQL APIs. APIs allow different applications or services to communicate with each other.

Example
const express = require('express');
const app = express();
app.use(express.json());

app.get('/users', (req, res) => {
  res.json([{ id: 1, name: 'Alice' }, { id: 2, name: 'Bob' }]);
});

app.listen(3000, () => {
  console.log('API server running on port 3000');
});


Node.js handles multiple API requests at the same time, making it suitable for backend services.

Real-Time Applications
Node.js is perfect for real-time apps such as chat applications, online games, or collaborative tools because it supports fast, two-way communication using WebSockets.

Example
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', ws => {
  ws.send('Welcome!');
  ws.on('message', message => {
    console.log(`Received: ${message}`);
  });
});


WebSockets allow the server and client to communicate instantly, making real-time interactions possible.
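
For completeness, here is a minimal client-side sketch that talks to the server above using the same ws package from Node (in a browser you would use the built-in WebSocket object instead):
// client.js: a Node client for the server above (assumes `npm install ws`)
const WebSocket = require('ws');
const ws = new WebSocket('ws://localhost:8080');

ws.on('open', () => {
  ws.send('Hello from the client'); // arrives in the server's `message` handler
});

ws.on('message', (message) => {
  console.log(`Server says: ${message}`); // prints the "Welcome!" greeting first
});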

Streaming Applications
Node.js is ideal for streaming audio, video, or large files efficiently because it processes data in chunks.

Example
const fs = require('fs');
const http = require('http');

http.createServer((req, res) => {
  const stream = fs.createReadStream('large-video.mp4');
  stream.pipe(res);
}).listen(3000, () => {
  console.log('Streaming server running on port 3000');
});


Streams send data in small pieces, preventing memory overload and improving performance.

Microservices

Node.js works well for microservices, where an application is divided into small, independent services that handle specific tasks.

Example
const express = require('express');
const app = express();
app.use(express.json());

app.post('/orders', (req, res) => {
  const order = req.body;
  res.json({ message: 'Order created', order });
});

app.listen(4000, () => {
  console.log('Order microservice running on port 4000');
});

Each microservice handles a specific domain, communicates via APIs, and can be scaled independently.
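
As a sketch of that API-based communication, another service (or an API gateway) can call the order microservice over HTTP. This assumes Node 18+ for the built-in fetch; the /checkout route and port 5000 are illustrative:
// gateway.js: another service calling the order microservice over HTTP
const express = require('express');
const app = express();
app.use(express.json());

app.post('/checkout', async (req, res) => {
  // Forward the order to the order microservice; each service scales independently
  const response = await fetch('http://localhost:4000/orders', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(req.body),
  });
  const result = await response.json();
  res.json({ message: 'Checkout complete', order: result.order });
});

app.listen(5000, () => {
  console.log('Checkout service running on port 5000');
});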

Summary
Node.js is widely used for APIs, real-time applications, streaming services, and microservices. Its event-driven, non-blocking architecture allows developers to handle multiple tasks efficiently, making it perfect for scalable and responsive applications. Understanding these use cases helps developers choose Node.js for projects requiring speed, performance, and easy scalability.

HostForLIFE.eu Node.js Hosting
HostForLIFE.eu is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes. We have customers from around the globe, spread across every continent. We serve the hosting needs of the business and professional, government and nonprofit, entertainment and personal use market segments.



Node.js Hosting - HostForLIFE :: What is the Event Loop in Node.js, and How Does It Work?

clock August 14, 2025 07:41 by author Peter

Developers often discover that the Event Loop is the secret behind Node.js's ability to manage thousands of concurrent operations despite being single-threaded. Even with a single main thread, this approach ensures Node.js runs code efficiently without blocking other work.

The Reason for the Event Loop
JavaScript was originally created to handle keystrokes and clicks on interactive web pages running in the browser. The browser's event loop guarantees fluid interactions without freezing the user interface. Node.js brought JavaScript to the server side, where it handles I/O tasks such as sending network requests, reading files, and querying databases. Thanks to the Event Loop, these can be managed without halting the execution of other code.

How the Event Loop Works in Node.js
The Event Loop is managed by libuv, a C library that provides asynchronous I/O. Here’s the step-by-step process:

  • Call Stack Execution: Node.js runs your synchronous code first.
  • Delegating Tasks: When asynchronous functions like setTimeout or fs.readFile are called, they are handed over to background APIs or the thread pool.
  • Callback Queue: Once the background task is done, its callback is added to the queue.
  • Event Loop Processing: The event loop checks if the call stack is empty and then pushes the next callback from the queue to be executed.

Event Loop Phases
The Node.js Event Loop runs in phases:

  • Timers: Executes callbacks from setTimeout and setInterval.
  • Pending Callbacks: Executes callbacks for system operations.
  • Idle, Prepare: Internal use only.
  • Poll: Retrieves new I/O events; executes I/O callbacks.
  • Check: Executes setImmediate callbacks.
  • Close Callbacks: Executes close events (e.g., socket.on('close')).


Microtasks (like process.nextTick() and resolved promises) run between these phases, before moving to the next phase.

Example: Event Loop in Action
Example:
console.log("Start");

setTimeout(() => {
  console.log("Timeout callback");
}, 0);

Promise.resolve().then(() => {
  console.log("Promise callback");
});

console.log("End");


Output:

  • Start
  • End
  • Promise callback
  • Timeout callback

Explanation:
Promise callback runs before Timeout callback because promises are microtasks, which have higher priority than macrotasks like setTimeout.

Understanding Microtasks vs. Macrotasks
Microtasks: process.nextTick(), Promise.then(). Run immediately after the current operation.
Macrotasks: setTimeout(), setImmediate(), I/O callbacks. Run in the normal event loop phases.
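
A small experiment makes these priorities visible. Inside an I/O callback the ordering is deterministic, so the sketch below (run as a plain Node script) should print nextTick, promise, setImmediate, setTimeout in that order:
const fs = require('fs');

fs.readFile(__filename, () => {
  // We are now inside an I/O callback (poll phase)
  setTimeout(() => console.log('setTimeout'), 0);   // timers phase, next loop iteration
  setImmediate(() => console.log('setImmediate'));  // check phase, this iteration
  process.nextTick(() => console.log('nextTick'));  // drained before anything else queued here
  Promise.resolve().then(() => console.log('promise')); // microtask, right after nextTick
});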

Key Points to Remember
  • Node.js is single-threaded for JavaScript execution.
  • The Event Loop allows asynchronous, non-blocking operations.
  • Microtasks always run before the next macrotask.
  • libuv handles background tasks and the thread pool.

Summary
The Event Loop is the heart of Node.js's asynchronous programming model. It ensures that even though JavaScript runs on a single thread, Node.js can handle thousands of concurrent tasks without blocking. By delegating I/O operations to the background and using a queue system for callbacks, it keeps applications fast and responsive. Understanding the Event Loop is essential for writing efficient Node.js applications.



Node.js Hosting - HostForLIFE :: Uploading Files from React to Cloudinary: A Comprehensive Guide with Secure Backend, Progress, and Preview (2025)

clock July 28, 2025 07:24 by author Peter

The ability to upload files such as documents, videos, and images is a common feature of modern web applications. In this tutorial, we'll show how to use Cloudinary, a powerful media management platform, to build a robust file upload feature in a React (v18+) application, including secure backend-signed uploads with Node.js (Express). Using functional components, React Hooks, and current best practices, we'll create a reusable upload component that supports a variety of file types, shows a preview when possible, tracks upload progress, and uploads media securely with a backend-generated signature.

 

What is Cloudinary?
Cloudinary is a cloud-based service for storing, optimizing, and delivering images, videos, and other media files. It simplifies media handling by providing:

  • Media upload and storage
  • CDN delivery and transformation
  • Automatic optimization and responsive images
  • Support for multiple media types

What Will We Build?
A full-stack app (React + Node.js) that:

  • Accepts images, videos, and documents as input
  • Shows previews for image/video types
  • Tracks upload progress
  • Generates a secure upload signature on the backend
  • Uploads securely to Cloudinary

Project Structure

 

cloudinary-react-upload/
├── client/            # React frontend
│   ├── src/
│   │   ├── components/FileUploader.jsx
│   │   ├── App.jsx
│   │   └── main.jsx
│   └── .env
├── server/            # Node.js backend
│   ├── index.js
│   └── .env
├── package.json (root - manages both client/server via scripts)

 

Step 1. Cloudinary Setup

  • Sign up at cloudinary.com
  • Go to your dashboard and note:
    • Cloud Name
    • API Key
    • API Secret
  • Navigate to Settings > Upload > Upload Presets
    • Create a new signed preset
    • Enable "Auto format" and "Auto resource type"

Backend .env (in server/.env)
CLOUD_NAME=your_cloud_name
CLOUD_API_KEY=your_api_key
CLOUD_API_SECRET=your_api_secret
UPLOAD_PRESET=your_signed_preset

Step 2: Backend Setup with Node.js (Express)
Install dependencies
cd server
npm init -y
npm install express dotenv cors cloudinary

server/index.js
import express from 'express';
import cors from 'cors';
import dotenv from 'dotenv';
import { v2 as cloudinary } from 'cloudinary';

dotenv.config();
const app = express();
app.use(cors());

cloudinary.config({
  cloud_name: process.env.CLOUD_NAME,
  api_key: process.env.CLOUD_API_KEY,
  api_secret: process.env.CLOUD_API_SECRET
});

app.get('/get-signature', (req, res) => {
  const timestamp = Math.floor(Date.now() / 1000);
  const signature = cloudinary.utils.api_sign_request(
    {
      timestamp,
      upload_preset: process.env.UPLOAD_PRESET,
    },
    process.env.CLOUD_API_SECRET
  );

  res.json({
    timestamp,
    signature,
    cloudName: process.env.CLOUD_NAME,
    apiKey: process.env.CLOUD_API_KEY,
    uploadPreset: process.env.UPLOAD_PRESET,
  });
});

const PORT = process.env.PORT || 4000;
app.listen(PORT, () => console.log(`Server running on port ${PORT}`));


Run the backend (index.js uses ES module import syntax, so add "type": "module" to server/package.json or rename the file to index.mjs):
node index.js

Step 3. React Frontend Setup (Vite)
Create project and install dependencies:
npm create vite@latest client -- --template react
cd client
npm install axios

Frontend .env (in client/.env)
VITE_API_URL=http://localhost:4000

Step 4. FileUploader Component (Secure Upload)

client/src/components/FileUploader.jsx

import { useState, useRef } from 'react';
import axios from 'axios';

const FileUploader = () => {
  const [file, setFile] = useState(null);
  const [previewUrl, setPreviewUrl] = useState(null);
  const [progress, setProgress] = useState(0);
  const [uploadedUrl, setUploadedUrl] = useState(null);
  const inputRef = useRef();

  const handleFileChange = (e) => {
    const selected = e.target.files[0];
    setFile(selected);
    setUploadedUrl(null);
    setProgress(0);

    if (selected?.type.startsWith('image') || selected?.type.startsWith('video')) {
      const url = URL.createObjectURL(selected);
      setPreviewUrl(url);
    } else {
      setPreviewUrl(null);
    }
  };

  const handleUpload = async () => {
    if (!file) return;

    try {
      const { data: signatureData } = await axios.get(`${import.meta.env.VITE_API_URL}/get-signature`);

      const formData = new FormData();
      formData.append('file', file);
      formData.append('api_key', signatureData.apiKey);
      formData.append('timestamp', signatureData.timestamp);
      formData.append('upload_preset', signatureData.uploadPreset);
      formData.append('signature', signatureData.signature);

      const { data } = await axios.post(
        `https://api.cloudinary.com/v1_1/${signatureData.cloudName}/auto/upload`,
        formData,
        {
          onUploadProgress: (e) => {
            const percent = Math.round((e.loaded * 100) / e.total);
            setProgress(percent);
          },
        }
      );

      setUploadedUrl(data.secure_url);
      inputRef.current.value = null;
    } catch (err) {
      console.error('Upload failed:', err);
      alert('Upload failed. Check console.');
    }
  };

  return (
    <section style={{ padding: '1rem' }}>
      <h2>Secure File Upload to Cloudinary</h2>

      <input
        ref={inputRef}
        type="file"
        accept="image/*,video/*,.pdf,.doc,.docx"
        onChange={handleFileChange}
      />

      {previewUrl && file?.type.startsWith('image') && (
        <img src={previewUrl} alt="Preview" width={200} style={{ marginTop: '1rem' }} />
      )}

      {previewUrl && file?.type.startsWith('video') && (
        <video width={300} controls style={{ marginTop: '1rem' }}>
          <source src={previewUrl} type={file.type} />
        </video>
      )}

      {!previewUrl && file && (
        <p style={{ marginTop: '1rem' }}>Selected File: {file.name}</p>
      )}

      <button onClick={handleUpload} disabled={!file} style={{ marginTop: '1rem' }}>
        Upload
      </button>

      {progress > 0 && <p>Progress: {progress}%</p>}

      {uploadedUrl && (
        <div style={{ marginTop: '1rem' }}>
          <p>Uploaded Successfully!</p>
          <a href={uploadedUrl} target="_blank" rel="noopener noreferrer">View File</a>
        </div>
      )}
    </section>
  );
};

export default FileUploader;


Step 5. Use Component in App
client/src/App.jsx

import FileUploader from './components/FileUploader';

function App() {
  return (
    <div style={{ maxWidth: '600px', margin: '0 auto', fontFamily: 'sans-serif' }}>
      <h1>Cloudinary File Uploader</h1>
      <FileUploader />
    </div>
  );
}

export default App;

Why Use Signed Uploads?

Cloudinary offers two ways to upload files:

  • Unsigned Uploads: Anyone with your upload preset can upload files. Not recommended for production because it's insecure.
  • Signed Uploads (used in this guide): The backend signs each upload request using your Cloudinary secret key, making it secure. This ensures:
    • Files are uploaded only by authenticated users (if you add auth)
    • Upload presets can't be abused
    • You have more control over what's uploaded

Best Practices

  • Use /auto/upload endpoint to auto-detect file type (image/video/raw)
  • Don’t expose Cloudinary secret API keys in frontend
  • Limit file size on client and/or backend (see the sketch below)
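
For the client side, here is a minimal sketch of enforcing a size cap inside handleFileChange before any request is made; the 10 MB limit is illustrative, and your backend or Cloudinary plan may impose its own caps:
// A size guard at the top of handleFileChange, before any state is set
const MAX_FILE_SIZE_MB = 10; // illustrative limit

const handleFileChange = (e) => {
  const selected = e.target.files[0];
  if (selected && selected.size > MAX_FILE_SIZE_MB * 1024 * 1024) {
    alert(`File is too large. Maximum size is ${MAX_FILE_SIZE_MB} MB.`);
    e.target.value = null; // reset the input so the same file can be re-selected
    return;
  }
  // ...continue with the preview and state logic from FileUploader above
};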

Supported File Types
Cloudinary accepts:

  • Images: jpg, png, webp, etc.
  • Videos: mp4, mov, avi
  • Documents: pdf, doc, docx, txt (uploaded as raw)

Conclusion
In this post, we developed a modern React file uploader that integrates smoothly with Cloudinary. It offers a secure, production-ready starting point with preview support, progress tracking, and handling for a variety of file types. Blogs, admin panels, profile setups, and CMSs can all make use of this uploader. For more advanced use cases, consider adding authentication in front of the signature endpoint or exploring Cloudinary's transformation capabilities.

HostForLIFE.eu Node.js Hosting
HostForLIFE.eu is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes. We have customers from around the globe, spread across every continent. We serve the hosting needs of the business and professional, government and nonprofit, entertainment and personal use market segments.



AngularJS Hosting Europe - HostForLIFE :: Angular Subscription Management: Using RxJS to Fix Memory Leaks

clock July 21, 2025 08:26 by author Peter

Angular uses RxJS Observables extensively for asynchronous data: HTTP requests, form value changes, events, route parameters, and more. Most of the time you subscribe to them, but failing to unsubscribe properly can cause memory leaks and unexpected behavior, especially in large or long-running apps. In this article, we share how we ran into a real-life issue caused by missed unsubscriptions, how we identified the leak, and how we applied best practices such as the takeUntil operator and a reusable base class.

Real Scenario
In various dashboard applications, several components listened to data streams coming from the API, user interactions, and route parameter changes. These components subscribed to observables with the RxJS subscribe() method inside the ngOnInit() lifecycle hook.

Example
ngOnInit(): void {
  this.route.params.subscribe(params => {
    this.loadData(params['id']);
  });

  this.userService.getUser().subscribe(user => {
    this.user = user;
  });
}


After navigating between routes multiple times, we noticed the following issues.

  • Console logs appeared multiple times for the same action.
  • Network requests were duplicated.
  • The browser’s memory usage slowly increased over time.

Root Cause

Upon inspection using the Chrome DevTools Memory tab and Angular DevTools, we found that components were not being garbage collected. Active subscriptions were holding references to destroyed components.

Solution: Using the takeUntil Pattern with a Subject

To fix this, we implemented the takeUntil pattern with a private Subject.

Step 1. Declare an Unsubscribe Subject.
private destroy$ = new Subject<void>();

Step 2. Use takeUntil(this.destroy$) in Every Subscription.
ngOnInit(): void {
  this.route.params
    .pipe(takeUntil(this.destroy$))
    .subscribe(params => this.loadData(params['id']));

  this.userService.getUser()
    .pipe(takeUntil(this.destroy$))
    .subscribe(user => this.user = user);
}


Step 3. Emit and complete the Subject in ngOnDestroy().
ngOnDestroy(): void {
  this.destroy$.next();
  this.destroy$.complete();
}

This pattern ensures that all subscriptions automatically unsubscribe when the component is destroyed.

Improvement: Create a Base Component Class

To avoid repeating the same code in every component, we created a base class.
export abstract class BaseComponent implements OnDestroy {
  protected destroy$ = new Subject<void>();

  ngOnDestroy(): void {
    this.destroy$.next();
    this.destroy$.complete();
  }
}


Now, in any component.
export class MyComponent extends BaseComponent implements OnInit {

  ngOnInit(): void {
    this.dataService.getData()
      .pipe(takeUntil(this.destroy$))
      .subscribe(data => this.data = data);
  }

}


Alternative Approach: AsyncPipe for Simpler Cases
In cases where we can bind observables directly in the template, we prefer using Angular’s AsyncPipe, which handles subscription and unsubscription automatically.

Instead of,
this.dataService.getData().subscribe(data => {
  this.data = data;
});


Expose the observable in the component and bind it in the template with the async pipe.
data$ = this.dataService.getData();

<div *ngIf="data$ | async as data">
  {{ data.name }}
</div>


Conclusion
Failing to unsubscribe from observables in Angular can lead to performance issues, duplicate API calls, and memory leaks. Using takeUntil with a Subject is a reliable and scalable solution, especially when combined with a base component class. For simpler use cases, Angular's AsyncPipe provides a clean and safe way to handle subscriptions in templates. Adhering to these practices keeps your Angular applications running smoothly, easy to maintain, and free of the memory leaks that degrade performance. You will maintain both efficiency and code clarity as a result.



AngularJS Hosting Europe - HostForLIFE :: Using Angular Route Guards to Secure Routes

clock July 16, 2025 10:07 by author Peter

Depending on whether a user is logged in or has particular permissions, we frequently need to limit access to particular routes in Angular apps. Angular offers Route Guards, like CanActivate, to protect certain routes. In one of our projects, we had to prevent users from accessing the dashboard unless they were authenticated. We created an AuthGuard with CanActivate and added logic to check if the user token was locally stored. Everything was running fine until we released the app.


 Some users claimed that they were repeatedly taken to the login screen even if they were already logged in.

Reasons for the Problem
We found that timing was the cause. The app had already called the API to validate the token at startup, but the guard checked the authentication state before that validation finished, so it treated the user as not authenticated.

How did we fix it?

Our AuthGuard logic has been modified to wait for validation to complete before granting or denying access. We used a shared AuthService with an isAuthenticated$ observable rather than just checking local storage.

Here’s how we adjusted the AuthGuard.
canActivate(): Observable<boolean> {
  return this.authService.isAuthenticated$.pipe(
    take(1),
    map(isAuth => {
      if (!isAuth) {
        this.router.navigate(['/login']);
        return false;
      }
      return true;
    })
  );
}


And in the AuthService, we updated the token status using a BehaviorSubject once the API response came back.
private authStatus = new BehaviorSubject<boolean>(false);

isAuthenticated$ = this.authStatus.asObservable();

validateToken() {
  // Call backend to validate token
  this.http.get('/api/validate-token').subscribe(
    () => this.authStatus.next(true),
    () => this.authStatus.next(false)
  );
}

We called validateToken() once in AppComponent during app initialization.

Conclusion
Route guards are essential for secure routing in Angular apps. But they must be carefully integrated with authentication logic, especially when token validation involves an async call. Using an observable approach helps in handling real-time state and avoiding premature navigation decisions.



Node.js Hosting - HostForLIFE :: Building a Simple REST API in Node.js (GET, POST, PUT, DELETE)

clock July 10, 2025 08:19 by author Peter

Node.js is a robust runtime for server-side JavaScript application development. GET, POST, PUT, and DELETE are the HTTP methods behind every CRUD (Create, Read, Update, Delete) operation, and in this article we'll show you how to create a basic REST API that supports them.

To keep things clear and easy, we'll use the well-liked Express framework.

Step 1. Start a Node.js Project
To begin, make a new folder for your project and set up npm for it.

mkdir nodejs-api-demo
cd nodejs-api-demo
npm init -y


Then, install Express,
npm install express

Step 2. Create the API Server
Create a file named server.js and write the following code.
const express = require('express');
const app = express();
const port = 3000;

app.use(express.json()); // Middleware to parse JSON
let items = []; // Sample in-memory data store
// GET all items
app.get('/api/items', (req, res) => {
    res.json(items);
});
// POST a new item
app.post('/api/items', (req, res) => {
    const newItem = {
        id: Date.now(),
        name: req.body.name
    };
    items.push(newItem);
    res.status(201).json(newItem);
});

// PUT (update) an item by ID
app.put('/api/items/:id', (req, res) => {
    const id = parseInt(req.params.id);
    const index = items.findIndex(item => item.id === id);

    if (index !== -1) {
        items[index].name = req.body.name;
        res.json(items[index]);
    } else {
        res.status(404).json({ message: 'Item not found' });
    }
});
// DELETE an item by ID
app.delete('/api/items/:id', (req, res) => {
    const id = parseInt(req.params.id);
    const initialLength = items.length;
    items = items.filter(item => item.id !== id);

    if (items.length < initialLength) {
        res.json({ message: 'Item deleted' });
    } else {
        res.status(404).json({ message: 'Item not found' });
    }
});
// Hello World Endpoint
app.get('/', (req, res) => {
    res.send('Hello, World!');
});
app.listen(port, () => {
    console.log(`Server running at http://localhost:${port}`);
});


Step 3. Run Your API
Start your server using:

node server.js

Open your browser and go to http://localhost:3000/. You should see "Hello, World!"

Use tools like Postman, Insomnia, or curl to test other endpoints:

  • GET http://localhost:3000/api/items
  • POST http://localhost:3000/api/items with JSON body: { "name": "Apple" }
  • PUT http://localhost:3000/api/items/123456789 with JSON: { "name": "Banana" }
  • DELETE http://localhost:3000/api/items/123456789
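
Alternatively, here is a hedged sketch that exercises the same endpoints from a Node script using the built-in fetch (requires Node 18+; run it in a second terminal while the server is up):
// test-api.js: exercise the endpoints with Node 18+'s built-in fetch
const BASE = 'http://localhost:3000/api/items';

async function main() {
  // POST a new item
  const created = await (await fetch(BASE, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: 'Apple' }),
  })).json();
  console.log('Created:', created);

  // GET all items
  console.log('All items:', await (await fetch(BASE)).json());

  // PUT (rename) the item we just created
  console.log('Updated:', await (await fetch(`${BASE}/${created.id}`, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: 'Banana' }),
  })).json());

  // DELETE it
  console.log('Deleted:', await (await fetch(`${BASE}/${created.id}`, { method: 'DELETE' })).json());
}

main().catch(console.error);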

Summary
In this tutorial, you learned how to:

  • Initialize a Node.js project
  • Install and use Express.js
  • Handle JSON data and HTTP methods (GET, POST, PUT, DELETE)
  • Build a basic in-memory CRUD API

This setup is great for learning. For production, consider connecting to a real database, such as MongoDB, PostgreSQL, or MySQL.


Node.js Hosting - HostForLIFE :: Knowing Node.js's Event Loop, Callbacks, and Promises

clock July 3, 2025 08:00 by author Peter

The ability of Node.js to manage asynchronous processes is a major factor in its strength for creating quick, scalable network applications. You've probably heard of concepts like promises, callbacks, and event loops if you've ever dealt with Node.js.

In this post, we'll simplify these, go over actual code samples, and discuss how they all work together in the context of asynchronous programming.

Asynchronous programming: what is it?
JavaScript (and therefore Node.js) is single-threaded, so it handles one instruction at a time. If a long-running operation such as reading a file or sending a network request is not handled asynchronously, it blocks everything else from running. Asynchronous programming lets Node.js manage operations like file I/O, database queries, and API calls without stalling the whole application.

The Event Loop: Node's Brain
The event loop is a mechanism in Node.js that allows it to perform non-blocking I/O operations. It listens for events and runs tasks from different queues (like timers or promise resolutions).

How does it work?

  • Executes synchronous code first.
  • Handles microtasks (like Promise.then()).
  • Then processes macrotasks (like setTimeout).
  • Repeats the cycle.

Example
console.log('Start');

setTimeout(() => {
  console.log('Timeout callback');
}, 0);

Promise.resolve().then(() => {
  console.log('Promise callback');
});
console.log('End');


Output
Start
End
Promise callback
Timeout callback


Why?

  • Promise.then() is a microtask, executed right after the synchronous code.
  • setTimeout() is a macrotask, executed after microtasks.

Callbacks: The Classic Asynchronous Tool
What is a Callback?

A callback is a function passed as an argument to another function. It’s executed when the first function completes - often asynchronously.
const fs = require('fs');

fs.readFile('file.txt', 'utf8', (err, data) => {
  if (err) {
    return console.error(err);
  }
  console.log(data);
});

Callback Hell
As you nest callbacks deeper, code can become hard to manage.
doTask1((res1) => {
  doTask2(res1, (res2) => {
    doTask3(res2, (res3) => {
      console.log(res3);
    });
  });
});


This “pyramid of doom” led to the evolution of promises.

Promises: A Modern Alternative

What is a Promise?
A promise is an object representing the eventual completion or failure of an asynchronous operation.
const myPromise = new Promise((resolve, reject) => {
  setTimeout(() => {
    resolve('Success!');
  }, 1000);
});
myPromise
  .then((value) => console.log(value))
  .catch((err) => console.error(err));

Promise States

  • Pending: Initial state.
  • Fulfilled: Operation completed successfully.
  • Rejected: Operation failed.

Promises in Action
function asyncTask() {
  return new Promise((resolve, reject) => {
    setTimeout(() => resolve('Task Done'), 2000);
  });
}
asyncTask().then(console.log); // Outputs "Task Done" after 2 seconds
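
For comparison with the earlier fs.readFile callback example, Node's built-in util.promisify converts a callback-style function into one that returns a promise; a minimal sketch (fs.promises offers ready-made promise versions as well):
const { promisify } = require('util');
const fs = require('fs');

const readFileAsync = promisify(fs.readFile); // callback API -> promise API

readFileAsync('file.txt', 'utf8')
  .then((data) => console.log(data))
  .catch((err) => console.error(err));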


Async/Await: Cleaner Syntax with Promises

Introduced in ES2017, async/await allows you to write asynchronous code like it’s synchronous.
async function fetchData() {
  try {
    const data = await asyncTask();
    console.log(data);
  } catch (error) {
    console.error(error);
  }
}
fetchData();

This is still based on promises, but it's easier to read and write.

Event Loop + Promises + Callbacks: Putting It Together

console.log('Start');

setTimeout(() => {
  console.log('setTimeout');
}, 0);

Promise.resolve().then(() => {
  console.log('Promise');
});

process.nextTick(() => {
  console.log('nextTick');
});
console.log('End');

Output
Start
End
nextTick
Promise
setTimeout

Execution order

  • Synchronous: Start, End
  • process.nextTick() (before promises)
  • Promises
  • Timers like setTimeout

Summary Table

  • Event Loop: runs and manages all tasks (no queue of its own)
  • Callback: a function called after an async operation completes (callback queue)
  • Promise: handles future values, success or failure (microtask queue)
  • setTimeout: delays execution of a task (macrotask queue)
  • process.nextTick: runs after the current phase, before promises (its own microtask-style queue)

Conclusion

Node.js manages concurrency via non-blocking, asynchronous APIs and a clever event loop rather than threads. The event loop, promises, and callbacks are all crucial for creating Node.js applications with great performance. You will soon be able to create async code with confidence if you start small, try out samples, and gradually develop your mental model.



Node.js Hosting Europe - HostForLIFE :: Key Features of Node.js: A Clear Explanation

clock July 1, 2025 07:24 by author Peter

Node.js is an open-source, cross-platform runtime environment that enables server-side JavaScript execution. It is incredibly fast because it is built on Chrome's V8 engine. Its non-blocking, event-driven architecture is one of its strongest features and makes it easier to create high-performance applications. Because the same language, JavaScript, can be used for both front-end and back-end development, many Indian developers prefer it for full-stack work.

Why Is Node.js So Popular? 7 Key Features Explained
Speed, scalability, and performance are essential for creating contemporary web applications in the fast-paced digital world of today. Node.js excels in this situation. Node.js has grown to be one of the most popular platforms since its launch, used by both developers and businesses, ranging from startups to internet behemoths like Netflix, PayPal, and LinkedIn.

1. Single-Threaded but Super Efficient
Node.js operates on a single-threaded event loop, in contrast to conventional server models that employ many threads. This implies that thousands of requests can be handled concurrently without requiring the creation of a new thread for each one. I/O-intensive applications like as file upload systems, chat apps, and streaming services are best suited for this.

2. Asynchronous and Non-Blocking
In Node.js, operations like reading files or querying a database do not block the main thread. Instead, they run in the background and notify when the task is complete. This non-blocking nature allows other tasks to continue running smoothly, perfect for high-performance apps.

3. Event-Driven Architecture
Everything in Node.js revolves around events. Instead of waiting for a process to complete, Node.js listens for events and responds accordingly. This event-driven nature improves performance and helps build scalable applications.
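
Here is a minimal sketch of this model using Node's built-in EventEmitter; the event name and payload are illustrative:
const EventEmitter = require('events');
const orders = new EventEmitter();

// Register a listener; it runs every time the event is emitted
orders.on('order:placed', (order) => {
  console.log(`Sending confirmation for order #${order.id}`);
});

// Somewhere else in the app, emit the event when something happens
orders.emit('order:placed', { id: 101 });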

4. Powered by Google’s V8 Engine
Node.js uses Google’s powerful V8 JavaScript engine, the same engine used in Chrome. It compiles JavaScript directly into machine code, making it extremely fast and efficient.

5. Cross-Platform Support

Whether you are using Windows, macOS, or Linux, Node.js works seamlessly across platforms. This makes development flexible and accessible to a wider range of developers.

6. NPM – Node Package Manager
With Node.js, you get access to NPM, the world’s largest software registry. It contains thousands of free packages and libraries that you can easily install and use in your projects, saving a lot of development time.

7. Built for Real-Time Applications

One of the biggest strengths of Node.js is its real-time capabilities. Whether you’re building a messaging app, online multiplayer game, or live tracking system, Node.js offers the perfect foundation for real-time communication.

Conclusion
Node.js is not just another server-side technology; it is a comprehensive ecosystem that enables developers to create fast, scalable, modern apps. Its single-threaded model, asynchronous nature, and real-time capabilities make it a great fit for today's digital demands. If you're an Indian developer who wants to improve your backend skills, learning Node.js can open up a lot of opportunities.


