Full Trust European Hosting

BLOG about Full Trust Hosting and Its Technology - Dedicated to European Windows Hosting Customer

AngularJS Hosting Europe - HostForLIFE ::Using Module Federation in Angular to Integrate Micro Frontends

November 14, 2025 09:03 by author Peter

Modern enterprise applications keep growing in size and complexity. As teams grow and features evolve independently, a single monolithic frontend becomes difficult to maintain and deploy. Micro Frontends (MFEs) offer a scalable alternative: the large application is divided into smaller, independently deployable pieces, each owned by a different team.

Thanks to Webpack 5 Module Federation, Angular now has a natural and effective way to implement Micro Frontends without complicated runtime integrations. This article explains the benefits of Module Federation in Angular and provides a step-by-step guide to integrating multiple micro frontends within a single shell application.

1. Understanding Micro Frontends
A Micro Frontend Architecture divides a large web application into multiple independent applications — or micro apps — that can:

  • Be developed and deployed separately
  • Use different versions of Angular or other frameworks
  • Communicate seamlessly within a unified user experience


Each micro frontend (MFE) handles a specific domain, such as:

  • orders
  • inventory
  • customers


These are then composed dynamically by a Shell (Host) application.

2. What is Module Federation?

Module Federation (introduced in Webpack 5) allows multiple applications to share code and load remote modules dynamically at runtime.

Key Concepts

  • Host – The main application that loads other micro frontends.
  • Remote – A standalone micro frontend that exposes certain modules/components.
  • Shared Libraries – Dependencies (like Angular, RxJS, or common UI libraries) that can be shared between host and remotes to prevent duplication.


3. Setting Up the Environment
Prerequisites

  • Node.js (>= 18)
  • Angular CLI (>= 17)
  • Webpack 5 (included by default since Angular 12)
  • @angular-architects/module-federation plugin

Step 1: Install the Module Federation Plugin

npm install @angular-architects/module-federation --save-dev

4. Creating the Applications
Step 2: Generate the Shell and Micro Frontends

ng new mfe-shell --routing --style=scss
ng new mfe-orders --routing --style=scss
ng new mfe-inventory --routing --style=scss

Each app runs independently at first.

5. Configuring Module Federation
Step 3: Configure the Host (Shell)

Navigate to the shell app and run:
ng add @angular-architects/module-federation --project mfe-shell --type host

This creates a webpack.config.js with placeholders to load remote apps.

Example configuration:
// mfe-shell/webpack.config.js
const { ModuleFederationPlugin } = require('webpack').container;
const mf = require('@angular-architects/module-federation/webpack');

module.exports = {
  output: {
    uniqueName: "mfeShell",
    publicPath: "auto",
  },
  plugins: [
    new ModuleFederationPlugin({
      remotes: {
        "mfeOrders": "mfeOrders@http://localhost:4201/remoteEntry.js",
        "mfeInventory": "mfeInventory@http://localhost:4202/remoteEntry.js",
      },
      shared: mf.share({
        "@angular/core": { singleton: true, strictVersion: true },
        "@angular/common": { singleton: true, strictVersion: true },
        "@angular/router": { singleton: true, strictVersion: true },
      }),
    }),
  ],
};


Step 4: Configure a Remote Application
For the mfe-orders app:
ng add @angular-architects/module-federation --project mfe-orders --type remote --port 4201

Generated webpack.config.js example:
new ModuleFederationPlugin({
  name: "mfeOrders",
  filename: "remoteEntry.js",
  exposes: {
    "./OrdersModule": "./src/app/orders/orders.module.ts",
  },
  shared: mf.share({
    "@angular/core": { singleton: true, strictVersion: true },
    "@angular/common": { singleton: true, strictVersion: true },
    "@angular/router": { singleton: true, strictVersion: true },
  }),
}),


Do the same for mfe-inventory (on port 4202).

6. Consuming Remote Modules in the Shell

In the shell’s routing configuration (note the loadRemoteModule import from the plugin):
// app-routing.module.ts (shell)
import { loadRemoteModule } from '@angular-architects/module-federation';

const routes: Routes = [
  {
    path: 'orders',
    loadChildren: () =>
      loadRemoteModule({
        type: 'module',
        remoteEntry: 'http://localhost:4201/remoteEntry.js',
        exposedModule: './OrdersModule',
      }).then(m => m.OrdersModule),
  },
  {
    path: 'inventory',
    loadChildren: () =>
      loadRemoteModule({
        type: 'module',
        remoteEntry: 'http://localhost:4202/remoteEntry.js',
        exposedModule: './InventoryModule',
      }).then(m => m.InventoryModule),
  },
];

Now, navigating to /orders or /inventory in the shell dynamically loads the respective remote application.
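The remote URLs and exposed module names can also be kept in a single configuration object instead of being repeated per route. A minimal sketch; the `remoteMap` helper and its names are illustrative, not part of the plugin's API:

```typescript
// Hypothetical shape: map route paths to remote metadata instead of
// hard-coding URLs throughout the routing file.
interface RemoteConfig {
  remoteEntry: string;
  exposedModule: string;
}

const remoteMap: Record<string, RemoteConfig> = {
  orders: {
    remoteEntry: 'http://localhost:4201/remoteEntry.js',
    exposedModule: './OrdersModule',
  },
  inventory: {
    remoteEntry: 'http://localhost:4202/remoteEntry.js',
    exposedModule: './InventoryModule',
  },
};

// Resolve the options for loadRemoteModule from a route path.
function resolveRemote(path: string): RemoteConfig {
  const cfg = remoteMap[path];
  if (!cfg) throw new Error(`No remote registered for path "${path}"`);
  return cfg;
}
```

A route definition would then spread `resolveRemote('orders')` into the `loadRemoteModule` options, keeping all remote endpoints in one place.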

7. Sharing Common Libraries
To reduce bundle size and maintain consistency:

  • Share Angular core libraries (@angular/core, @angular/common)
  • Share UI component libraries (like PrimeNG or Material)
  • Share custom libraries (like a design system or common services)

You can modify webpack.config.js in each project:
shared: mf.share({
  "@angular/core": { singleton: true, strictVersion: true },
  "@angular/common": { singleton: true, strictVersion: true },
  "@angular/router": { singleton: true, strictVersion: true },
  "shared-lib": { singleton: true, import: "shared-lib" },
}),


8. Running and Testing
Start all apps in separate terminals:
ng serve mfe-orders --port 4201
ng serve mfe-inventory --port 4202
ng serve mfe-shell --port 4200


Open:
http://localhost:4200/orders
http://localhost:4200/inventory

You’ll notice the shell dynamically loads content from the micro frontends.

9. Deployment and Versioning

Each MFE can:

  • Have its own repository
  • Be deployed independently (e.g., on Azure Blob, AWS S3, or CDN)
  • Expose its remoteEntry.js file via public URL

The shell always loads the latest version at runtime — no redeployment required for the entire app.

10. Advantages of Using Module Federation in Angular

  • True decoupling: Teams can work independently without merge conflicts.
  • Independent deployments: Each feature or domain can be deployed separately.
  • Version flexibility: Micro frontends can use different Angular versions.
  • Performance optimization: Shared dependencies reduce duplication.
  • Scalable architecture: Perfect for enterprise-level applications.


11. Real-World Use Cases

  • ERP systems with independent modules like Finance, Inventory, and Sales.
  • Multi-tenant platforms where different clients require customized frontends.
  • Product suites where each product has its own lifecycle but unified branding.

12. Conclusion
Integrating Micro Frontends using Module Federation in Angular provides a robust foundation for building scalable, modular, and future-proof enterprise applications.

By combining independent deployability with shared runtime integration, Module Federation eliminates the pain of monolithic frontends and empowers teams to innovate faster without sacrificing consistency or maintainability.



AngularJS Hosting Europe - HostForLIFE :: Using Angular Apps with Role-Based Access Control (RBAC)

November 12, 2025 07:40 by author Peter

A popular authorization paradigm for applications is called Role-Based Access Control (RBAC), in which users are given access to features or data according to their roles (Admin, Manager, User, etc.). RBAC should be applied neatly and consistently in Angular projects so that:

  • UI elements are hidden or disabled when users lack permission.
  • Routes are guarded so unauthorized users cannot navigate to them.
  • Authorization decisions are safe and efficient.
  • The system remains maintainable as roles and permissions grow.

This article provides a practical, step-by-step approach to building RBAC in an Angular application using JWT tokens with role claims, lazy loading, secure server-side checks, and modern Angular patterns (services, guards, and directives). The TypeScript samples assume Angular 14+, but the patterns also work with Angular 17. The focus is a pure frontend implementation combined with secure backend checks (important reminder: never rely entirely on frontend checks).

2. Core concepts: roles, permissions, and claims
Before coding, clarify your model.

  • Role: a label for a set of capabilities (e.g., Admin, Editor).
  • Permission: a fine-grained capability or action (e.g., order.create, order.view, user.manage).
  • Claims: information encoded in JWT or user profile (e.g., roles: ["Admin", "Manager"] or permissions: ["order.create"]) that the client can use to authorize UI and route behaviour.

Two approaches:

  • Role-based only: simpler, map roles directly to UI/route checks.
  • Role + Permission (recommended for large apps): map roles → permissions on the server and use permission checks in the client for fine-grained control.

Prefer keeping authoritative role→permission mapping on the server; send roles or permissions as claims in JWT (small list) or fetch user permissions from a secure API at login.
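The role → permission expansion described above can be sketched in plain TypeScript. The role names and permission strings here are illustrative; in practice this mapping should live on the server:

```typescript
// Illustrative role -> permission mapping (authoritative copy belongs server-side).
const rolePermissions: Record<string, string[]> = {
  Admin: ['order.create', 'order.view', 'user.manage'],
  Manager: ['order.create', 'order.view'],
  User: ['order.view'],
};

// Expand a user's roles into a deduplicated permission set.
function permissionsFor(roles: string[]): Set<string> {
  const perms = new Set<string>();
  for (const role of roles) {
    for (const p of rolePermissions[role] ?? []) perms.add(p);
  }
  return perms;
}

// Fine-grained check used by UI and guards.
function can(roles: string[], permission: string): boolean {
  return permissionsFor(roles).has(permission);
}
```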

3. Authentication vs Authorization

  • Authentication = who are you (login).
  • Authorization = what can you do (RBAC).

Angular handles the frontend part (storing token, exposing roles to components). But every protected API must validate JWT and roles server-side — the frontend is only UX & convenience.

4. Implementation overview
We’ll implement:

  • JWT auth service that extracts roles/permissions.
  • Route guards (CanActivate, CanLoad) for route protection and lazy modules.
  • Structural directive (*hasRole / *hasPermission) to show/hide UI.
  • Interceptor to attach token and optionally refresh it.
  • Example route config and lazy module protection.
  • Notes on server claims shape and security.

5. JWT payload and server contract
Agree on a standard JWT payload. Minimal example:
{
  "sub": "12345",
  "name": "Peter",
  "email": "[email protected]",
  "roles": ["Admin", "Manager"],
  "permissions": ["orders.view", "orders.create"],
  "iat": 1600000000,
  "exp": 1600003600
}

Your backend should:

  • Sign JWTs securely (RS256 recommended).
  • Keep the token payload small.
  • Revoke tokens via short expiry + refresh tokens or server revocation list for highly sensitive apps.
  • Map roles → permissions server-side so RBAC policy remains authoritative.
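On the client, the payload of such a token can be decoded (never verified) to read its claims. A minimal Node-flavoured sketch using `Buffer`; a browser would use `atob` instead, as the AuthService shown later does:

```typescript
// Decode (NOT verify) a JWT payload to read its claims on the client.
// Signature verification must always happen server-side.
interface JwtClaims {
  sub: string;
  roles?: string[];
  permissions?: string[];
  exp?: number;
}

function decodePayload(token: string): JwtClaims {
  const payload = token.split('.')[1];
  // base64url -> base64, then decode the JSON payload.
  const json = Buffer.from(
    payload.replace(/-/g, '+').replace(/_/g, '/'),
    'base64'
  ).toString('utf8');
  return JSON.parse(json) as JwtClaims;
}
```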

6. AuthService: parse token & expose observables
AuthService manages the token, provides current user roles and permission helpers, and exposes an observable so the rest of the app can react to auth changes.
// auth.service.ts
import { Injectable } from '@angular/core';
import { BehaviorSubject, Observable } from 'rxjs';

export interface UserInfo {
  sub: string;
  name?: string;
  email?: string;
  roles: string[];
  permissions?: string[];
  exp?: number;
}

@Injectable({ providedIn: 'root' })
export class AuthService {
  private tokenKey = 'app_token';
  private userSubject = new BehaviorSubject<UserInfo | null>(null);
  public user$ = this.userSubject.asObservable();

  constructor() {
    const token = this.getToken();
    if (token) {
      const info = this.parseToken(token);
      if (info) this.userSubject.next(info);
    }
  }

  setToken(token: string) {
    localStorage.setItem(this.tokenKey, token);
    const info = this.parseToken(token);
    this.userSubject.next(info);
  }

  getToken(): string | null {
    return localStorage.getItem(this.tokenKey);
  }

  clear() {
    localStorage.removeItem(this.tokenKey);
    this.userSubject.next(null);
  }

  isAuthenticated(): boolean {
    const info = this.userSubject.value;
    return !!info && !(info.exp && info.exp * 1000 < Date.now());
  }

  hasRole(role: string): boolean {
    const info = this.userSubject.value;
    return !!info && info.roles?.includes(role);
  }

  hasAnyRole(roles: string[]): boolean {
    const info = this.userSubject.value;
    if (!info) return false;
    return roles.some(r => info.roles?.includes(r));
  }

  hasPermission(perm: string): boolean {
    const info = this.userSubject.value;
    return !!info && !!info.permissions?.includes(perm);
  }

  private parseToken(token: string): UserInfo | null {
    try {
      const payload = token.split('.')[1];
      const decoded = JSON.parse(atob(payload.replace(/-/g, '+').replace(/_/g, '/')));
      return {
        sub: decoded.sub,
        name: decoded.name,
        email: decoded.email,
        roles: decoded.roles || [],
        permissions: decoded.permissions || [],
        exp: decoded.exp
      };
    } catch {
      return null;
    }
  }
}

Notes

  • Use atob for Base64 decode (works in browsers). In Node or SSR consider safe decoding.
  • For security, prefer HttpOnly cookies for tokens in some contexts—localStorage is simpler but vulnerable to XSS.
  • Expose user$ for templates and components.

7. HTTP interceptor to attach token & handle 401/refresh
Attach token to outgoing requests and centrally handle 401 responses (refresh flow).
// auth.interceptor.ts
import { Injectable } from '@angular/core';
import {
  HttpEvent, HttpHandler, HttpInterceptor, HttpRequest, HttpErrorResponse
} from '@angular/common/http';
import { Observable, throwError, from } from 'rxjs';
import { catchError, switchMap } from 'rxjs/operators';
import { AuthService } from './auth.service';
import { TokenRefreshService } from './token-refresh.service'; // optional

@Injectable()
export class AuthInterceptor implements HttpInterceptor {
  constructor(private auth: AuthService, private refresh: TokenRefreshService) {}

  intercept(req: HttpRequest<any>, next: HttpHandler): Observable<HttpEvent<any>> {
    const token = this.auth.getToken();
    let authReq = req;
    if (token) {
      authReq = req.clone({
        setHeaders: { Authorization: `Bearer ${token}` }
      });
    }
    return next.handle(authReq).pipe(
      catchError((err: HttpErrorResponse) => {
        if (err.status === 401 && token) {
          // Attempt refresh logic
          return from(this.refresh.tryRefresh()).pipe(
            switchMap(newToken => {
              if (newToken) {
                this.auth.setToken(newToken);
                const retryReq = req.clone({
                  setHeaders: { Authorization: `Bearer ${newToken}` }
                });
                return next.handle(retryReq);
              }
              this.auth.clear();
              return throwError(() => err);
            })
          );
        }
        return throwError(() => err);
      })
    );
  }
}


Register interceptor in app.module.ts providers.

Note: token refresh flows can be complex — implement queuing to avoid parallel refresh attempts.

8. Route guards for authorization
Protect routes with CanActivate and CanLoad guards. CanLoad prevents lazy module download.
// roles.guard.ts
import { Injectable } from '@angular/core';
import { CanActivate, ActivatedRouteSnapshot, RouterStateSnapshot, Router, CanLoad, Route } from '@angular/router';
import { AuthService } from './auth.service';

@Injectable({ providedIn: 'root' })
export class RolesGuard implements CanActivate, CanLoad {
  constructor(private auth: AuthService, private router: Router) {}

  canActivate(route: ActivatedRouteSnapshot): boolean {
    const roles = route.data['roles'] as string[] | undefined;
    if (!roles || roles.length === 0) return true;
    if (this.auth.hasAnyRole(roles)) return true;

    this.router.navigate(['/forbidden']);
    return false;
  }

  canLoad(route: Route): boolean {
    const roles = route.data && route.data['roles'] as string[] | undefined;
    if (!roles || roles.length === 0) return true;
    if (this.auth.hasAnyRole(roles)) return true;
    return false;
  }
}


Route config example
// app-routing.module.ts
const routes: Routes = [
  {
    path: 'admin',
    loadChildren: () => import('./admin/admin.module').then(m => m.AdminModule),
    canLoad: [RolesGuard],
    data: { roles: ['Admin'] }
  },
  {
    path: 'orders',
    component: OrdersComponent,
    canActivate: [RolesGuard],
    data: { roles: ['Admin', 'Manager'] }
  }
];


Remember: CanLoad blocks async module loading; CanActivate protects navigation.

9. Structural directives for UI control

Create a directive to conditionally render parts of the UI based on roles or permissions:
// has-role.directive.ts
import { Directive, Input, OnDestroy, TemplateRef, ViewContainerRef } from '@angular/core';
import { AuthService } from './auth.service';
import { Subscription } from 'rxjs';

@Directive({
  selector: '[hasRole]'
})
export class HasRoleDirective implements OnDestroy {
  private roles: string[] = [];
  private sub: Subscription;

  constructor(
    private tpl: TemplateRef<any>,
    private vc: ViewContainerRef,
    private auth: AuthService
  ) {
    this.sub = this.auth.user$.subscribe(() => this.updateView());
  }

  @Input() set hasRole(value: string | string[]) {
    this.roles = Array.isArray(value) ? value : [value];
    this.updateView();
  }

  private updateView() {
    this.vc.clear();
    if (!this.roles || this.roles.length === 0) {
      return;
    }
    if (this.auth.hasAnyRole(this.roles)) {
      this.vc.createEmbeddedView(this.tpl);
    }
  }

  ngOnDestroy() {
    this.sub.unsubscribe();
  }
}

Usage in template
<button *hasRole="'Admin'">Delete User</button>
<div *hasRole="['Manager','Admin']">Manager Dashboard</div>


Build a similar hasPermission directive if you use permission claims.

10. Dynamic menu & navigation
Generate menus based on roles to improve UX and avoid showing dead links.

Example menu service
export interface MenuItem { label: string; route?: string; roles?: string[]; children?: MenuItem[]; }

@Injectable({ providedIn: 'root' })
export class MenuService {
  constructor(private auth: AuthService) {}

  getMenu(): MenuItem[] {
    const baseMenu: MenuItem[] = [
      { label: 'Home', route: '/' },
      { label: 'Orders', route: '/orders', roles: ['Manager','Admin'] },
      { label: 'Admin', route: '/admin', roles: ['Admin'] }
    ];
    return baseMenu.filter(item => !item.roles || this.auth.hasAnyRole(item.roles));
  }
}

11. Lazy loading & module-level guards
Always use CanLoad for lazy modules to prevent module download for unauthorized users.

Also, within lazy modules, consider protecting child routes with CanActivateChild.
// in admin-routing.module.ts
const routes: Routes = [
  {
    path: '',
    component: AdminHomeComponent,
    canActivateChild: [RolesGuard],
    children: [
      { path: 'users', component: UserListComponent, data: { roles: ['Admin'] } },
    ]
  }
];


12. Token expiry, refresh & session management

  • Use short-lived access tokens and refresh tokens for security.
  • Implement refresh flow in TokenRefreshService.
  • On refresh failure, redirect to login.

Guidelines

  • Refresh tokens should be HttpOnly cookies where possible.
  • Keep token expiry checks in AuthService.isAuthenticated() using exp claim.
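The `exp`-based validity check can be isolated in a small pure function. The clock-skew allowance below is an illustrative choice, not a standard value:

```typescript
// Treat a token as valid only while `exp` (seconds since epoch) is in the
// future, with a small clock-skew allowance for client/server drift.
const CLOCK_SKEW_MS = 30_000; // illustrative allowance

function isTokenActive(exp: number | undefined, now: number = Date.now()): boolean {
  if (exp === undefined) return false; // no expiry claim: treat as invalid
  return exp * 1000 + CLOCK_SKEW_MS > now;
}
```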

13. Secure coding reminders

  • Never trust frontend for authorization — always validate tokens & roles server-side for APIs.
  • Protect sensitive operations server-side even if you hide UI elements in the client.
  • Escape & sanitize inputs to avoid XSS which can reveal tokens in localStorage.
  • Use Content Security Policy (CSP) and secure headers.
  • Consider storing refresh tokens in HttpOnly secure cookies to reduce XSS risks.

14. Testing RBAC behaviour
Add unit and e2e tests for:

  • AuthService parsing of tokens and role logic.
  • RolesGuard responses for allowed/forbidden routes.
  • Directives rendering behavior under different role sets.
  • Integration tests: fake login with token claims + route navigation.

15. Example Jasmine unit test for directive
it('should render element only for Admin', () => {
  authService.setToken(mockAdminToken);
  fixture.detectChanges();
  expect(fixture.nativeElement.querySelector('button')).toBeTruthy();

  authService.setToken(mockUserToken);
  fixture.detectChanges();
  expect(fixture.nativeElement.querySelector('button')).toBeNull();
});


16. Advanced topics
Attribute-based RBAC & policy engines

For complex rules (time-based access, multi-claim rules), consider using a policy engine (e.g., OPA) and fetch a decision from backend for critical flows.
Claims mapping & role changes

If roles change often, prefer fetching current permissions from an API at login rather than relying only on the JWT. Combine JWT claims for offline checks with a permissions API for dynamic checks.

17. Caching permissions
Cache permissions for short TTL to reduce round trips. Invalidate on logout or role update.
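A minimal sketch of such a cache; the fetcher callback and TTL value are assumptions, not a specific library API:

```typescript
// TTL cache for permission lookups: serves a cached list while fresh,
// refetches after the TTL, and can be invalidated on logout/role change.
class PermissionCache {
  private perms: string[] | null = null;
  private fetchedAt = 0;

  constructor(
    private fetcher: () => Promise<string[]>, // e.g. a call to a permissions API
    private ttlMs: number = 60_000,
  ) {}

  async get(now: number = Date.now()): Promise<string[]> {
    if (this.perms && now - this.fetchedAt < this.ttlMs) return this.perms;
    this.perms = await this.fetcher();
    this.fetchedAt = now;
    return this.perms;
  }

  // Call on logout or role update so the next get() refetches.
  invalidate() {
    this.perms = null;
  }
}
```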

Audit & traceability
Record which role performed critical operations. Include role + user id in logs and, where required, use server-side audit tables.

18. Example: Putting it all together

  • User logs in → backend returns JWT with roles claim and sets refresh cookie.
  • Angular stores token (or reads from cookie) → AuthService parses token and broadcasts user$.
  • Router triggers CanLoad / CanActivate for requested routes; guard checks required roles in route data.
  • Components use *hasRole or *hasPermission directives to show/hide buttons.
  • HTTP interceptor attaches token to API calls; backend validates role claims and allows/denies operations.
  • On sensitive server actions, backend enforces permission checks and records an audit entry.

19. Common pitfalls & how to avoid them

  • Relying only on client checks — avoid this; always secure APIs.
  • Huge JWT payloads — keep tokens small; use role ids instead of huge lists.
  • No refresh strategy — short tokens without refresh cause UX problems; implement a safe refresh flow.
  • Inconsistent role naming — define role constants centrally and share via API docs or a shared library.
  • Not handling lazy modules — failing to use CanLoad exposes lazy code to unauthorized downloads.

20. Performance & scalability tips

  • Keep auth logic light on the client; do heavy policy evaluation on the server.
  • Cache permission lookups server-side using Redis or in-memory caches.
  • For multi-tenant apps, include tenant claim and validate tenant context in guards and backend.
  • Avoid frequent calls to permission API — use local cache with TTL.

21. Summary & best practice checklist

  • Use JWT claims (roles / permissions) to drive client UI and route guards.
  • Always enforce role/permission checks server-side for APIs.
  • Protect lazy modules with CanLoad.
  • Provide structural directives for clean templates (*hasRole, *hasPermission).
  • Implement token refresh, short access token TTL, and secure refresh storage.
  • Centralize role constants and align backend/frontend contracts.
  • Test behavior via unit and e2e tests and maintain audit logs for critical operations.


AngularJS Hosting Europe - HostForLIFE :: Understanding Angular Testing: Component-Level, Unit, Integration, and E2E Techniques

November 6, 2025 06:05 by author Peter

Testing is one of the most crucial, yet sometimes overlooked, aspects of front-end development. It guarantees code quality, prevents bugs, and builds trust in each release of large enterprise Angular apps. With examples, best practices, and practical insights, we will examine all forms of testing in Angular: Unit, Integration, End-to-End (E2E), and Component-level testing.

1. Why Testing Matters in Angular
Modern Angular applications are built with multiple services, components, and modules.
Without automated tests, even a small change can unintentionally break features elsewhere.

Key reasons to write tests:

  • Detect bugs early before production
  • Improve refactoring confidence
  • Ensure consistent functionality
  • Reduce manual QA time

Angular provides an excellent testing ecosystem out of the box with Jasmine and Karma, plus Cypress or Playwright for modern E2E testing (Protractor is deprecated).

2. Types of Testing in Angular

Test Type | Purpose | Tools Commonly Used | Example Scope
Unit Test | Test smallest code units (functions, services, components) | Jasmine + Karma | Single function or service
Integration Test | Test how modules/components work together | Jasmine + TestBed | Component with service
E2E Test | Test the entire application flow | Cypress / Playwright / Protractor | From login to checkout
Component Test | Focus only on UI component behavior and rendering | TestBed / Jest / Cypress Component Testing | Angular Material Table, Forms, Buttons

3. Unit Testing in Angular

Unit tests check if individual functions or components behave as expected.
Angular uses Jasmine for writing tests and Karma as the test runner.

Example: Testing a Service
math.service.ts

export class MathService {
  add(a: number, b: number): number {
    return a + b;
  }
}


math.service.spec.ts
import { MathService } from './math.service';

describe('MathService', () => {
  let service: MathService;

  beforeEach(() => {
    service = new MathService();
  });

  it('should add two numbers correctly', () => {
    expect(service.add(2, 3)).toBe(5);
  });
});


Best Practices

  • Test each method independently
  • Avoid API or database calls
  • Use mocks or spies for dependencies
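The "use mocks for dependencies" advice can be illustrated without any test framework: a hand-written stub replaces the real dependency. `OrderService` and `PriceApi` below are hypothetical names used only for this sketch:

```typescript
// A service that depends on an external API behind an interface.
interface PriceApi {
  getPrice(sku: string): number;
}

class OrderService {
  constructor(private api: PriceApi) {}
  total(sku: string, qty: number): number {
    return this.api.getPrice(sku) * qty;
  }
}

// In a spec, the real HTTP-backed PriceApi is swapped for a stub with a
// fixed return value, so the test exercises only OrderService's logic.
const stubApi: PriceApi = { getPrice: () => 10 };
const service = new OrderService(stubApi);
```

In Jasmine the same idea is usually expressed with `jasmine.createSpyObj`, which also records how the dependency was called.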

4. Component-Level Testing
Component testing focuses on how the UI behaves — inputs, outputs, events, and template rendering.
Angular’s TestBed provides a test environment for creating and interacting with components.
Example: Testing a Simple Component

hello.component.ts
import { Component, Input } from '@angular/core';

@Component({
  selector: 'app-hello',
  template: `<h3>Hello {{ name }}!</h3>`
})
export class HelloComponent {
  @Input() name = '';
}

hello.component.spec.ts
import { ComponentFixture, TestBed } from '@angular/core/testing';
import { HelloComponent } from './hello.component';

describe('HelloComponent', () => {
  let component: HelloComponent;
  let fixture: ComponentFixture<HelloComponent>;

  beforeEach(() => {
    TestBed.configureTestingModule({
      declarations: [HelloComponent]
    });
    fixture = TestBed.createComponent(HelloComponent);
    component = fixture.componentInstance;
  });

  it('should display the input name', () => {
    component.name = 'Rajesh';
    fixture.detectChanges();
    const compiled = fixture.nativeElement;
    expect(compiled.querySelector('h3').textContent).toContain('Rajesh');
  });
});


Best Practices

  • Use fixture.detectChanges() to apply bindings
  • Query DOM using fixture.nativeElement
  • Keep test scenarios small and focused

5. Integration Testing
Integration tests ensure multiple parts of your app work together correctly — for example, a component using a service or API.
Example: Component + Service Integration

user.service.ts
@Injectable({ providedIn: 'root' })
export class UserService {
  getUser() {
    return of({ name: 'Rajesh', role: 'Admin' });
  }
}

profile.component.ts
@Component({
  selector: 'app-profile',
  template: `<p>{{ user?.name }} - {{ user?.role }}</p>`
})
export class ProfileComponent implements OnInit {
  user: any;
  constructor(private userService: UserService) {}
  ngOnInit() {
    this.userService.getUser().subscribe(u => this.user = u);
  }
}

profile.component.spec.ts
import { ComponentFixture, TestBed } from '@angular/core/testing';
import { ProfileComponent } from './profile.component';
import { UserService } from './user.service';
import { of } from 'rxjs';

describe('ProfileComponent (Integration)', () => {
  let fixture: ComponentFixture<ProfileComponent>;
  let mockService = { getUser: () => of({ name: 'Rajesh', role: 'Admin' }) };

  beforeEach(() => {
    TestBed.configureTestingModule({
      declarations: [ProfileComponent],
      providers: [{ provide: UserService, useValue: mockService }]
    });
    fixture = TestBed.createComponent(ProfileComponent);
  });

  it('should display user details from service', () => {
    fixture.detectChanges();
    const element = fixture.nativeElement;
    expect(element.textContent).toContain('Rajesh');
  });
});


Best Practices

  • Mock dependencies (like services or APIs)
  • Focus on communication between layers
  • Keep external systems out of the test

6. End-to-End (E2E) Testing
E2E tests simulate real user actions on your application — from login to logout — ensuring that the whole flow works.

Modern Angular apps now prefer Cypress or Playwright over Protractor.

Example: Cypress Test
cypress/e2e/login.cy.ts

describe('Login Page', () => {
  it('should login successfully with valid credentials', () => {
    cy.visit('/login');
    cy.get('input[name="email"]').type('[email protected]');
    cy.get('input[name="password"]').type('12345');
    cy.get('button[type="submit"]').click();
    cy.url().should('include', '/dashboard');
  });
});


Best Practices

  • Use realistic test data
  • Separate test environment from production
  • Avoid flakiness by waiting for API calls using cy.intercept()

7. Choosing the Right Testing Strategy

Project Type | Recommended Focus
Small App / POC | Unit & Component Tests
Enterprise App | Unit + Integration + E2E
UI Heavy App | Component + E2E
API Driven App | Integration + E2E

In most projects:

  • 70% of tests are Unit
  • 20% Integration
  • 10% E2E

This gives a good balance between speed and confidence.

8. Testing Tools Summary

Tool | Purpose | Description
Jasmine | Unit / Integration | Test framework for assertions
Karma | Test runner | Executes tests in the browser
TestBed | Angular testing utility | Sets up components for testing
Cypress / Playwright | E2E testing | Real browser automation
Jest | Fast unit testing | Alternative to Jasmine for faster runs

9. Continuous Testing in CI/CD

Integrate your Angular tests into your CI/CD pipeline (Jenkins, GitHub Actions, or Azure DevOps).

Example command in package.json:
"scripts": {
  "test": "ng test --watch=false --browsers=ChromeHeadless",
  "e2e": "cypress run"
}

This ensures that every commit runs all tests automatically.

10. Conclusion
Testing in Angular is not just a best practice; it's a developer's safety net. Whether you're building small reusable components or large enterprise apps, combining Unit, Integration, Component, and E2E tests ensures:

  • Better reliability
  • Fewer regressions
  • Confident deployments

Start small, automate everything, and make testing a habit, not an afterthought.



Node.js Hosting - HostForLIFE :: Developing Your Own Developer Tools: Mastering Node.js CLI Tools

November 4, 2025 07:04 by author Peter

Command-line interfaces (CLIs) are powerful tools that boost productivity, streamline repetitive processes, and automate workflows. With Node.js you can create CLI tools that work like any system command, such as git, npm, or npx. This guide covers building a professional Node.js CLI from scratch, including argument parsing, error handling, setup, and packaging for npm distribution.

1. How Node.js Powers CLI Tools
Node.js runs JavaScript outside the browser on the V8 engine, which makes it well suited to backend and system-level automation. A CLI tool is simply a Node.js script that can be run directly from a terminal. Installing a Node.js CLI globally adds an executable command to your system PATH, making it accessible from anywhere.

The main elements of any Node.js CLI are:

  • Command runner: the main script that receives and interprets user input.
  • Argument parser: a utility that extracts options and flags from command-line arguments.
  • Core logic: the actual functionality your tool performs.
  • Output handling: feedback printed to the terminal using console.log or rich text libraries.
  • Distribution setup: package metadata that allows global installation.
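The "argument parser" element above can be sketched by hand in a few lines; real CLIs should prefer commander or yargs, as used later in this guide:

```typescript
// Hand-rolled parser sketch: splits argv into positional arguments and
// --name flags (a flag consumes the next token as its value when present).
interface ParsedArgs {
  positional: string[];
  flags: Record<string, string | boolean>;
}

function parseArgs(argv: string[]): ParsedArgs {
  const positional: string[] = [];
  const flags: Record<string, string | boolean> = {};
  for (let i = 0; i < argv.length; i++) {
    const arg = argv[i];
    if (arg.startsWith('--')) {
      const name = arg.slice(2);
      const next = argv[i + 1];
      // Consume a value if the next token is not another flag.
      if (next !== undefined && !next.startsWith('--')) {
        flags[name] = next;
        i++;
      } else {
        flags[name] = true; // boolean flag, e.g. --verbose
      }
    } else {
      positional.push(arg);
    }
  }
  return { positional, flags };
}
```

For `create-project myApp --template react`, this yields `myApp` as a positional argument and `{ template: 'react' }` as flags.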

2. Initial Setup
Let’s start by setting up a new Node.js project.
mkdir create-project-cli
cd create-project-cli
npm init -y

This generates a package.json file. Now, create a folder for your code:
mkdir bin
touch bin/index.js

In the package.json, add this field to define your CLI entry point:
"bin": {
  "create-project": "./bin/index.js"
}


This means when the user installs your package globally, typing create-project will execute the ./bin/index.js file.

3. Adding the Shebang Line

Every Node.js CLI starts with a shebang line so that the system knows which interpreter to use.
Open bin/index.js and add this as the first line:
#!/usr/bin/env node

This tells Unix-like systems to run the script with Node.js, enabling direct execution through the terminal.

Make the file executable:
chmod +x bin/index.js

4. Parsing Command-Line Arguments
Users interact with CLI tools through arguments and flags. For instance:
create-project myApp --template react

Here, myApp is a positional argument, and --template is a named flag.
Instead of manually parsing arguments, use a library like commander or yargs. These handle parsing, validation, and help menus efficiently.
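To see what these libraries save you from, here is a hand-rolled parser for the same command shape. It is illustrative only: it skips the validation, help text, and edge cases that commander handles for you.

```javascript
// Naive parser for: create-project <project-name> [--template <t> | -t <t>]
function parseArgs(argv) {
  const options = { template: 'node' }; // built-in default template
  const positional = [];
  for (let i = 0; i < argv.length; i++) {
    const arg = argv[i];
    if (arg === '--template' || arg === '-t') {
      options.template = argv[++i]; // consume the flag's value
    } else {
      positional.push(arg);
    }
  }
  return { projectName: positional[0], options };
}

console.log(parseArgs(['myApp', '--template', 'react']));
// { projectName: 'myApp', options: { template: 'react' } }
```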

Install Commander
npm install commander

Update bin/index.js
#!/usr/bin/env node
const { Command } = require('commander');
const fs = require('fs');
const path = require('path');

const program = new Command();

program
  .name('create-project')
  .description('CLI to scaffold a new Node.js project structure')
  .version('1.0.0');

program
  .argument('<project-name>', 'Name of your new project')
  .option('-t, --template <template>', 'Specify a template type (node, react, express)', 'node')
  .action((projectName, options) => {
    const destPath = path.join(process.cwd(), projectName);

    if (fs.existsSync(destPath)) {
      console.error(`Error: Directory "${projectName}" already exists.`);
      process.exit(1);
    }

    fs.mkdirSync(destPath);
    fs.writeFileSync(path.join(destPath, 'README.md'), `# ${projectName}\n`);
    fs.writeFileSync(path.join(destPath, 'index.js'), `console.log('Hello from ${projectName}');`);

    if (options.template === 'react') {
      fs.mkdirSync(path.join(destPath, 'src'));
      fs.writeFileSync(path.join(destPath, 'src', 'App.js'), `function App(){ return <h1>Hello React</h1> }\nexport default App;`);
    }

    console.log(`Project "${projectName}" created successfully with ${options.template} template.`);
  });

program.parse(process.argv);

This script:

  • Defines a CLI named create-project
  • Takes one positional argument (project-name)
  • Accepts an optional flag (--template)
  • Scaffolds a directory structure based on the chosen template


Try running it locally before publishing:
node bin/index.js myApp --template react

5. Adding Colors and Feedback
CLI tools should provide clear feedback. You can add colored output using chalk or kleur.

Install chalk
npm install chalk@4

Note: chalk v5 and later are ESM-only modules; install chalk@4 if your CLI uses CommonJS require(), as this guide does.

Update your log statements:
const chalk = require('chalk');

console.log(chalk.green(`Project "${projectName}" created successfully!`));
console.log(chalk.blue(`Template: ${options.template}`));
console.log(chalk.yellow(`Location: ${destPath}`));

Now, when users run the CLI, the terminal output will be visually distinct and easier to read.

6. Improving Error Handling
Good CLI tools fail gracefully. You should handle invalid arguments, permissions, and unexpected errors.

Wrap the main logic inside a try...catch block:
.action((projectName, options) => {
  try {
    const destPath = path.join(process.cwd(), projectName);
    if (fs.existsSync(destPath)) {
      console.error(chalk.red(`Error: Directory "${projectName}" already exists.`));
      process.exit(1);
    }

    fs.mkdirSync(destPath);
    fs.writeFileSync(path.join(destPath, 'README.md'), `# ${projectName}\n`);
    console.log(chalk.green(`Created project: ${projectName}`));
  } catch (err) {
    console.error(chalk.red(`Failed to create project: ${err.message}`));
    process.exit(1);
  }
});


This ensures your tool communicates errors clearly without exposing stack traces or internal logic.

7. Adding Configuration Files

Professional CLI tools often allow users to define configuration defaults, such as preferred templates or directory structure.
You can use a JSON config file inside the user’s home directory.

Add this snippet
const os = require('os');
const configPath = path.join(os.homedir(), '.createprojectrc.json');

function loadConfig() {
  if (fs.existsSync(configPath)) {
    const raw = fs.readFileSync(configPath);
    return JSON.parse(raw);
  }
  return {};
}

function saveConfig(config) {
  fs.writeFileSync(configPath, JSON.stringify(config, null, 2));
}

You can extend the CLI to allow setting a default template:
program
  .command('config')
  .description('Configure default settings for create-project')
  .option('--template <template>', 'Set default project template')
  .action((options) => {
    const currentConfig = loadConfig();
    const newConfig = { ...currentConfig, ...options };
    saveConfig(newConfig);
    console.log(chalk.green('Configuration updated successfully.'));
  });


Now users can run:
create-project config --template react

This stores their preference for future runs.
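How a stored default interacts with a flag passed on the command line is a design decision; a common convention is that an explicit flag wins over the config file, which wins over the built-in default. A sketch of that resolution (the function name is illustrative):

```javascript
// Resolve the effective template: CLI flag > saved config > built-in default.
function resolveTemplate(cliOptions, savedConfig) {
  return cliOptions.template || savedConfig.template || 'node';
}

console.log(resolveTemplate({}, { template: 'react' }));                      // 'react'
console.log(resolveTemplate({ template: 'express' }, { template: 'react' })); // 'express'
console.log(resolveTemplate({}, {}));                                         // 'node'
```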

8. Making the CLI Executable Globally

You can test your CLI globally before publishing:

npm link

This command creates a symlink, allowing you to use your command globally on your system.

Try it
create-project myNodeApp

If it works, you are ready for packaging.

9. Publishing to npm

To make your CLI available to others, publish it on npm.
Ensure your package name is unique by searching it on npmjs.com.
Update package.json fields:
    {
      "name": "create-project-cli",
      "version": "1.0.0",
      "description": "A Node.js CLI tool to scaffold project structures",
      "main": "bin/index.js",
      "bin": {
        "create-project": "./bin/index.js"
      },
      "keywords": ["nodejs", "cli", "scaffold", "project"],
      "author": "Your Name",
      "license": "MIT"
    }

Log in to npm:
npm login

Publish:
npm publish

After publishing, anyone can install it globally:
npm install -g create-project-cli

Now they can scaffold a new project directly from the terminal.

10. Versioning and Continuous Updates
Use semantic versioning (major.minor.patch) for releases:
  • Increment patch for small fixes: 1.0.1
  • Increment minor for new features: 1.1.0
  • Increment major for breaking changes: 2.0.0
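The bump rules above can be expressed as a small function (a sketch; `npm version` implements this for you, and real tooling also handles pre-release tags):

```javascript
// Compute the next semantic version for a given change type.
function nextVersion(current, changeType) {
  let [major, minor, patch] = current.split('.').map(Number);
  if (changeType === 'breaking') { major += 1; minor = 0; patch = 0; }
  else if (changeType === 'feature') { minor += 1; patch = 0; }
  else { patch += 1; } // default: bug fix
  return `${major}.${minor}.${patch}`;
}

console.log(nextVersion('1.0.0', 'fix'));      // '1.0.1'
console.log(nextVersion('1.0.1', 'feature'));  // '1.1.0'
console.log(nextVersion('1.1.0', 'breaking')); // '2.0.0'
```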

Before publishing a new version:
npm version minor
npm publish


You can automate releases with tools like semantic-release or GitHub Actions for CI/CD.

11. Testing Your CLI
Automated testing ensures your CLI behaves consistently.
Use libraries like Jest or Mocha to run unit tests for functions that generate folders or parse arguments.

Example test using Jest
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');

test('creates project folder', () => {
  execSync('node bin/index.js testApp', { stdio: 'inherit' });
  expect(fs.existsSync(path.join(process.cwd(), 'testApp'))).toBe(true);
  fs.rmSync(path.join(process.cwd(), 'testApp'), { recursive: true, force: true });
});

Run your tests
npm test

12. Final Thoughts
You now have a complete, professional-grade Node.js CLI utility that can:

  • Parse arguments and options
  • Handle errors gracefully
  • Store and read configuration
  • Output colored logs
  • Be installed globally or published on npm

With this foundation, you can expand your tool to:

  • Generate starter templates for different frameworks
  • Fetch dependencies automatically
  • Integrate API calls for remote project setup

Building custom CLI tools in Node.js gives you deep control over automation and workflow design. Whether you create utilities for internal teams or publish them for the open-source community, mastering CLI development is a valuable skill that showcases your command over Node.js beyond backend APIs.



AngularJS Hosting Europe - HostForLIFE :: Various Methods for Retrieving and Redirecting ID Parameters in Angular

clock October 28, 2025 10:20 by author Peter

Angular is a robust front-end framework that offers multiple methods for passing data, including ID parameters, and navigating between components. Effective routing management requires knowing how to collect parameters and reroute users, whether you're developing a sophisticated enterprise application or a single-page application (SPA).

1. Using the RouterLink Directive for Easy Navigation
The simplest method for navigating between Angular components is to use the RouterLink directive. It aids in creating URLs with dynamic parameters and is directly utilized in templates.

<a [routerLink]="['/employee', employee.id]">View Details</a>

Here, employee.id is appended to the /employee route, creating a dynamic URL like /employee/123. This is a convenient way to navigate when the route parameters are known within the template.

2. Programmatic Navigation with the Router

For more complex scenarios, such as navigation that depends on some business logic or conditional operations, Angular’s Router service can be used for programmatic navigation.
import { Router } from '@angular/router';
constructor(private router: Router) {}
viewEmployeeDetails(employeeId: number) {
  this.router.navigate(['/employee', employeeId]);
}


The navigate() method takes an array in which the first element is the route path and the subsequent elements are the route parameters.

3. Retrieving Parameters Using ActivatedRoute

Once you’ve navigated to a route that includes parameters, you'll often need to retrieve those parameters in the component. Angular provides the ActivatedRoute service for this purpose.

import { ActivatedRoute } from '@angular/router';
constructor(private route: ActivatedRoute) {}
ngOnInit(): void {
  const employeeId = this.route.snapshot.paramMap.get('id');
  console.log('Employee ID:', employeeId);
}


snapshot.paramMap.get('id') retrieves the id parameter from the route. This is a synchronous method, meaning it grabs the parameter value only at the moment of the component's creation.

4. Using Observables for Dynamic Parameter Retrieval

While snapshot is useful for simple use cases, Angular applications often require handling route changes dynamically without destroying and recreating components. This is where paramMap as an Observable comes into play.
import { ActivatedRoute } from '@angular/router';
constructor(private route: ActivatedRoute) {}
ngOnInit(): void {
  this.route.paramMap.subscribe(params => {
    const employeeId = params.get('id');
    console.log('Employee ID:', employeeId);
  });
}


paramMap.subscribe() ensures that every time the id parameter changes, the new value is logged or processed accordingly. This is ideal for components that need to respond to route changes dynamically.

5. Combining Query Parameters with Navigation
Sometimes, you may want to navigate to a route and include additional information via query parameters. Angular’s Router service allows combining both route parameters and query parameters.

this.router.navigate(['/employee', employeeId], { queryParams: { ref: 'dashboard' } });

This navigates to /employee/123?ref=dashboard, where 123 is the route parameter and ref=dashboard is a query parameter.
To retrieve the query parameters in the component:

this.route.queryParams.subscribe(params => {
  const ref = params['ref'];
  console.log('Referred from:', ref);
});

6. Redirection after Form Submission
Another common use case is redirecting the user after a form submission or some action completion.
onSubmit() {
  // Assuming form submission is successful
  this.router.navigate(['/employee', newEmployeeId]);
}

7. Handling Complex Redirections with Guards
Angular also supports complex redirection scenarios using route guards. Guards can intercept navigation and redirect users based on certain conditions.
import { Injectable } from '@angular/core';
import { CanActivate, Router } from '@angular/router';
@Injectable({
  providedIn: 'root'
})
export class AuthGuard implements CanActivate {
  constructor(private router: Router) {}
  canActivate(): boolean {
    if (this.isLoggedIn()) {
      return true;
    } else {
      this.router.navigate(['/login']);
      return false;
    }
  }

  // Stand-in for your real authentication check
  private isLoggedIn(): boolean {
    return !!localStorage.getItem('token');
  }
}


If the isLoggedIn() method returns false, the user is redirected to the /login route, preventing unauthorized access.

Conclusion

Navigating between routes and handling parameters in Angular is a fundamental aspect of building dynamic and user-friendly applications. Whether you use the simple RouterLink, programmatic navigation, or complex redirection logic, Angular provides the tools to handle a wide range of routing scenarios efficiently. Happy Coding!



AngularJS Hosting Europe - HostForLIFE :: Using Pipes to Create Clear and Effective Angular Applications

clock October 23, 2025 10:12 by author Peter

The use of "pipes" is one of Angular's most potent tools for formatting and transforming data inside templates. Pipes offer a declarative mechanism to handle data before it is shown to the user, letting developers apply transformations like formatting dates, changing text case, or filtering data in an efficient and reusable way. Understanding pipes is essential to writing clean, manageable, and modular code in Angular applications.

This post covers the main distinctions between pipes and functions, how to use the built-in pipes, and how to create your own custom pipes to extend Angular's functionality. By the end, you will have a firm grasp of how to integrate pipes into your Angular projects to improve the user experience and streamline data presentation.

What is an Angular Pipe?
In Angular, a pipe is a way to transform data before it is displayed in the user interface. Pipes can be used in templates to modify or format data without having to alter the original data. Pipes are an Angular concept, not a TypeScript (TS) feature. They are a core part of Angular’s template syntax and are used to transform data in the view (template) layer of Angular applications.

Key Points about Pipes in Angular

Angular-Specific: Pipes are a built-in feature of the Angular framework designed to be used in Angular templates. They are not a native feature of JavaScript or TypeScript.
Purpose: Their primary function is to transform data in the template before it is displayed to the user. This transformation can include formatting dates, numbers, currencies, filtering arrays, or performing more complex data transformations.

Declarative Transformation: Pipes enable declarative transformation of data within the template, meaning that the logic for transforming data is cleanly abstracted away from the component’s TypeScript code.

You may be wondering why we should use Pipes when we can use functions.

Pipe vs. Function

  • Purpose: pipes transform data in the template; functions hold business logic and calculations.
  • Use case: pipes suit formatting, filtering, and sorting; functions suit complex or multi-step calculations.
  • Performance: pure pipes re-run only when their inputs change; functions called from templates run on every change detection cycle.
  • Reusability: pipes are highly reusable across templates; functions are reusable within a component or service.
  • Asynchronous handling: AsyncPipe unwraps observables and promises; functions require manual subscription logic.
  • Complexity: pipes are best for simple, declarative transformations; functions are best for complex or dynamic logic.
  • When to use: pipes when transforming data for display in the template; functions for business logic or side effects that don't belong in the template.

Types of Pipes

There are two types of Pipes.
Pure Pipe (Default): A pure pipe will only re-run when its input value changes.
    @Pipe({
      name: 'pureExample',
      pure: true // This is the default value, so you can omit this
    })
    export class PureExamplePipe implements PipeTransform {
      transform(value: any): any {
        console.log('Pure pipe executed');
        return value;
      }
    }


Impure Pipe: An impure pipe will re-run whenever Angular detects a change in the component’s state, even if the input value hasn’t changed.
@Pipe({
  name: 'impureExample',
  pure: false // Set to false to make it impure
})
export class ImpureExamplePipe implements PipeTransform {
  transform(value: any): any {
    console.log('Impure pipe executed');
    return value;
  }
}

In Angular, you can use in-built pipes or create your own.

In-built pipes
Angular provides some basic pipes that can be used.

It comes from the '@angular/common' package.

Some popular ones that can be helpful are:
CurrencyPipe, DatePipe, DecimalPipe, LowerCasePipe, UpperCasePipe and TitleCasePipe

How to use an in-built pipe?
In your ts file, define your variable. In our example, we will use the variable title.
title = 'app works!';

In your html, you can use the pipe as follows:
<h1> {{title | uppercase}} </h1>

The title string is then displayed in uppercase.

Chaining in-built pipes
Create your variable in the ts file.
amount = 123456.123456

In your html file, you can do the following.
<p>{{ amount | currency:'USD' | slice:0:10 }}</p>

The result is shown below:

Note. The 'USD' currency formatting is applied by the currency pipe, and only the first 10 characters are displayed because of the slice pipe.
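Conceptually, chained pipes behave like left-to-right function composition: the output of one stage becomes the input of the next. A framework-free sketch (the stand-in functions only approximate what CurrencyPipe and SlicePipe do):

```javascript
// Pipe chaining as left-to-right function composition.
const chain = (...fns) => (input) => fns.reduce((value, fn) => fn(value), input);

const toUsd = (n) => `$${n.toFixed(2)}`;   // rough stand-in for | currency:'USD'
const firstTen = (s) => s.slice(0, 10);    // stand-in for | slice:0:10

console.log(chain(toUsd, firstTen)(123456.123456)); // '$123456.12'
```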

Custom pipes

Run the command below to create a pipe file:
ng generate pipe <pipe-name>

For example: ng generate pipe my-custom-pipe. Once executed, the two files below will be created.

Open the file 'my-custom-pipe.pipe.ts'. You will see the following boilerplate code provided:
import { Pipe, PipeTransform } from '@angular/core';

@Pipe({
  name: 'myCustomPipe'
})
export class MyCustomPipePipe implements PipeTransform {
  transform(value: any, args?: any): any {
    return null;
  }
}


After the default class, you can add your new pipe. In our case, we will create a pipe that replaces whitespace with hyphens. Add the '@Pipe' decorator before the class so that Angular knows what follows is a pipe, and pass the pipe's name as a parameter in the decorator. The class itself must implement 'PipeTransform'. The resulting class is as follows:

@Pipe({name: 'removeWhiteSpace'})
export class RemoveWhiteSpacePipe implements PipeTransform {
  transform(value: string): string {
    return value.replace(/\s+/g, '-');
  }
}

The resulting file will be as follows (the full code):
import { Pipe, PipeTransform } from '@angular/core';

@Pipe({
  name: 'myCustomPipe'
})
export class MyCustomPipePipe implements PipeTransform {
  transform(value: any, args?: any): any {
    return null;
  }
}


@Pipe({name: 'removeWhiteSpace'})
export class RemoveWhiteSpacePipe implements PipeTransform {
  transform(value: string): string {
    return value.replace(/\s+/g, '-');
  }
}

In the ts file of your component, create the variable that will hold the value that will be transformed
textWithSpaces = 'This is a text with a lot of spaces that will be transformed';

In the html file of your component, do the following:
<p>{{ textWithSpaces | removeWhiteSpace }}</p>

The result is the following:
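Since the pipe's transform is plain string logic, it can be exercised outside Angular as well:

```javascript
// Core logic of the removeWhiteSpace pipe: collapse each run of whitespace into a hyphen.
function removeWhiteSpace(value) {
  return value.replace(/\s+/g, '-');
}

console.log(removeWhiteSpace('This is a   text')); // 'This-is-a-text'
```

Keeping the transformation this simple is what makes a pure pipe cheap to re-run.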

Conclusion
Angular pipes are a powerful and efficient way to transform and format data in your application’s templates. By using built-in pipes, you can easily manipulate data types such as strings, dates, and numbers without having to write repetitive logic in your components. Custom pipes offer even more flexibility, allowing you to create reusable, maintainable, and modular transformation logic tailored to your specific needs.

Understanding the distinction between pipes and functions is key to leveraging their full potential. While functions provide a direct way to execute code, pipes offer a declarative approach to handle transformations directly within templates, improving readability and performance.


Building dynamic and user-friendly applications greatly benefits from the ease with which data can be manipulated in the view layer, whether you're using Angular's built-in pipes or making your own. Gaining proficiency with Angular Pipes will help you write code that is clear, succinct, and compliant with best practices, which will eventually result in applications that are easier to maintain and scale.

Now that you know how to utilize and design pipes, you can add strong data transformations to your Angular applications, which will improve the efficiency and enjoyment of your development process.



European Visual Studio 2022 Hosting - HostForLIFE.eu :: What’s New in Visual Studio 2026 Insiders: Faster, Smarter, and More Modern?

clock October 15, 2025 07:45 by author Peter

For many years, Visual Studio has been the preferred IDE for C++ and .NET developers. With the release of Visual Studio 2026 Insiders, Microsoft advances developer productivity with a new user interface, significant speed improvements, and AI-powered coding assistance. This post walks through the most intriguing features in Visual Studio 2026 and what they mean for developers.

Performance Enhancements
One of the biggest complaints developers often have is slow load times and laggy performance in large solutions. Visual Studio 2026 addresses this with:

  • Faster Operations: Solution loading, builds, and debugging are now significantly quicker.
  • Optimized for Large Codebases: Both x64 and Arm64 architectures benefit from better memory management and reduced delays.

For teams working on massive enterprise applications, these improvements translate into a smoother, more productive workflow.

Deep AI Integration with GitHub Copilot
Visual Studio 2026 takes AI integration to the next level:

  • Contextual Assistance: GitHub Copilot is now embedded directly into the IDE, providing smart code suggestions as you type.
  • Automation of Repetitive Tasks: From generating boilerplate code to suggesting optimizations, AI helps you focus on problem-solving instead of repetitive coding.

This makes VS 2026 a dream for developers looking to leverage AI to accelerate their projects.

Modern UI with Fluent Design
Microsoft has revamped the Visual Studio interface to be cleaner, more modern, and visually cohesive:

  • Fluent UI Overhaul: Menus, dialogs, and toolbars now follow Fluent Design principles.
  • New Themes: Eleven new tinted themes inspired by Microsoft Edge give you better contrast and readability.
  • Intuitive Settings: Icons, spacing, and menus are redesigned for a more user-friendly experience.

A modern, streamlined interface can reduce eye strain and make coding more enjoyable.

Side-by-Side Installation
Upgrading doesn’t mean breaking your current setup:

  • Coexist with Older Versions: Install Visual Studio 2026 alongside VS 2022 without conflicts.
  • Preserve Settings and Extensions: All your previous configurations and plugins remain intact, making the transition seamless.

Full-Stack Development Support
Visual Studio 2026 is ready for modern development:

  • .NET 10 and C# 14 Support: Build high-performance apps with the latest language features.
  • C++26 Updates: New language features, STL improvements, and cross-platform development capabilities.
  • Game Development Tools: Enhanced support for Unity, Unreal Engine, and C++ game development.

Whether you’re building enterprise apps, modern desktop applications, or games, VS 2026 has the tools you need.

Insider Preview Access

Developers eager to try new features early can join the Insiders Channel:

  • Access experimental tools and previews before they are officially released.
  • Provide feedback directly to the Visual Studio team to influence future updates.

Conclusion
Visual Studio 2026 isn’t just an upgrade; it’s a major step forward for developers. From blazing-fast performance and AI-powered coding assistance to a modernized UI, this IDE helps you code smarter, faster, and more efficiently.



AngularJS Hosting Europe - HostForLIFE :: How to Use Reactive Forms to Manage Form Validation in Angular?

clock October 8, 2025 08:52 by author Peter

Create a Basic Reactive Form
Start by importing ReactiveFormsModule in your Angular module:

// app.module.ts
import { ReactiveFormsModule } from '@angular/forms';

@NgModule({
  imports: [ReactiveFormsModule, /* other imports */],
})
export class AppModule {}


Then, build a form in your component using FormBuilder:
// user-form.component.ts
import { Component } from '@angular/core';
import { FormBuilder, FormGroup, Validators } from '@angular/forms';

@Component({ selector: 'app-user-form', templateUrl: './user-form.component.html' })
export class UserFormComponent {
  userForm: FormGroup;

  constructor(private fb: FormBuilder) {
    this.userForm = this.fb.group({
      name: ['', [Validators.required, Validators.minLength(2)]],
      email: ['', [Validators.required, Validators.email]],
      password: ['', [Validators.required, Validators.minLength(6)]],
    });
  }
}


In the template, bind the form and controls:
<!-- user-form.component.html -->
<form [formGroup]="userForm" (ngSubmit)="onSubmit()">
  <label>
    Name
    <input formControlName="name" />
  </label>
  <div *ngIf="userForm.get('name')?.touched && userForm.get('name')?.invalid">
    <small *ngIf="userForm.get('name')?.errors?.required">Name is required.</small>
    <small *ngIf="userForm.get('name')?.errors?.minlength">Name must be at least 2 characters.</small>
  </div>

  <label>
    Email
    <input formControlName="email" />
  </label>
  <div *ngIf="userForm.get('email')?.touched && userForm.get('email')?.invalid">
    <small *ngIf="userForm.get('email')?.errors?.required">Email is required.</small>
    <small *ngIf="userForm.get('email')?.errors?.email">Enter a valid email.</small>
  </div>

  <button type="submit" [disabled]="userForm.invalid">Submit</button>
</form>


Built-in Validators
Angular provides several built-in validators:

  • Validators.required — field must have a value.
  • Validators.email — value must be a valid email.
  • Validators.min / Validators.max — numeric limits.
  • Validators.minLength / Validators.maxLength — string length limits.
  • Validators.pattern — regex-based validation.

You can combine validators in an array for a control, as shown in the example above.

Custom Synchronous Validators

For rules that don’t exist out of the box (e.g., username format), write a custom validator function that returns either null (valid) or an error object:
import { AbstractControl, ValidationErrors } from '@angular/forms';

export function usernameValidator(control: AbstractControl): ValidationErrors | null {
  const value = control.value as string;
  if (!value) return null;
  const valid = /^[a-z0-9_]+$/.test(value);
  return valid ? null : { invalidUsername: true };
}

// usage in form builder
this.userForm = this.fb.group({
  username: ['', [Validators.required, usernameValidator]],
});

Show helpful messages in the template when invalidUsername exists.
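The validation rule itself is just a regular expression test, which you can verify outside Angular:

```javascript
// The rule behind usernameValidator: lowercase letters, digits, and underscores only.
const USERNAME_RE = /^[a-z0-9_]+$/;
const isValidUsername = (value) => USERNAME_RE.test(value);

console.log(isValidUsername('john_doe1')); // true
console.log(isValidUsername('John Doe'));  // false
```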

Cross-Field Validation (Password Match)

Some validations depend on multiple controls. Use a validator on the FormGroup:
function passwordMatchValidator(group: AbstractControl): ValidationErrors | null {
  const password = group.get('password')?.value;
  const confirm = group.get('confirmPassword')?.value;
  return password === confirm ? null : { passwordsMismatch: true };
}

this.userForm = this.fb.group({
  password: ['', Validators.required],
  confirmPassword: ['', Validators.required],
}, { validators: passwordMatchValidator });

In the template, show the group-level error:
<div *ngIf="userForm.errors?.passwordsMismatch && userForm.touched">
  <small>Passwords do not match.</small>
</div>


Async Validators (e.g., Check Email Uniqueness)

Async validators are useful for server checks like "is this email taken?". They return an Observable or Promise.
import { AbstractControl } from '@angular/forms';
import { map } from 'rxjs/operators';
import { of } from 'rxjs';

function uniqueEmailValidator(apiService: ApiService) {
  return (control: AbstractControl) => {
    if (!control.value) return of(null);
    return apiService.checkEmail(control.value).pipe(
      map(isTaken => (isTaken ? { emailTaken: true } : null))
    );
  };
}

// in component
this.userForm = this.fb.group({
  email: ['', {
    validators: [Validators.required, Validators.email],
    asyncValidators: [uniqueEmailValidator(this.apiService)],
    updateOn: 'blur' // run async validator on blur to reduce calls
  }]
});

Use updateOn: 'blur' to prevent calling the server on every keystroke.

Displaying Validation State and UX Tips

  • Show errors only after user interaction — use touched or dirty to avoid overwhelming users with errors on load.
  • Disable submit while invalid — [disabled]="userForm.invalid" prevents sending bad data.
  • Focus the first invalid control — on submit, set focus to the first invalid field for better UX.
  • Use updateOn: 'blur' or debounce — reduces validation frequency and server calls.

Example to focus the first invalid control (this assumes an ElementRef is injected in the component's constructor):
constructor(private el: ElementRef) {}

onSubmit() {
  if (this.userForm.invalid) {
    const invalidControl = this.el.nativeElement.querySelector('.ng-invalid');
    invalidControl?.focus();
    return;
  }
  // process valid form
}

Reacting to Value Changes and Live Validation
You can subscribe to valueChanges for any control or the whole form to implement live validation messages, dynamic rules, or enable/disable fields.
this.userForm.get('country')?.valueChanges.subscribe(country => {
  if (country === 'US') {
    this.userForm.get('state')?.setValidators([Validators.required]);
  } else {
    this.userForm.get('state')?.clearValidators();
  }
  this.userForm.get('state')?.updateValueAndValidity();
});

Remember to unsubscribe in ngOnDestroy or use the takeUntil pattern.

Integrating with Backend Validation
Server-side validation is the final source of truth. When the backend returns validation errors, map them to form controls so users can correct them:
// after API error response
handleServerErrors(errors: Record<string, string[]>) {
  Object.keys(errors).forEach(field => {
    const control = this.userForm.get(field);
    if (control) {
      control.setErrors({ server: errors[field][0] });
    }
  });
}


Show control.errors.server messages in the template.

Testing Form Validation
Unit test reactive forms by creating the component, setting values, and asserting validity:
it('should invalidate empty email', () => {
  component.userForm.get('email')?.setValue('');
  expect(component.userForm.get('email')?.valid).toBe(false);
});


For async validators, use fakeAsync and tick() to simulate time.

Accessibility (A11y) Considerations

  • Always link error messages to inputs with aria-describedby.
  • Use clear error language and avoid technical terms.
  • Ensure focus management sends keyboard users to errors on submit.

Example
<input id="email" formControlName="email" aria-describedby="emailError" />
<div id="emailError" *ngIf="userForm.get('email')?.invalid">
  <small>Enter a valid email address.</small>
</div>

Performance Tips and Best Practices

  • Use OnPush change detection where appropriate to reduce re-renders.
  • Avoid heavy computation inside valueChanges subscribers.
  • Use debounceTime for expensive validations or server calls:

this.userForm.get('search')?.valueChanges.pipe(debounceTime(300)).subscribe(...);
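debounceTime works by resetting a timer on every emission and only letting a value through after a quiet period. A minimal framework-free sketch of the same idea:

```javascript
// Debounce: fn runs only after `delay` ms with no further calls.
function debounce(fn, delay) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delay);
  };
}

// Three rapid calls collapse into a single invocation with the last value.
const search = debounce((term) => console.log('searching for', term), 300);
search('a');
search('an');
search('angular'); // only this call survives, ~300ms later
```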

Clean up subscriptions with takeUntil or async pipe.

Summary
An effective, testable method for managing form validation is provided by Angular's Reactive Forms. For common rules, use the built-in validators; for special cases, create your own sync and async validators; and for cross-field checks, such as password confirmation, use group validators. Enhance the user experience by integrating server-side errors using setErrors, emphasizing the initial incorrect control, and displaying errors upon interaction. Use performance techniques like debouncing and OnPush change detection, test your validations, and consider accessibility.



Europe mySQL Hosting - HostForLIFEASP.NET :: What happens if you restart the database service provided by WAMP, MySQL?

clock October 6, 2025 08:59 by author Peter

What happens when you restart MySQL (WAMP’s database service)?

  • Active connections are dropped → any application connected to MySQL will lose its session.
  • Running queries/transactions are aborted → if a query was in the middle of writing, MySQL will roll back that transaction (thanks to transaction logs in InnoDB).
  • Tables/data themselves are safe → MySQL ensures durability, so committed data is not lost.
  • Non-transactional tables (MyISAM) are riskier → if you still have MyISAM tables, they can become corrupted if a write was in progress when the service stopped.

Risks of Restarting Every 3 Hours

  • Apps/websites using the DB may fail while the service is down.
  • Any batch jobs, cron jobs, or API calls during restart will error out.
  • If you restart during heavy writes, performance may be affected briefly.

Tables themselves won’t get corrupted in InnoDB, but MyISAM tables can.

Safer Alternatives
Only restart if the service fails

Instead of restarting every 3 hours, configure Task Scheduler to start the service if it’s stopped (health check).

Example batch
sc query wampmysqld64 | find "RUNNING" >nul
if %errorlevel%==1 net start wampmysqld64
sc query wampapache64 | find "RUNNING" >nul
if %errorlevel%==1 net start wampapache64


This way it only starts services if they’re not running.

Schedule a restart during off-peak hours

e.g. once daily at 3 AM, when traffic is minimal.

Use MySQL config for stability
Instead of forced restarts, tune MySQL memory, query cache, etc., so it doesn’t need frequent restarting.
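As an illustration, a few commonly tuned settings in the MySQL option file (my.ini under the WAMP install). The values below are hypothetical starting points, not recommendations for every machine; size them to your RAM and workload.

```ini
[mysqld]
# Hypothetical starting points - adjust to your hardware and traffic.
innodb_buffer_pool_size = 512M   # main InnoDB cache; the biggest stability lever
max_connections = 150            # cap concurrent sessions to avoid memory spikes
wait_timeout = 600               # close idle connections instead of restarting
```

After editing, restart the service once so the changes take effect.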

Answer to your question
No, restarting won’t corrupt data in InnoDB tables.

Yes, it can cause temporary downtime and aborted queries, so apps may face errors.

If you use MyISAM tables, there is a small risk of corruption.



Node.js Hosting - HostForLIFE :: Understanding package.json and package-lock.json in Node.js

clock October 3, 2025 08:48 by author Peter

1. What is package.json?
package.json is the heart of any Node.js project. It declares your project’s dependencies and provides metadata about your application.


Key Features

  • Lists dependencies and devDependencies.
  • Specifies version ranges using semantic versioning (^, ~).
  • Includes project metadata like name, version, scripts, author, and license.
  • Human-readable and editable.

{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "lodash": "^4.17.21"
  },
  "devDependencies": {
    "jest": "~29.0.0"
  },
  "scripts": {
    "start": "node index.js"
  }
}


Key Point: package.json specifies what versions your project is compatible with, not the exact installed version.
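What ^ and ~ ranges actually allow can be made concrete with a small, simplified checker (it ignores pre-release tags and the special 0.x caret rules): ^1.2.3 permits >=1.2.3 but <2.0.0, while ~1.2.3 permits >=1.2.3 but <1.3.0.

```typescript
// Simplified semver range check for ^ and ~ (no pre-release tags,
// no special handling of 0.x caret semantics).
function parse(v: string): [number, number, number] {
  const [maj, min, pat] = v.split(".").map(Number);
  return [maj, min, pat];
}

function satisfies(version: string, range: string): boolean {
  const op = range[0];
  if (op !== "^" && op !== "~") return version === range; // exact pin
  const base = parse(range.slice(1));
  const v = parse(version);
  const gte =
    v[0] > base[0] ||
    (v[0] === base[0] &&
      (v[1] > base[1] || (v[1] === base[1] && v[2] >= base[2])));
  if (op === "^") return gte && v[0] === base[0]; // same major only
  return gte && v[0] === base[0] && v[1] === base[1]; // ~: same minor too
}

const caretPatch = satisfies("4.17.21", "^4.17.20"); // true: same major
const caretMajor = satisfies("5.0.0", "^4.17.20");   // false: major bump
const tildeMinor = satisfies("4.18.0", "~4.17.20");  // false: minor bump
```

This is why `"lodash": "^4.17.21"` in package.json can legitimately install 4.18.x on a fresh machine, and why the lock file exists.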

2. What is package-lock.json?
package-lock.json is automatically generated by npm to lock the exact versions of every installed package, including nested dependencies.

Key Features

  • Records the exact version installed for each package.
  • Contains resolved URLs and integrity hashes to ensure packages are not tampered with.
  • Records nested dependencies (dependencies of dependencies).
  • Not intended for manual editing.

{
  "name": "my-app",
  "lockfileVersion": 3,
  "dependencies": {
    "lodash": {
      "version": "4.17.21",
      "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
      "integrity": "sha512-xyz"
    }
  }
}


Key Point: package-lock.json ensures that every environment installs exactly the same versions, even if package.json allows ranges.

3. Main Differences Between package.json and package-lock.json

Feature                | package.json                            | package-lock.json
-----------------------|-----------------------------------------|--------------------------------------------
Purpose                | Declares dependencies and project info  | Locks exact versions of installed packages
Edited by              | Developer                               | npm automatically
Version                | Can specify ranges (^, ~)               | Exact versions installed
Nested dependencies    | Not recorded                            | Fully recorded
Effect on installation | npm uses ranges to resolve versions     | Ensures consistent installs
Human-readable?        | Yes                                     | Not really

4. How npm install Works

The npm install command is used to install packages based on package.json and package-lock.json.

# Install all dependencies listed in package.json
npm install

# Install a specific package and save it to dependencies
npm install lodash

# Install a package as a dev dependency
npm install --save-dev jest

# Install a package globally
npm install -g typescript


Process

  • Reads package.json for dependencies.
  • Resolves the latest versions allowed by version ranges (if package-lock.json doesn’t exist).
  • Downloads packages to node_modules.
  • Updates or creates package-lock.json with exact versions.


5. What Happens If You Delete package-lock.json?

If package-lock.json is deleted and you run:

npm install

  • npm resolves the latest versions matching the ranges in package.json.
  • It downloads new packages and regenerates package-lock.json.
  • This may result in different versions from the previous install, which could break your code.

Safe scenarios for deleting:

  • Intentionally updating packages.
  • Starting a fresh project or refreshing dependencies.

Why are both files important?

  • package.json defines what your project needs.
  • package-lock.json ensures everyone gets the exact same package versions for consistent development and production environments.

Conclusion
package.json = “What I want” (dependency ranges and project info)
package-lock.json = “Exactly what I got” (locked versions)


Deleting package-lock.json can lead to installing newer package versions, which may cause unexpected issues. Always commit package-lock.json to version control for consistency.



About HostForLIFE

HostForLIFE is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We have offered the latest Windows 2019 Hosting, ASP.NET 5 Hosting, ASP.NET MVC 6 Hosting and SQL 2019 Hosting.

