European Windows 2019 Hosting BLOG

BLOG about Windows 2019 Hosting and SQL 2019 Hosting - Dedicated to European Windows Hosting Customer

European SQL Server 2022 Hosting :: Creating a Data Versioning System that stores each record's history (SQL Server + .NET)

November 19, 2025 07:18 by author Peter

A data versioning system records every modification made to your data: who changed what, when the change happened, and how to audit or restore previous versions. Auditing, debugging, GDPR and financial compliance, and capabilities such as "time travel", record revert, and change comparison all depend on it.

This article provides a workable, production-ready design, covering several implementation alternatives (built-in temporal tables, manual history tables with triggers, and application-level capture), ER diagrams, flowcharts, SQL scripts, ASP.NET Core patterns, and best practices.

Two main approaches (summary)

  • System-versioned temporal tables (SQL Server feature, simple, performant, automatic).
    • Pros: automatic row-versioning, built-in time travel query, efficient.
    • Cons: less flexible metadata (who/why), harder to store JSON diffs, needs SQL Server 2016+.
  • Manual history tables + triggers / stored procedures.
    • Pros: full control (store user, reason, JSON diffs, tx ids), easier to extend.
    • Cons: more code, need to manage triggers and retention.

Additionally, application-level capture (EF Core interceptors, change tracker) can supplement either approach with user/context information and business logic.

Option A: System-versioned temporal table (recommended when available)
1. Create temporal table

CREATE TABLE dbo.Customer
(
    CustomerId   UNIQUEIDENTIFIER NOT NULL PRIMARY KEY DEFAULT NEWID(),
    Name         NVARCHAR(200) NOT NULL,
    Email        NVARCHAR(200),
    Balance      DECIMAL(18,2) DEFAULT 0,
    SysStartTime DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    SysEndTime   DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomerHistory));

SQL Server creates dbo.CustomerHistory automatically with the row versions. But CustomerHistory does not include ChangedBy or ChangeReason.

2. Add audit metadata (who/why)
You can store user info in a separate audit table, extend the approach with triggers that insert augmented history rows, or have the application write to an audit table itself.
Example: keep CustomerHistoryMeta where you store (HistoryRowPK, ChangedBy, ChangeReason, TxId) linked to history rows using SysStartTime and CustomerId as keys.
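
A minimal sketch of such a metadata table, assuming the Customer temporal table above (names are illustrative):

CREATE TABLE dbo.CustomerHistoryMeta
(
    MetaId       BIGINT IDENTITY PRIMARY KEY,
    CustomerId   UNIQUEIDENTIFIER NOT NULL,
    SysStartTime DATETIME2 NOT NULL,        -- matches SysStartTime of the corresponding row version
    ChangedBy    NVARCHAR(200) NOT NULL,
    ChangeReason NVARCHAR(500) NULL,
    TxId         UNIQUEIDENTIFIER NULL
);

CREATE INDEX IX_CustomerHistoryMeta_Key ON dbo.CustomerHistoryMeta(CustomerId, SysStartTime);

The application inserts one row here in the same transaction as each write to dbo.Customer, so joining on (CustomerId, SysStartTime) pairs every version with its who/why metadata.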

3. Query history (time travel)
-- Get record as of a point in time
SELECT *
FROM dbo.Customer
FOR SYSTEM_TIME AS OF '2025-11-01 10:00:00'
WHERE CustomerId = '...';

-- Get all versions
SELECT * FROM dbo.Customer
FOR SYSTEM_TIME ALL
WHERE CustomerId = '...'
ORDER BY SysStartTime;


4. Revert to historical version (pattern)
To revert, read the historical row and insert a new current row (or update the current row). Do not "restore" the history row directly; write a new change so the revert itself is also tracked.
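
A hedged sketch of that pattern for the Customer temporal table above:

-- Overwrite the current row with its values as of a point in time;
-- the UPDATE itself produces a new history row, so the revert is tracked.
DECLARE @AsOf DATETIME2 = '2025-11-01 10:00:00';

UPDATE c
SET c.Name    = h.Name,
    c.Email   = h.Email,
    c.Balance = h.Balance
FROM dbo.Customer AS c
JOIN dbo.Customer FOR SYSTEM_TIME AS OF @AsOf AS h
    ON h.CustomerId = c.CustomerId
WHERE c.CustomerId = '...';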

Option B: Manual history table + trigger (flexible, full metadata)
1. Schema (example for table Customer)
CREATE TABLE dbo.Customer
(
    CustomerId UNIQUEIDENTIFIER PRIMARY KEY DEFAULT NEWID(),
    Name NVARCHAR(200),
    Email NVARCHAR(200),
    Balance DECIMAL(18,2)
);

CREATE TABLE dbo.CustomerHistory
(
    HistoryId BIGINT IDENTITY PRIMARY KEY,
    CustomerId UNIQUEIDENTIFIER NOT NULL,
    ChangeType CHAR(1) NOT NULL, -- I/U/D
    ChangedBy NVARCHAR(200) NULL,
    ChangeReason NVARCHAR(500) NULL,
    ChangedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME(),
    PayloadJson NVARCHAR(MAX) NOT NULL, -- full row snapshot as JSON
    TxId UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID()
);

CREATE INDEX IX_CustomerHistory_CustomerId ON dbo.CustomerHistory(CustomerId);
CREATE INDEX IX_CustomerHistory_TxId ON dbo.CustomerHistory(TxId);


2. Trigger to capture changes
CREATE TRIGGER trg_Customer_Audit
ON dbo.Customer
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @txid UNIQUEIDENTIFIER = NEWID();
    DECLARE @changedBy NVARCHAR(200) = SUSER_SNAME(); -- SQL login as fallback; prefer SESSION_CONTEXT set by the app (see below)

    -- rows only in INSERTED -> Insert
    INSERT INTO dbo.CustomerHistory (CustomerId, ChangeType, ChangedBy, ChangeReason, PayloadJson, TxId)
    SELECT i.CustomerId, 'I', @changedBy, NULL, (SELECT i.* FOR JSON PATH, WITHOUT_ARRAY_WRAPPER), @txid
    FROM inserted i
    WHERE NOT EXISTS (SELECT 1 FROM deleted d WHERE d.CustomerId = i.CustomerId);

    -- rows in both INSERTED and DELETED -> Update (capture new snapshot, or both old & new as needed)
    INSERT INTO dbo.CustomerHistory (CustomerId, ChangeType, ChangedBy, ChangeReason, PayloadJson, TxId)
    SELECT u.CustomerId, 'U', @changedBy, NULL, (SELECT u.* FOR JSON PATH, WITHOUT_ARRAY_WRAPPER), @txid
    FROM inserted u
    WHERE EXISTS (SELECT 1 FROM deleted d WHERE d.CustomerId = u.CustomerId);

    -- rows only in DELETED -> Delete
    INSERT INTO dbo.CustomerHistory (CustomerId, ChangeType, ChangedBy, ChangeReason, PayloadJson, TxId)
    SELECT d.CustomerId, 'D', @changedBy, NULL, (SELECT d.* FOR JSON PATH, WITHOUT_ARRAY_WRAPPER), @txid
    FROM deleted d
    WHERE NOT EXISTS (SELECT 1 FROM inserted i WHERE i.CustomerId = d.CustomerId);
END
Notes

  • SUSER_SNAME() returns the SQL login; for the application user, prefer setting SESSION_CONTEXT (or CONTEXT_INFO) before the DML and reading it in the trigger (see below).
  • PayloadJson stores a full state snapshot; you may instead store only the changed columns (a diff) if storage is a concern.

3. Passing application user into trigger
Before executing DML, set session context from application:
EXEC sp_set_session_context 'AppUser', 'user@example.com';
Then inside trigger:

DECLARE @changedBy NVARCHAR(200) = CONVERT(NVARCHAR(200), SESSION_CONTEXT(N'AppUser'));

This makes ChangedBy accurate.
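
The important detail is that sp_set_session_context and the DML must run on the same connection/session. A minimal sketch of a batch the application could send (values are placeholders):

EXEC sp_set_session_context @key = N'AppUser', @value = N'user@example.com';

UPDATE dbo.Customer
SET Email = 'new@example.com'
WHERE CustomerId = '...';
-- the trigger reads SESSION_CONTEXT(N'AppUser') and records it as ChangedBy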

4. Stored procedures to query history
CREATE PROCEDURE usp_GetCustomerHistory
    @CustomerId UNIQUEIDENTIFIER
AS
BEGIN
    SELECT HistoryId, ChangeType, ChangedBy, ChangeReason, ChangedAt, PayloadJson
    FROM dbo.CustomerHistory
    WHERE CustomerId = @CustomerId
    ORDER BY ChangedAt DESC;
END


5. Revert procedure (create new record state from history)
CREATE PROCEDURE usp_RevertCustomerToHistory
    @HistoryId BIGINT,
    @RevertedBy NVARCHAR(200)
AS
BEGIN
    DECLARE @payload NVARCHAR(MAX);
    SELECT @payload = PayloadJson FROM dbo.CustomerHistory WHERE HistoryId = @HistoryId;

    -- parse JSON into columns and update current table
    UPDATE dbo.Customer
    SET Name = JSON_VALUE(@payload, '$.Name'),
        Email = JSON_VALUE(@payload, '$.Email'),
        Balance = TRY_CAST(JSON_VALUE(@payload, '$.Balance') AS DECIMAL(18,2))
    WHERE CustomerId = JSON_VALUE(@payload, '$.CustomerId');

    -- insert a history record marking the revert (the audit trigger will also capture the UPDATE above)
    INSERT INTO dbo.CustomerHistory (CustomerId, ChangeType, ChangedBy, ChangeReason, PayloadJson, TxId)
    VALUES (JSON_VALUE(@payload, '$.CustomerId'), 'U', @RevertedBy, 'Revert to HistoryId ' + CAST(@HistoryId AS NVARCHAR(20)), @payload, NEWID());
END


Application-level capture (EF Core interceptor) - add user / reason
If you use EF Core, intercept SaveChanges to write audit to history table so you have full contextual data (user id, IP, reason).

Example (simplified)
public class AuditSaveChangesInterceptor : SaveChangesInterceptor
{
    private readonly IHttpContextAccessor _http;
    public AuditSaveChangesInterceptor(IHttpContextAccessor http) => _http = http;

    public override async ValueTask<InterceptionResult<int>> SavingChangesAsync(DbContextEventData eventData,
        InterceptionResult<int> result, CancellationToken cancellationToken = default)
    {
        var ctx = eventData.Context;
        if (ctx is null)
            return await base.SavingChangesAsync(eventData, result, cancellationToken);

        var user = _http.HttpContext?.User?.Identity?.Name ?? "system";

        // snapshot the tracked Customer entries first so adding history rows below
        // does not modify the collection while it is being enumerated
        var entries = ctx.ChangeTracker.Entries()
            .Where(e => e.Entity is Customer &&
                        (e.State == EntityState.Added
                         || e.State == EntityState.Modified
                         || e.State == EntityState.Deleted))
            .ToList();

        foreach (var entry in entries)
        {
            // for deletes capture the last known values, otherwise the new values
            var values = entry.State == EntityState.Deleted ? entry.OriginalValues : entry.CurrentValues;
            var payload = JsonSerializer.Serialize(values.ToObject());

            var history = new CustomerHistory
            {
                CustomerId = (Guid)entry.Property("CustomerId").CurrentValue,
                ChangeType = entry.State == EntityState.Added ? "I" : entry.State == EntityState.Deleted ? "D" : "U",
                ChangedBy = user,
                PayloadJson = payload,
                ChangedAt = DateTime.UtcNow,
                TxId = Guid.NewGuid()
            };
            ctx.Set<CustomerHistory>().Add(history);
        }

        return await base.SavingChangesAsync(eventData, result, cancellationToken);
    }
}

Register interceptor in Program.cs for EF Core.

Benefits: you have direct access to user principal and request info.

Query patterns & useful API endpoints

  • GET /api/entity/{id}/history — list versions.
  • GET /api/entity/{id}/history/{historyId} — fetch specific version.
  • GET /api/entity/{id}/diff?from=histA&to=histB — return field-level diff.
  • POST /api/entity/{id}/revert — revert to historyId (authz required).
  • GET /api/entity/{id}/asOf?timestamp=... — time-travel view (easy with temporal tables).

Example C# controller methods using Dapper/EF Core — omitted for brevity (pattern same as stored procs).

Field-level diff (practical approach)
Store PayloadJson for each version (full row).

To compute diff, fetch two JSON objects and compare keys; report changed fields, old and new values. Use server code (C#) to parse JSON into Dictionary<string, object> and compare.

Example diff function (C# pseudocode)
Dictionary<string, object> left = JsonSerializer.Deserialize<Dictionary<string, object>>(leftJson);
Dictionary<string, object> right = JsonSerializer.Deserialize<Dictionary<string, object>>(rightJson);
var diffs = new List<Diff>();
foreach (var key in left.Keys.Union(right.Keys))
{
    left.TryGetValue(key, out var lv);
    right.TryGetValue(key, out var rv);
    // with System.Text.Json the values are JsonElement instances, so compare
    // their string representations rather than the boxed objects themselves
    if (!string.Equals(lv?.ToString(), rv?.ToString()))
        diffs.Add(new Diff { Field = key, Old = lv?.ToString(), New = rv?.ToString() });
}

Return diffs to UI.

Retention, compression and archiving

  • Keep history for required retention window (e.g., 2–7 years) per compliance.
  • Archive older history to cheaper storage (Parquet/JSON files in Blob/S3). Provide a process to purge archived rows.
  • Consider compressing PayloadJson (e.g., with COMPRESS/gzip) if history grows large; store it as VARBINARY or use table compression features (see the sketch after this list).
  • Avoid storing huge BLOBs repeatedly; store references instead.
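
A hedged sketch of the compression option, assuming you add a VARBINARY(MAX) column such as PayloadCompressed to the history table:

-- store the GZip-compressed snapshot instead of (or alongside) the raw JSON
INSERT INTO dbo.CustomerHistory (CustomerId, ChangeType, ChangedBy, PayloadJson, PayloadCompressed)
VALUES (@CustomerId, 'U', @ChangedBy, @PayloadJson, COMPRESS(@PayloadJson));

-- read it back as NVARCHAR when needed
SELECT CAST(DECOMPRESS(PayloadCompressed) AS NVARCHAR(MAX)) AS PayloadJson
FROM dbo.CustomerHistory
WHERE HistoryId = @HistoryId;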

Concurrency, transaction and tx-id handling
Use a TxId (and optionally a TransactionCommittedAt) column in history to group related row changes into a single logical transaction. For the trigger approach, have the application set the TxId via SESSION_CONTEXT (or CONTEXT_INFO) before the DML, and use that TxId in the trigger instead of NEWID() if you want grouping.

Example in application
EXEC sp_set_session_context @key = 'TxId', @value = '...';

Trigger reads

DECLARE @txid UNIQUEIDENTIFIER = CONVERT(uniqueidentifier, SESSION_CONTEXT(N'TxId'));

This allows you to revert a complete transaction by selecting all history rows with same TxId.
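
For example, to pull everything that belongs to one logical transaction:

SELECT CustomerId, ChangeType, ChangedBy, ChangedAt, PayloadJson
FROM dbo.CustomerHistory
WHERE TxId = @TxId
ORDER BY ChangedAt;
-- iterate over these rows (or pass them to usp_RevertCustomerToHistory) to undo the whole transaction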

Security & GDPR considerations
Protect personal data: encryption at rest and in transit.

Provide delete/forget workflows: redact PII in history if user requests (but preserve audit trail per law). Consider pseudonymisation.

Audit access to history itself (who viewed history). Log access events separately.

Restrict revert endpoints to authorized roles.

Performance considerations

  • Index history tables on (EntityId, ChangedAt).
  • Partition large history tables by date (monthly/yearly); see the sketch after this list.
  • Use compression (ROW/PAGE) on history partitions.
  • For high write volumes prefer temporal tables which are highly optimised, or use append-only history tables with minimal indexes to speed inserts; create read-optimized projections for reporting.
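
A hedged sketch of the indexing and partitioning idea (partition boundaries and names are illustrative):

-- monthly partitioning on ChangedAt plus the (EntityId, ChangedAt) index pattern
CREATE PARTITION FUNCTION pf_HistoryMonthly (DATETIME2)
AS RANGE RIGHT FOR VALUES ('2025-01-01', '2025-02-01', '2025-03-01');

CREATE PARTITION SCHEME ps_HistoryMonthly
AS PARTITION pf_HistoryMonthly ALL TO ([PRIMARY]);

CREATE INDEX IX_CustomerHistory_CustomerId_ChangedAt
ON dbo.CustomerHistory (CustomerId, ChangedAt)
WITH (DATA_COMPRESSION = PAGE)
ON ps_HistoryMonthly (ChangedAt);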

Testing checklist

  • Verify INSERT / UPDATE / DELETE produce history rows with correct payload.
  • Test user propagation via SESSION_CONTEXT.
  • Test revert and confirm a new history row created.
  • Test time-travel queries for temporal tables.
  • Test diff outputs for correctness.
  • Test retention/archiving and reimport from archive.
  • Load-test writes and history insert rate.

Example migration & scripts (summary)

  • Add history table schema for each entity.
  • Add triggers or enable temporal versioning.
  • Add SyncWatermark table if you need cross-system sync.
  • Add stored procedures to fetch history, revert, cleanup/archive.
  • Add application interceptor to set SESSION_CONTEXT('AppUser') and optional TxId.
  • Build APIs for UI and admin tasks.

Final recommendations

  • If you are using SQL Server 2016+ and you simply need time travel and row-version history queries: use system-versioned temporal tables (simpler, performant).
  • Use manual history tables and triggers, combined with application-level session context, if you need richer metadata (who/why), JSON diffs, or more elaborate revert/grouping.
  • Or combine the two: use temporal tables for automatic versioning and keep a separate AuditMeta table with user/tx metadata, written by the application or a trigger, to link history rows to that metadata.
  • Automate monitoring, archiving, and retention.

HostForLIFEASP.NET SQL Server 2022 Hosting



European SQL Server 2022 Hosting :: Identifying SQL Server "Cannot Initialize Data Source Object" Errors

November 11, 2025 08:04 by author Peter

While working with Linked Servers, OPENQUERY, or OPENROWSET in SQL Server, you may encounter one of the most common and frustrating errors:
OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)" returned message "Cannot initialize data source object of OLE DB provider".

This error usually occurs when SQL Server is unable to access or initialize the external data source (like Excel, Access, or another SQL Server).

In this guide, we’ll break down:

  • The main causes of this error
  • Step-by-step troubleshooting
  • Common scenarios (Excel, Access, Linked Servers)
  • Configuration & security fixes

Common Scenarios Where the Error Appears

  • Querying Excel via OPENROWSET:
    SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0;Database=C:\Data\Sales.xlsx;HDR=YES', 'SELECT * FROM [Sheet1$]');
  • Accessing an Access database:
    SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'C:\Data\Customer.accdb';'admin';'', 'SELECT * FROM Customers');
  • Using a linked server:
    SELECT * FROM OPENQUERY(ExcelLink, 'SELECT * FROM [Sheet1$]');

If any of these fail, you’ll often see the “Cannot initialize data source object” error.

Root Causes of the Error

Here are the most common reasons this error occurs (each is addressed in the steps below):

  • The SQL Server service account does not have permission to read the source file.
  • Ad Hoc Distributed Queries is disabled on the server.
  • The required OLE DB provider (e.g., Microsoft.ACE.OLEDB.12.0) is not installed or registered.
  • A 32-bit/64-bit mismatch between SQL Server and the installed provider.
  • The file is open, locked, or the path/sheet name is wrong.
  • The linked server is misconfigured or does not have the Data Access option enabled.

Step-by-Step Troubleshooting Guide
Step 1: Check File Permissions

  • Locate the file (e.g., C:\Data\Sales.xlsx).
  • Right-click → Properties → Security tab.
  • Ensure the SQL Server service account (like NT SERVICE\MSSQLSERVER or Network Service) has read/write permissions.

If not sure which account SQL uses, run:
SELECT servicename, service_account
FROM sys.dm_server_services;


Step 2: Enable Ad Hoc Distributed Queries
Run the following in SSMS:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;


Then re-run your OPENROWSET or OPENDATASOURCE query.

Step 3: Verify OLE DB Provider Installation
Check if the required OLE DB provider is installed:
For Excel/Access → Microsoft.ACE.OLEDB.12.0
For SQL Server-to-SQL Server → SQLNCLI or MSOLEDBSQL


You can verify it using:
EXEC master.dbo.sp_enum_oledb_providers;

Step 4: Check 32-bit vs 64-bit Compatibility
SQL Server (64-bit) requires a 64-bit version of the OLE DB provider, because OPENROWSET and linked-server queries run inside the SQL Server process.

32-bit client tools (for example the 32-bit Import/Export Wizard) need the 32-bit provider, so you may have to install both versions side by side.

Step 5: Ensure File Is Closed and Accessible
If the Excel file is open by another user or locked for editing, SQL can’t read it.
Close the file and retry.

If it’s on a network path, ensure:
\\ServerName\SharedFolder\File.xlsx

is accessible from the SQL Server machine using the same service account credentials.

Step 6: Test Connection String
Try running this minimal query:
SELECT *
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Data\Test.xlsx;HDR=YES',
    'SELECT TOP 5 * FROM [Sheet1$]'
);

If it works with a simple file, the issue is likely your original path or sheet name.

Step 7: Configure Linked Server Options
If using Linked Server for Excel or Access:
EXEC sp_addlinkedserver
    @server='ExcelLink',
    @srvproduct='Excel',
    @provider='Microsoft.ACE.OLEDB.12.0',
    @datasrc='C:\Data\Sales.xlsx',
    @provstr='Excel 12.0;HDR=YES';

EXEC sp_serveroption 'ExcelLink', 'Data Access', 'true';

Then test:
SELECT * FROM OPENQUERY(ExcelLink, 'SELECT * FROM [Sheet1$]');


Advanced Troubleshooting Tips

  • Restart the SQL Server service after installing new OLE DB drivers.
  • If you're running SQL Server Express, ensure it supports distributed queries.
  • Avoid UNC paths (\\Server\Folder\File.xlsx) unless the SQL Server service account has permission to the share.

Check Event Viewer logs under Application → MSSQLSERVER for detailed provider errors.

Alternative Approaches

If the problem persists, consider alternatives:

  • Use Import/Export Wizard (in SSMS) instead of OPENROWSET.
  • Use BULK INSERT for CSV data (see the sketch after this list).
  • For Access, use ODBC Linked Tables or .NET Integration in your application layer.
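
A hedged BULK INSERT sketch, assuming a staging table dbo.SalesStaging that matches the CSV layout:

BULK INSERT dbo.SalesStaging
FROM 'C:\Data\Sales.csv'
WITH (FIRSTROW = 2,             -- skip the header row
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n');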

HostForLIFEASP.NET SQL Server 2022 Hosting



European SQL Server 2022 Hosting :: Developers' Best Practices for SQL Server Security

November 3, 2025 06:26 by author Peter

One of the most crucial components of any database system is security. As developers, we often focus on performance and overlook the fact that a single unsafe query or over-broad permission can expose private company data. In this post, let's examine the SQL Server security best practices every developer should follow, with examples, explanations, and a straightforward flowchart.

The Significance of Security
Modern applications store critical data such as user passwords, financial transactions, and personal information.
If your SQL Server isn't configured securely, attackers can exploit weaknesses like:

  • SQL injection
  • Privilege elevation
  • Information leakage
  • Unauthorized access

That's why database-level security must be a shared responsibility between developers and DBAs.


Security Layers in SQL Server
Before jumping into best practices, understand that SQL Server security has multiple layers :

  • Authentication: Who can access the server
  • Authorization: What actions they can perform
  • Encryption: How data is protected in transit and at rest
  • Auditing: Tracking who did what and when

Best Practices for Developers
Let’s break down the most essential security best practices step by step.

1. Use Parameterized Queries (Prevent SQL Injection)

Never concatenate user input directly in your SQL statements.

Vulnerable Example

string query = "SELECT * FROM Users WHERE Username = '" + userInput + "'";

Safe Example
SqlCommand cmd = new SqlCommand("SELECT * FROM Users WHERE Username = @Username", conn);
cmd.Parameters.AddWithValue("@Username", userInput);

Why: Parameterized queries ensure that input is treated as data, not executable SQL, preventing SQL injection attacks.

2. Follow the Principle of Least Privilege
Grant only the permissions required — nothing more.

Don’t

  • Use sa or system admin accounts for applications.
  • Give db_owner role to every user. 

Do

  • Create application-specific users with limited access.
  • Assign roles like db_datareader or db_datawriter as needed (see the sketch below).
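
A minimal sketch of an application-specific, least-privilege user (names and password are placeholders):

CREATE LOGIN AppLogin WITH PASSWORD = 'Use-A-Strong-Secret-Here!1';
CREATE USER AppUser FOR LOGIN AppLogin;

-- read/write data only; no schema or security changes
ALTER ROLE db_datareader ADD MEMBER AppUser;
ALTER ROLE db_datawriter ADD MEMBER AppUser;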

3. Secure Connection Strings
Never store connection strings in plain text inside your source code.
Use Configuration Files or Secrets Manager:

  • .NET Core: Store in appsettings.json and protect with User Secrets or Azure Key Vault.
  • Windows: Use DPAPI or Encrypted Configuration Sections.

Example
"ConnectionStrings": {
  "DefaultConnection": "Server=myServer;Database=myDB;User Id=appUser;Password=***;"
}


4. Encrypt Sensitive Data
Use SQL Server encryption features to protect confidential data.

Transparent Data Encryption (TDE)

Encrypts the database files (.mdf, .ldf) at rest.
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE MyServerCert;
ALTER DATABASE MyDB SET ENCRYPTION ON;
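
The snippet above assumes a server certificate (MyServerCert) already exists. A hedged sketch of the one-time prerequisites in the master database:

USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Use-A-Strong-Secret-Here!1';
CREATE CERTIFICATE MyServerCert WITH SUBJECT = 'TDE certificate';
-- back up the certificate and private key; without them the database cannot be restored elsewhere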


Column-Level Encryption
Encrypt specific columns, such as passwords or credit card numbers. With Always Encrypted you define a column master key (below) plus a column encryption key that is referenced in the column definition.

CREATE COLUMN MASTER KEY MyKey
WITH (
    KEY_STORE_PROVIDER_NAME = 'MSSQL_CERTIFICATE_STORE',
    KEY_PATH = 'CurrentUser/My/<certificate thumbprint>'
);

5. Avoid Hardcoding Credentials
Never hardcode usernames, passwords, or keys in your code.
Always use:

  • Environment variables
  • Secure configuration management
  • Secret stores (e.g., Azure Key Vault, AWS Secrets Manager)

6. Enable Row-Level Security (RLS)
Row-Level Security restricts data visibility based on user or role.

Example
CREATE SECURITY POLICY SalesFilter
ADD FILTER PREDICATE dbo.fnSecurityPredicate(UserID)
ON dbo.Sales WITH (STATE = ON);


This ensures each user can only see data they are authorized to view.
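
The policy above assumes the filter predicate function already exists. A minimal sketch, assuming the application stores the current user id in SESSION_CONTEXT:

CREATE FUNCTION dbo.fnSecurityPredicate (@UserID INT)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS AllowAccess
    WHERE @UserID = CAST(SESSION_CONTEXT(N'UserID') AS INT);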

7. Implement Data Masking
Use Dynamic Data Masking to hide sensitive information from unauthorized users.
ALTER TABLE Customers
ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

Result
Admin sees the full email: john.doe@company.com
Analyst sees the masked value: jXXX@XXXX.com


8. Regularly Patch and Update SQL Server
Always apply the latest SQL Server Service Packs and Cumulative Updates.
Outdated versions often contain known vulnerabilities that hackers exploit.

9. Use Secure Network Connections (SSL/TLS)
Enable encryption for data in transit.
In SQL Server Configuration Manager, set Force Encryption = Yes for the server's network protocols.


In the connection string:
Encrypt=True;TrustServerCertificate=False;

10. Audit and Monitor Database Activity
Enable SQL Server Audit to track actions such as login attempts, schema changes, or data access.
CREATE SERVER AUDIT Audit_LoginTracking
TO FILE (FILEPATH = 'C:\AuditLogs\')
WITH (ON_FAILURE = CONTINUE);

Then:
ALTER SERVER AUDIT Audit_LoginTracking WITH (STATE = ON);

You can later review logs to identify suspicious activities.
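
Note that a server audit only defines the output target; to actually capture events, attach an audit specification to it, for example:

CREATE SERVER AUDIT SPECIFICATION Spec_LoginTracking
FOR SERVER AUDIT Audit_LoginTracking
ADD (FAILED_LOGIN_GROUP),
ADD (SUCCESSFUL_LOGIN_GROUP)
WITH (STATE = ON);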

Flowchart: SQL Server Security Flow

Here’s a simple visualization of how SQL Server enforces security at multiple layers:

            ┌─────────────────────────┐
            │   User / Application    │
            └──────────┬──────────────┘
                       │
                       ▼
         ┌────────────────────────┐
         │ Authentication Layer   │
         │ (Login / Password / AD)│
         └─────────────┬──────────┘
                       │
                       ▼
         ┌────────────────────────┐
         │ Authorization Layer    │
         │ (Roles, Permissions)   │
         └─────────────┬──────────┘
                       │
                       ▼
         ┌─────────────────────────┐
         │ Row-Level / Data Access │
         │ (RLS, Masking, Filters) │
         └─────────────┬───────────┘
                       │
                       ▼
         ┌─────────────────────────┐
         │ Encryption Layer        │
         │ (TDE, SSL, Column Key)  │
         └─────────────┬───────────┘
                       │
                       ▼
         ┌─────────────────────────┐
         │ Auditing & Monitoring   │
         │ (Logs, Alerts, Reports) │
         └─────────────────────────┘

This layered approach ensures defense at every step of the data access process.

Final Checklist for Developers

Security Area | Best Practice | Example
Input Handling | Parameterized queries | @param
Access Control | Least privilege | Limited roles
Data Protection | Encryption & masking | TDE / AES
Secrets Management | No hardcoded credentials | Azure Key Vault
Monitoring | SQL Server Audit | Audit logs


Conclusion

Database security must be built into the application from the start, by developers and not only by DBAs.

Together, authentication, authorization, encryption, and auditing make up a safe SQL Server configuration.
Remember:
"Performance problems can harm your app, but security problems can ruin your company."

So make SQL Server security a habit rather than a checklist by following these practices consistently.

HostForLIFEASP.NET SQL Server 2022 Hosting



About HostForLIFE

HostForLIFE is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We have offered the latest Windows 2019 Hosting, ASP.NET 5 Hosting, ASP.NET MVC 6 Hosting and SQL 2019 Hosting.

