
European MySQL Hosting - HostForLIFE.eu :: Reporting Progress Using MySqlBackup.NET

clock July 28, 2025 08:03 by author Peter

In this tutorial, we'll show you how to use MySqlBackup.NET to build real-time progress reports for database backup and restore operations. Whether you're creating a web interface or a desktop program, this walkthrough will help you build polished progress indicators that keep your users informed at every stage of the process.

What is MySqlBackup.NET?
MySqlBackup.NET is an open-source C# library for backing up and restoring MySQL databases. Find out more at its GitHub repository.

Live Demo
Before we dive into the code, check out this visual demonstration of what we’ll be building:

This video showcases the real-time progress reporting system in action, complete with multiple beautiful CSS themes that you can customize for your application.

An Overview of the Export Progress Lifecycle
Several things happen concurrently during an export:


Understanding the Foundation
In the export process, MySqlBackup.NET provides the progress status through the event mb.ExportProgressChanged:
void Backup()
{
    string fileBackup = Server.MapPath("~/backup.sql");

    using (var conn = new MySqlConnection(constr))
    using (var cmd = conn.CreateCommand())
    using (var mb = new MySqlBackup(cmd))
    {
        conn.Open();
        mb.ExportInfo.IntervalForProgressReport = 200; // 5 updates per second
        mb.ExportProgressChanged += Mb_ExportProgressChanged;
        mb.ExportToFile(fileBackup);

    }
}

private void Mb_ExportProgressChanged(object sender, ExportProgressArgs e)
{
    long totalRows = e.TotalRowsInAllTables;
    long currentRow = e.CurrentRowIndexInAllTables;
    int percent = CalculatePercent(totalRows, currentRow);
}
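CalculatePercent is a helper you write yourself; the same integer math is shown in full C# later in this article. Here is that math sketched in JavaScript, guarding against unknown totals and clamping at 100:

```javascript
// Percent-complete helper mirroring the integer math used later in the article.
// Returns 0 while totals are unknown, clamps at 100 when the current row
// index reaches (or passes) the total.
function calculatePercent(totalRows, currentRow) {
    if (totalRows <= 0 || currentRow <= 0) return 0;
    if (currentRow >= totalRows) return 100;
    return Math.floor(currentRow * 100 / totalRows);
}
```

For example, `calculatePercent(10000, 5000)` yields 50.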

Notice that the above code runs on a single thread (the main thread). To report progress while the export runs, we need to move the main MySqlBackup.NET work onto a secondary thread, i.e. run it asynchronously in C#.

Dedicate the export process to another thread (asynchronously) by using System.Threading.Tasks.Task:
void Backup()
{
    // Asynchronous:
    // the export process is split off onto another (background) thread,
    // firing the task without waiting for it
    _ = Task.Run(() => { BeginBackup(); });
}

void BeginBackup()
{
    string fileBackup = Server.MapPath("~/backup.sql");

    using (var conn = new MySqlConnection(constr))
    using (var cmd = conn.CreateCommand())
    using (var mb = new MySqlBackup(cmd))
    {
        conn.Open();
        mb.ExportInfo.IntervalForProgressReport = 200; // 5 updates per second
        mb.ExportProgressChanged += Mb_ExportProgressChanged;
        mb.ExportToFile(fileBackup);
    }
}


Next, create the intermediary caching layer.

Write a simple class to hold the values:

public class TaskInfo
{
    public bool IsComplete { get; set;} = false;
    public int Percent { get; set; } = 0;
    public long TotalRows { get; set; } = 0L;
    public long CurrentRow { get; set; } = 0L;
}


Declare a global static ConcurrentDictionary to hold instances of this class. ConcurrentDictionary is a specialized thread-safe collection: it allows multiple threads to safely read and write values concurrently.

We mark it static so that the cached values can be accessed by all threads (all requests) across the whole application.
public class api
{
    // accessible by any thread concurrently
    static ConcurrentDictionary<int, TaskInfo> dicTask = new ConcurrentDictionary<int, TaskInfo>();

    void Backup()
    {
        _ = Task.Run(() => { BeginBackup(); });
    }

    void BeginBackup()
    {
        ...
    }

    private void Mb_ExportProgressChanged(object sender, ExportProgressArgs e)
    {
        ...
    }
}

Next, we are going to receive the progress status values provided by MySqlBackup.NET:
public class api
{
    static ConcurrentDictionary<int, TaskInfo> dicTask = new ConcurrentDictionary<int, TaskInfo>();

    void Backup()
    {
        // get a new taskid, can be a random number
        int newTaskId = GetNewTaskId();

        _ = Task.Run(() => { BeginBackup(newTaskId); });
    }

    // caching the taskid in this thread
    int taskid = 0;

    void BeginBackup(int newTaskId)
    {
        // cache the task id
        taskid = newTaskId;

        string fileBackup = Server.MapPath("~/backup.sql");

        // Create a new task info to collect data
        TaskInfo taskInfo = new TaskInfo();

        // Cache the task info into the global dictionary
        dicTask[newTaskId] = taskInfo;

        using (var conn = new MySqlConnection(constr))
        using (var cmd = conn.CreateCommand())
        using (var mb = new MySqlBackup(cmd))
        {
            conn.Open();
            mb.ExportInfo.IntervalForProgressReport = 200; // 5 updates per second
            mb.ExportProgressChanged += Mb_ExportProgressChanged;
            mb.ExportToFile(fileBackup);
        }

        // Past this point, the export process has completed

        // Inform the frontend about the completion
        taskInfo.IsComplete = true;
    }

    private void Mb_ExportProgressChanged(object sender, ExportProgressArgs e)
    {
        // Retrieve the task info belonging to this task id
        if (dicTask.TryGetValue(taskid, out var taskInfo))
        {
            // Collect the data provided by the export process
            taskInfo.TotalRows = e.TotalRowsInAllTables;
            taskInfo.CurrentRow = e.CurrentRowIndexInAllTables;
            taskInfo.Percent = CalculatePercent(taskInfo.TotalRows, taskInfo.CurrentRow);
        }
    }
}
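GetNewTaskId is another helper left to you; any scheme that avoids colliding with ids already cached in the dictionary will do. Here is one possible approach sketched in JavaScript, with a Map standing in for the ConcurrentDictionary (the random-retry strategy is an assumption, not something MySqlBackup.NET prescribes):

```javascript
const dicTask = new Map(); // stands in for the C# ConcurrentDictionary

// Draw random positive integers until one is found that is not already
// in use as a task id.
function getNewTaskId() {
    let id;
    do {
        id = Math.floor(Math.random() * 1_000_000_000) + 1;
    } while (dicTask.has(id));
    return id;
}
```

The retry loop guarantees uniqueness against currently cached tasks; a simple incrementing counter would work just as well.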

Now, let’s integrate this logic into the application.
In this article, we’ll be using ASP.NET Web Forms as our demonstration platform to show how progress reporting integrates seamlessly with web applications.

In this demo of ASP.NET Web Forms, we’ll be using:

  • ZERO VIEWSTATE  – The secret to modern Web Forms
  • Zero server control
  • No custom user control
  • No postback
  • No UpdatePanel
  • Pure HTML, CSS, and JavaScript / FetchApi

Important Note: Once you understand the core working mechanism and patterns demonstrated in this walkthrough, you can easily replicate these same ideas to build your own UI progress reporting system in any application framework. Whether you’re developing with:

  • Windows Forms (WinForms) applications
  • .NET Core web applications
  • ASP.NET MVC projects
  • Blazor applications
  • Mono cross-platform apps
  • Console applications with custom UI
  • WPF desktop applications

The fundamental principles remain the same: create an intermediary caching layer, implement the two-thread pattern, and provide real-time UI updates. The specific implementation details will vary by framework, but the core architecture and MySqlBackup.NET integration patterns are universally applicable.

Let’s begin by building our backend API to handle the backup and restore logic. We’ll create a new blank ASP.NET Web Forms page. A typical new blank web form page will look something like this:
<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="apiBackup.aspx.cs" Inherits="myweb.apiBackup" %>

<!DOCTYPE html>

<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
    <title></title>
</head>
<body>
    <form id="form1" runat="server">
        <div>
        </div>
    </form>
</body>
</html>


Delete everything from the frontend markup, leaving only the page directive declaration at the top:
<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="apiBackup.aspx.cs" Inherits="myweb.apiBackup" %>

Now, go to the code-behind, which typically looks like this:
namespace myweb
{
    public partial class apiBackup : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {

        }
    }
}


Patch the code that we’ve written previously to the backend of this web form page:
// backend C#

public partial class apiBackup : System.Web.UI.Page
{
    string constr = "server=localhost;user=root;pwd=1234;database=mydatabase;";

    static ConcurrentDictionary<int, TaskInfo> dicTask = new ConcurrentDictionary<int, TaskInfo>();

    protected void Page_Load(object sender, EventArgs e)
    {
        // get the action for the request from the frontend
        string action = Request["action"] + "";

        switch (action)
        {
            case "backup":
                Backup();
                break;
        }
    }

    int taskid = 0;

    void Backup()
    {
        int newTaskId = GetNewTaskId();

        // run the task in the background
        // no waiting
        _ = Task.Run(() => { BeginBackup(newTaskId); });

        // immediately tell the task id to the frontend
        Response.Write(newTaskId.ToString());
    }

    void BeginBackup(int newTaskId)
    {
        taskid = newTaskId;

        string fileBackup = Server.MapPath("~/backup.sql");

        TaskInfo taskInfo = new TaskInfo();

        dicTask[newTaskId] = taskInfo;

        using (var conn = new MySqlConnection(constr))
        using (var cmd = conn.CreateCommand())
        using (var mb = new MySqlBackup(cmd))
        {
            conn.Open();
            mb.ExportInfo.IntervalForProgressReport = 200; // 5 updates per second
            mb.ExportProgressChanged += Mb_ExportProgressChanged;
            mb.ExportToFile(fileBackup);
        }

        taskInfo.IsComplete = true;
    }

    private void Mb_ExportProgressChanged(object sender, ExportProgressArgs e)
    {
        if (dicTask.TryGetValue(taskid, out var taskInfo))
        {
            taskInfo.TotalRows = e.TotalRowsInAllTables;
            taskInfo.CurrentRow = e.CurrentRowIndexInAllTables;
            taskInfo.Percent = CalculatePercent(taskInfo.TotalRows, taskInfo.CurrentRow);
        }
    }
}


Now we have built an API endpoint that allows the frontend to tell the server to start a backup task.

There are two main ways the frontend can do this:

  • Get Request or
  • Post Request

Using Get Request:
// frontend javascript

// cache the task id globally at the frontend
let taskid = 0;

async function backup() {
    const response = await fetch('/api?action=backup', {
        method: 'GET',
        credentials: 'include'
    });

    // obtain the task id returned from the backend

    // convert the return response into text
    const responseText = await response.text();

    // convert the text into number (task id)
    taskid = parseInt(responseText);
}


Using Post Request:
// frontend javascript

async function backup() {

    const formData = new FormData();
    formData.append('action', 'backup');

    const response = await fetch('/api', {
        method: 'POST',
        body: formData,
        credentials: 'include'
    });

    // obtain the task id returned from the backend

    // convert the return response into text
    const responseText = await response.text();

    // convert the text into number (task id)
    taskid = parseInt(responseText);
}

In this context, a POST request is recommended, since the parameters will not be exposed in the query string or in the server logs.

Next, at the frontend, write a new function getStatus() to get the progress status from the backend:
// frontend javascript

async function getStatus() {

    const formData = new FormData();
    formData.append('action', 'getstatus');
    formData.append('taskid', taskid);

    const response = await fetch('/api', {
        method: 'POST',
        body: formData,
        credentials: 'include'
    });

    // response = progress status
}

Now, back to the backend api, handle a new action for “getstatus”:
protected void Page_Load(object sender, EventArgs e)
{
    string action = Request["action"] + "";

    switch (action)
    {
        case "backup":
            Backup();
            break;
        case "getstatus":
            GetStatus();
            break;
    }
}

void GetStatus()
{
    string requestTaskId = Request["taskid"] + "";

    int _taskid = Convert.ToInt32(requestTaskId);

    // get the task info from the global dictionary
    if (dicTask.TryGetValue(_taskid, out var taskInfo))
    {
        // convert the class object into json string

        // Install the NuGet package "System.Text.Json" to enable this function
        // You also need to add a 'using' directive at the top of your code:
        // using System.Text.Json;

        string json = JsonSerializer.Serialize(taskInfo);
        Response.ContentType = "application/json";

        // send the progress status back to the frontend
        Response.Write(json);
    }
}


Example of responding text (JSON):
// json

{
  "IsComplete": false,
  "Percent": 50,
  "TotalRows": 10000,
  "CurrentRow": 5000
}

Go back to the frontend and continue by handling the response:
// frontend javascript

async function getStatus() {
    const formData = new FormData();
    formData.append('action', 'getstatus');
    formData.append('taskid', taskid);

    const response = await fetch('/api', {
        method: 'POST',
        body: formData,
        credentials: 'include'
    });

    // Convert the response to text
    let result = await response.text();

    // Convert the result to JSON object
    let jsonObject = JSON.parse(result);

    // You can now work with progress value:
    document.getElementById("spanIsComplete").textContent  = jsonObject.IsComplete;
    document.getElementById("spanPercent").textContent  = jsonObject.Percent + "%";
    document.getElementById("spanTotalRows").textContent  = jsonObject.TotalRows;
    document.getElementById("spanCurrentRow").textContent  = jsonObject.CurrentRow;
}

At the frontend, you will have the following HTML span containers for the above JavaScript to fill in:
// frontend html

<p>Is Complete: <span id="spanIsComplete"></span></p>
<p>Percent: <span id="spanPercent"></span></p>
<p>Total Rows: <span id="spanTotalRows"></span></p>
<p>Current Row: <span id="spanCurrentRow"></span></p>

Now, we’ll use an interval timer to repeatedly call the function getStatus() every 200ms to continuously update the frontend UI.

// frontend javascript

let intervalTimer = null;

// Start monitoring
function startMonitoring() {
    stopMonitoring();
    intervalTimer = setInterval(
        () => getStatus(),
        200);
}

// Stop monitoring
function stopMonitoring() {
    if (intervalTimer) {

        // stop the timer
        clearInterval(intervalTimer);

        // remove the timer reference
        intervalTimer = null;
    }
}

Now, after the backup is successfully called to work, we’ll begin monitoring the progress by calling the interval timer:
// frontend javascript

async function backup() {

    const formData = new FormData();
    formData.append('action', 'backup');

    const response = await fetch('/api', {
        method: 'POST',
        body: formData,
        credentials: 'include'
    });

    const responseText = await response.text();

    taskid = parseInt(responseText);

    // by this point the backup task has already begun
    // start monitoring the progress status
    startMonitoring();
}

Now, the interval timer fires 5 times per second… until the task is completed.
How do we stop it when the task is completed? By calling the function:
stopMonitoring();

Where is a good place to stop the timer?
Want to pause for a moment and think?
Yes: right after the value is returned from the backend, where we can check the completion status. If the task is completed, simply stop the timer:
// frontend javascript

async function getStatus() {
    const formData = new FormData();
    formData.append('action', 'getstatus');
    formData.append('taskid', taskid);

    const response = await fetch('/api', {
        method: 'POST',
        body: formData,
        credentials: 'include'
    });

    let result = await response.text();

    let jsonObject = JSON.parse(result);

    document.getElementById("spanIsComplete").textContent  = jsonObject.IsComplete;
    document.getElementById("spanPercent").textContent  = jsonObject.Percent + "%";
    document.getElementById("spanTotalRows").textContent  = jsonObject.TotalRows;
    document.getElementById("spanCurrentRow").textContent  = jsonObject.CurrentRow;

    // now check for the completion status here
    if (jsonObject.IsComplete) {
        // yes, it's completed, call the timer stop now
        stopMonitoring();
    }
}


Now, before we proceed to the full walkthrough section, let’s add one more function.

What if we want to stop the export before it gets finished?

Let’s create another javascript function to call the backend to stop the process:
// frontend javascript

async function stopTask() {

    const formData = new FormData();
    formData.append("action", "stoptask");
    formData.append("taskid", taskid);

    // sending stop request to backend
    const response = await fetch('/api', {
        method: 'POST',
        body: formData,
        credentials: 'include'
    });
}


Back to C# backend api endpoint.

First edit the task info object to handle the stop request:
// backend C#

public class TaskInfo
{
    public bool IsComplete { get; set;} = false;
    public int Percent { get; set; } = 0;
    public long TotalRows { get; set; } = 0L;
    public long CurrentRow { get; set; } = 0L;

    // add a boolean flag for marking the task to be cancelled
    public bool RequestCancel { get; set; } = false;
}

Let's handle the new stop request:
// backend C#

protected void Page_Load(object sender, EventArgs e)
{
    string action = Request["action"] + "";

    switch (action)
    {
        case "backup":
            Backup();
            break;
        case "getstatus":
            GetStatus();
            break;

        // handle stop request
        case "stoptask":
            Stop();
            break;
    }
}

void Stop()
{
    string requestTaskId = Request["taskid"] + "";

    int _taskid = Convert.ToInt32(requestTaskId);

    // get the task info
    if (dicTask.TryGetValue(_taskid, out var taskInfo))
    {
        // set the boolean flag to signal the request to cancel task
        taskInfo.RequestCancel = true;
    }
}


As you can see, handling the stop request does not immediately stop the process. It only sets a boolean flag. We cannot directly order MySqlBackup to stop from here; instead, we set the flag to notify MySqlBackup.NET that it should stop.

According to the interval set up earlier, MySqlBackup.NET raises the progress event every 200ms; that is the moment we have the opportunity to communicate with the export process.
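This flag-based, cooperative cancellation can be sketched framework-free: the worker checks a shared flag at every progress tick and stops itself, just as the export handler will do with StopAllProcess(). The names below are illustrative:

```javascript
// Cooperative cancellation: the caller never kills the worker directly,
// it only flips a flag that the worker checks at each progress tick.
function runCancellableWork(totalSteps, task) {
    for (let step = 1; step <= totalSteps; step++) {
        if (task.requestCancel) {       // checked at every "progress report"
            task.isCancelled = true;
            return step - 1;            // number of steps actually completed
        }
        task.currentStep = step;        // ...real work would happen here
    }
    task.isComplete = true;
    return totalSteps;
}
```

Flipping `requestCancel` from outside (e.g. from the stop endpoint) makes the worker exit at its next tick.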

Here is the code that we have already written in the previous section. At the very end of the progress handler, we perform the real stopping action:
// backend C#
void BeginBackup(int newTaskId)
{
    taskid = newTaskId;

    string fileBackup = Server.MapPath("~/backup.sql");

    TaskInfo taskInfo = new TaskInfo();

    dicTask[newTaskId] = taskInfo;

    using (var conn = new MySqlConnection(constr))
    using (var cmd = conn.CreateCommand())
    using (var mb = new MySqlBackup(cmd))
    {
        conn.Open();
        mb.ExportInfo.IntervalForProgressReport = 200; // 5 updates per second
        mb.ExportProgressChanged += Mb_ExportProgressChanged;
        mb.ExportToFile(fileBackup);
    }

    taskInfo.IsComplete = true;
}

private void Mb_ExportProgressChanged(object sender, ExportProgressArgs e)
{
    if (dicTask.TryGetValue(taskid, out var taskInfo))
    {
        taskInfo.TotalRows = e.TotalRowsInAllTables;
        taskInfo.CurrentRow = e.CurrentRowIndexInAllTables;
        taskInfo.Percent = CalculatePercent(taskInfo.TotalRows, taskInfo.CurrentRow);

        // MySqlBackup.NET raises this event every 200ms
        // Here is the opportunity for us to talk to MySqlBackup

        // Check the cancellation boolean flag
        if (taskInfo.RequestCancel)
        {
            // Now, here we directly tell MySqlBackup to stop
            // all its inner processes.
            // Do the real "stop" here
            ((MySqlBackup)sender).StopAllProcess();

            // Note: When a task is cancelled, you may want to clean up the partially created file
            // This will be handled in the full implementation section
        }
    }
}

Above is the basic walkthrough.

If you’ve followed along this far, congratulations! You now understand the core concepts. What follows is the production-ready implementation with all the bells and whistles – error handling, cancellation, file management, and more. Don’t worry if it looks complex – it’s built on the same foundation you just learned.
Full Walkthrough

MySqlBackup.NET provides two essential classes that make progress reporting possible:

1. ExportProgressArgs.cs

This class provides detailed information during backup operations, including:

  • Current table being processed
  • Total number of tables
  • Row counts (current and total)
  • Progress percentages

2. ImportProgressArgs.cs
This class tracks restore operations with:

  • Bytes processed vs. total bytes
  • Import completion percentage
  • Real-time progress updates

Getting Started: Basic Progress Events
The magic happens when you subscribe to progress change events. Here’s how to set up the fundamental progress reporting:

Backup Progress Monitoring

void Backup()
{
    using (var conn = new MySqlConnection(connectionString))
    using (var cmd = conn.CreateCommand())
    using (var mb = new MySqlBackup(cmd))
    {
        conn.Open();

        // Set update frequency (100ms = 10 updates per second)
        mb.ExportInfo.IntervalForProgressReport = 100;

        // Subscribe to progress events
        mb.ExportProgressChanged += Mb_ExportProgressChanged;

        mb.ExportToFile(filePathSql);
    }
}

private void Mb_ExportProgressChanged(object sender, ExportProgressArgs e)
{
    // Rich progress information available
    string currentTable = e.CurrentTableName;
    int totalTables = e.TotalTables;
    int currentTableIndex = e.CurrentTableIndex;
    long totalRows = e.TotalRowsInAllTables;
    long currentRows = e.CurrentRowIndexInAllTables;
    long currentTableRows = e.TotalRowsInCurrentTable;
    long currentTableProgress = e.CurrentRowIndexInCurrentTable;

    // Calculate completion percentage
    int percentCompleted = 0;
    if (e.CurrentRowIndexInAllTables > 0L && e.TotalRowsInAllTables > 0L)
    {
        if (e.CurrentRowIndexInAllTables >= e.TotalRowsInAllTables)
        {
            percentCompleted = 100;
        }
        else
        {
            percentCompleted = (int)(e.CurrentRowIndexInAllTables * 100L / e.TotalRowsInAllTables);
        }
    }
}


Restore Progress Monitoring
void Restore()
{
    using (var conn = config.GetNewConnection())
    using (var cmd = conn.CreateCommand())
    using (var mb = new MySqlBackup(cmd))
    {
        conn.Open();

        // Set update frequency
        mb.ImportInfo.IntervalForProgressReport = 100;

        // Subscribe to progress events
        mb.ImportProgressChanged += Mb_ImportProgressChanged;

        mb.ImportFromFile(filePathSql);
    }
}

private void Mb_ImportProgressChanged(object sender, ImportProgressArgs e)
{
    // Byte-based progress tracking
    long totalBytes = e.TotalBytes;
    long currentBytes = e.CurrentBytes;

    // Calculate completion percentage
    int percentCompleted = 0;
    if (e.CurrentBytes > 0L && e.TotalBytes > 0L)
    {
        if (e.CurrentBytes >= e.TotalBytes)
        {
            percentCompleted = 100;
        }
        else
        {
            percentCompleted = (int)(e.CurrentBytes * 100L / e.TotalBytes);
        }
    }
}


Choosing the Right Update Frequency
The update interval significantly impacts your application’s responsiveness:
// Configure update frequency based on your needs:

// Desktop/Local Applications (low latency)
mb.ExportInfo.IntervalForProgressReport = 100;  // 10 updates/second
mb.ImportInfo.IntervalForProgressReport = 250;  // 4 updates/second

// Web Applications (higher latency)
mb.ExportInfo.IntervalForProgressReport = 500;  // 2 updates/second
mb.ImportInfo.IntervalForProgressReport = 1000; // 1 update/second


Performance Guidelines:

  • 100ms: Perfect for desktop applications – very responsive
  • 250ms: Great balance for most applications
  • 500ms: Ideal for web applications with moderate traffic
  • 1000ms: Best for high-traffic web applications

Architecture: The Two-Thread Pattern
Understanding the architecture is crucial for building robust progress reporting:

MySqlBackup.NET Internal Threads

  1. Main Process Thread: Executes actual backup/restore operations
  2. Progress Reporting Thread: Periodically reports progress via timer events

Your Application Threads

  • Backend Thread: Handles MySqlBackup.NET operations and caches progress data
  • UI Thread: Retrieves cached data and updates the user interface
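This two-party pattern can be reduced to a small framework-free sketch: a background task writes its progress into a shared cache, while an independent poller reads the cache until completion. All names here are illustrative:

```javascript
// Minimal two-party pattern: a worker writes progress into a shared cache,
// a poller reads it back, never touching the worker directly.
const progressCache = new Map();

async function worker(taskId, totalSteps) {
    progressCache.set(taskId, { percent: 0, isComplete: false });
    for (let i = 1; i <= totalSteps; i++) {
        await new Promise(r => setTimeout(r, 1));   // simulated unit of work
        progressCache.get(taskId).percent = Math.floor(i * 100 / totalSteps);
    }
    progressCache.get(taskId).isComplete = true;    // set after the last update
}

async function pollUntilDone(taskId) {
    const seen = [];
    while (true) {
        const s = progressCache.get(taskId);
        if (s) seen.push(s.percent);
        if (s && s.isComplete) return seen;         // stop polling on completion
        await new Promise(r => setTimeout(r, 5));   // poll interval
    }
}
```

The worker is fired without awaiting it (the "fire and forget" pattern), and the poller is the only party the UI ever talks to.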

Creating the Progress Data Model
In your application, you need to write your own class to hold the progress values on your side. Here’s a comprehensive class to cache all progress information:
class ProgressReportTask
{
    public int ApiCallIndex { get; set; }
    public int TaskId { get; set; }
    public int TaskType { get; set; }  // 1 = backup, 2 = restore
    public string FileName { get; set; }
    public string SHA256 { get; set; }

    // Task status tracking
    public bool IsStarted { get; set; }
    public bool IsCompleted { get; set; } = false;
    public bool IsCancelled { get; set; } = false;
    public bool RequestCancel { get; set; } = false;
    public bool HasError { get; set; } = false;
    public string ErrorMsg { get; set; } = "";

    // Time tracking
    public DateTime TimeStart { get; set; } = DateTime.MinValue;
    public DateTime TimeEnd { get; set; } = DateTime.MinValue;
    public TimeSpan TimeUsed { get; set; } = TimeSpan.Zero;

    // Backup-specific progress data
    public int TotalTables { get; set; } = 0;
    public int CurrentTableIndex { get; set; } = 0;
    public string CurrentTableName { get; set; } = "";
    public long TotalRowsCurrentTable { get; set; } = 0;
    public long CurrentRowCurrentTable { get; set; } = 0;
    public long TotalRows { get; set; } = 0;
    public long CurrentRowIndex { get; set; } = 0;

    // Restore-specific progress data
    public long TotalBytes { get; set; } = 0;
    public long CurrentBytes { get; set; } = 0;

    public int PercentCompleted { get; set; } = 0;

    // UI-friendly properties
    [JsonPropertyName("TaskTypeName")]
    public string TaskTypeName
    {
        get
        {
            return TaskType switch
            {
                1 => "Backup",
                2 => "Restore",
                _ => "Unknown"
            };
        }
    }

    [JsonPropertyName("TimeStartDisplay")]
    public string TimeStartDisplay => TimeStart.ToString("yyyy-MM-dd HH:mm:ss");

    [JsonPropertyName("TimeEndDisplay")]
    public string TimeEndDisplay => TimeEnd.ToString("yyyy-MM-dd HH:mm:ss");

    [JsonPropertyName("TimeUsedDisplay")]
    public string TimeUsedDisplay => TimeDisplayHelper.TimeSpanToString(TimeUsed);

    [JsonPropertyName("FileDownloadUrl")]
    public string FileDownloadUrl
    {
        get
        {
            if (!string.IsNullOrEmpty(FileName))
            {
                return $"/apiFiles?folder=backup&filename={FileName}";
            }
            return "";
        }
    }
}
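TimeDisplayHelper.TimeSpanToString is a helper you supply yourself; the article does not show it. A minimal duration formatter sketched in JavaScript, assuming a compact "1h 2m 3s" style of output is wanted:

```javascript
// Format a duration (in whole seconds) as a compact "1h 2m 3s" string,
// omitting leading units that are zero.
function timeSpanToString(totalSeconds) {
    const h = Math.floor(totalSeconds / 3600);
    const m = Math.floor((totalSeconds % 3600) / 60);
    const s = totalSeconds % 60;
    const parts = [];
    if (h > 0) parts.push(h + "h");
    if (m > 0) parts.push(m + "m");
    parts.push(s + "s");
    return parts.join(" ");
}
```

For example, 3661 seconds formats as "1h 1m 1s".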


Building the User Interface – Framework Agnostic Approach
Create a new blank ASP.NET Web Forms page and delete everything from the frontend markup, leaving only the page directive declaration at the top:
<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="apiProgressReport.aspx.cs" Inherits="System.pages.apiProgressReport" %>

Route this page to /apiProgressReport using your preferred routing method.

Main API Controller

// Thread-safe progress cache
static ConcurrentDictionary<int, ProgressReportTask> dicTask = new ConcurrentDictionary<int, ProgressReportTask>();

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsUserAuthorized())
    {
        Response.StatusCode = 401;
        Response.Write("0|Unauthorized access");
        Response.End();
        return;
    }

    string action = (Request["action"] + "").ToLower();

    switch (action)
    {
        case "backup":
            Backup();
            break;
        case "restore":
            Restore();
            break;
        case "stoptask":
            StopTask();
            break;
        case "gettaskstatus":
            GetTaskStatus();
            break;
        default:
            Response.StatusCode = 400;
            Response.Write("0|Invalid action");
            break;
    }
}

bool IsUserAuthorized()
{
    // Implement your authentication logic here
    // Check user login and backup permissions
    return true; // Simplified for demo
}


With this setup, the frontend can access the API in two main ways.

Example 1: using a query string (a GET request), simple and direct:
/apiProgressReport?action=backup
/apiProgressReport?action=restore
/apiProgressReport?action=stoptask
/apiProgressReport?action=gettaskstatus

Example 2: using a POST request through the Fetch API (or XMLHttpRequest). In a nutshell, the JavaScript Fetch API call will look something like this:
// Javascript

// backup
const formData = new FormData();
formData.append('action', 'backup');
const response = await fetch('/apiProgressReport', {
    method: 'POST',
    body: formData,
    credentials: 'include'
});

// restore (run separately from the backup snippet above)
const formData2 = new FormData();
formData2.append('action', 'restore');
// attach and send the file to the server
formData2.append('fileRestore', fileRestore.files[0]);
const response2 = await fetch('/apiProgressReport', {
    method: 'POST',
    body: formData2,
    credentials: 'include'
});

Backup Implementation
void Backup()
{
    var taskId = GetNewTaskId();

    // Start task asynchronously (fire and forget pattern)
    _ = Task.Run(() => { BeginExport(taskId); });

    // Immediately return task ID to frontend
    Response.Write(taskId.ToString());
}

int thisTaskId = 0;

void BeginExport(int newTaskId)
{
    thisTaskId = newTaskId;

    ProgressReportTask task = new ProgressReportTask()
    {
        TaskId = newTaskId,
        TaskType = 1, // backup
        TimeStart = DateTime.Now,
        IsStarted = true
    };

    dicTask[newTaskId] = task;

    try
    {
        string folder = Server.MapPath("~/App_Data/backup");
        Directory.CreateDirectory(folder);

        string fileName = $"backup-{DateTime.Now:yyyy-MM-dd_HHmmss}.sql";
        string filePath = Path.Combine(folder, fileName);

        using (var conn = config.GetNewConnection())
        using (var cmd = conn.CreateCommand())
        using (var mb = new MySqlBackup(cmd))
        {
            conn.Open();
            mb.ExportInfo.IntervalForProgressReport = 100;
            mb.ExportProgressChanged += Mb_ExportProgressChanged;
            mb.ExportToFile(filePath);
        }

        // Handle cancellation or completion
        if (task.RequestCancel)
        {
            task.IsCancelled = true;
            try
            {
                if (File.Exists(filePath))
                    File.Delete(filePath);
            }
            catch { }
        }
        else
        {
            // Successfully completed
            task.FileName = fileName;
            task.SHA256 = Sha256.Compute(filePath);
        }

        task.TimeEnd = DateTime.Now;
        task.TimeUsed = DateTime.Now - task.TimeStart;
        task.IsCompleted = true;
    }
    catch (Exception ex)
    {
        task.HasError = true;
        task.ErrorMsg = ex.Message;
        task.TimeEnd = DateTime.Now;
        task.IsCompleted = true;
    }
}

private void Mb_ExportProgressChanged(object sender, ExportProgressArgs e)
{
    if (dicTask.TryGetValue(thisTaskId, out var task))
    {
        // Update all progress information
        task.CurrentTableName = e.CurrentTableName;
        task.TotalTables = e.TotalTables;
        task.CurrentTableIndex = e.CurrentTableIndex;
        task.TotalRows = e.TotalRowsInAllTables;
        task.CurrentRowIndex = e.CurrentRowIndexInAllTables;
        task.TotalRowsCurrentTable = e.TotalRowsInCurrentTable;
        task.CurrentRowCurrentTable = e.CurrentRowIndexInCurrentTable;

        // Calculate percentage
        if (e.CurrentRowIndexInAllTables > 0L && e.TotalRowsInAllTables > 0L)
        {
            if (e.CurrentRowIndexInAllTables >= e.TotalRowsInAllTables)
            {
                task.PercentCompleted = 100;
            }
            else
            {
                task.PercentCompleted = (int)(e.CurrentRowIndexInAllTables * 100L / e.TotalRowsInAllTables);
            }
        }
        else
        {
            task.PercentCompleted = 0;
        }

        // Handle cancellation requests
        if (task.RequestCancel)
        {
            ((MySqlBackup)sender).StopAllProcess();
        }
    }
}

Restore Implementation
void Restore()
{
    var taskId = GetNewTaskId();

    ProgressReportTask task = new ProgressReportTask()
    {
        TaskId = taskId,
        TaskType = 2, // restore
        TimeStart = DateTime.Now,
        IsStarted = true
    };

    dicTask[taskId] = task;

    // Validate uploaded file
    if (Request.Files.Count == 0)
    {
        task.IsCompleted = true;
        task.HasError = true;
        task.ErrorMsg = "No file uploaded";
        task.TimeEnd = DateTime.Now;
        return;
    }

    string fileExtension = Request.Files[0].FileName.ToLower().Trim();
    if (!fileExtension.EndsWith(".zip") && !fileExtension.EndsWith(".sql"))
    {
        task.IsCompleted = true;
        task.HasError = true;
        task.ErrorMsg = "Invalid file type. Only .zip or .sql files are supported.";
        task.TimeEnd = DateTime.Now;
        return;
    }

    // Save uploaded file
    string folder = Server.MapPath("~/App_Data/backup");
    Directory.CreateDirectory(folder);
    string fileName = $"restore-{DateTime.Now:yyyy-MM-dd_HHmmss}";
    string filePath = Path.Combine(folder, fileName + ".sql");

    if (fileExtension.EndsWith(".zip"))
    {
        string zipPath = Path.Combine(folder, fileName + ".zip");
        Request.Files[0].SaveAs(zipPath);
        ZipHelper.ExtractFile(zipPath, filePath);
        task.FileName = fileName + ".zip";
    }
    else
    {
        Request.Files[0].SaveAs(filePath);
        task.FileName = fileName + ".sql";
    }

    // Start restore process asynchronously
    _ = Task.Run(() => { BeginRestore(taskId, filePath); });

    Response.Write(taskId.ToString());
}

void BeginRestore(int newTaskId, string filePath)
{
    thisTaskId = newTaskId;

    if (dicTask.TryGetValue(thisTaskId, out ProgressReportTask task))
    {
        try
        {
            task.FileName = Path.GetFileName(filePath);
            task.SHA256 = Sha256.Compute(filePath);

            using (var conn = config.GetNewConnection())
            using (var cmd = conn.CreateCommand())
            using (var mb = new MySqlBackup(cmd))
            {
                conn.Open();
                mb.ImportInfo.IntervalForProgressReport = 100;
                mb.ImportProgressChanged += Mb_ImportProgressChanged;
                mb.ImportFromFile(filePath);
            }

            if (task.RequestCancel)
            {
                task.IsCancelled = true;
            }

            task.TimeEnd = DateTime.Now;
            task.TimeUsed = DateTime.Now - task.TimeStart;
            task.IsCompleted = true;
        }
        catch (Exception ex)
        {
            task.HasError = true;
            task.ErrorMsg = ex.Message;
            task.TimeEnd = DateTime.Now;
            task.TimeUsed = DateTime.Now - task.TimeStart;
            task.IsCompleted = true;
        }
    }
}

private void Mb_ImportProgressChanged(object sender, ImportProgressArgs e)
{
    if (dicTask.TryGetValue(thisTaskId, out var task))
    {
        task.TotalBytes = e.TotalBytes;
        task.CurrentBytes = e.CurrentBytes;

        // Calculate percentage
        if (e.CurrentBytes > 0L && e.TotalBytes > 0L)
        {
            if (e.CurrentBytes >= e.TotalBytes)
            {
                task.PercentCompleted = 100;
            }
            else
            {
                task.PercentCompleted = (int)(e.CurrentBytes * 100L / e.TotalBytes);
            }
        }
        else
        {
            task.PercentCompleted = 0;
        }

        // Handle cancellation requests
        if (task.RequestCancel)
        {
            ((MySqlBackup)sender).StopAllProcess();
        }
    }
}


Task Control Methods

void StopTask()
{
    if (int.TryParse(Request["taskid"] + "", out int taskId))
    {
        if (dicTask.TryGetValue(taskId, out ProgressReportTask task))
        {
            task.RequestCancel = true;
            Response.Write("1");
        }
        else
        {
            Response.Write("0|Task not found");
        }
    }
    else
    {
        Response.Write("0|Invalid task ID");
    }
}

void GetTaskStatus()
{
    if (int.TryParse(Request["apicallid"] + "", out int apiCallId))
    {
        if (int.TryParse(Request["taskid"] + "", out int taskId))
        {
            if (dicTask.TryGetValue(taskId, out ProgressReportTask task))
            {
                task.ApiCallIndex = apiCallId;

                string json = JsonSerializer.Serialize(task);
                Response.Clear();
                Response.ContentType = "application/json";
                Response.Write(json);
            }
        }
    }
}

int GetNewTaskId()
{
    // Mask the sign bit instead of calling Math.Abs, which throws for int.MinValue.
    // Guid hash codes make collisions unlikely, though not impossible.
    return Guid.NewGuid().GetHashCode() & 0x7FFFFFFF;
}

Handling Late Echo Responses
The apiCallId system prevents UI corruption from network latency issues:

Normal scenario:
Call 1 → Response 1: 5%
Call 2 → Response 2: 10%
Call 3 → Response 3: 15%

Latency scenario:
Call 1 → (delayed)
Call 2 → Response 2: 10%
Call 3 → Response 3: 15%
        ↓
Response 1: 5% (ignored due to old apiCallId)
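The stale-response guard can be sketched as a pair of small helpers (hypothetical names; the actual page keeps this logic inline in fetchTaskStatus):

```javascript
// Each status request carries a monotonically increasing id; the server
// echoes it back, letting the client discard responses that arrive late.
let latestApiCallId = 0;

function nextApiCallId() {
    latestApiCallId++;
    return latestApiCallId;
}

// A response is "fresh" only if it echoes the id of the newest request.
function isFreshResponse(echoedApiCallId) {
    return echoedApiCallId === latestApiCallId;
}

// Simulated latency scenario: three calls issued; response 1 arrives last.
const id1 = nextApiCallId();
const id2 = nextApiCallId();
const id3 = nextApiCallId();
```

Only the response echoing id3 updates the UI; the delayed echoes of id1 and id2 are silently dropped, so the progress bar never jumps backwards.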

Building the Frontend
Basic Value Container

You can use <span> as the container for the values:
<!-- data that is manually handled by the user -->
<span id="labelTaskId"></span>
<span id="labelPercent">0</span>
<span id="labelTimeStart"></span>
<span id="labelTimeEnd"></span>
<span id="labelTimeElapsed"></span>
<span id="labelSqlFilename"></span> <!-- the download web path for the generated SQL dump file -->
<span id="labelSha256"></span> <!-- the SHA-256 checksum for the generated file -->

<!-- typical status: running, completed, error, cancelled -->
<span id="labelTaskStatus"></span>

<!-- the following data fields are provided by MySqlBackup.NET during the progress change events: -->

<!-- fields used during backup -->
<span id="labelCurrentTableName"></span>
<span id="labelCurrentTableIndex"></span>
<span id="labelTotalTables"></span>
<span id="labelCurrentRowsAllTables"></span>
<span id="labelTotalRowsAllTables">0</span>
<span id="labelCurrentRowsCurrentTable"></span>
<span id="labelTotalRowsCurrentTable">0</span>

<!-- fields used during restore -->
<span id="labelCurrentBytes"></span>
<span id="labelTotalBytes"></span>

HTML Structure
Example of a comprehensive progress display:
<!-- Progress Bar -->
<div id="progress_bar_container">
    <div id="progress_bar_indicator">
        <span id="labelPercent">0</span> %
    </div>
</div>

<!-- Control Buttons -->
<div class="controls">
    <button type="button" onclick="backup();">Backup Database</button>
    <button type="button" onclick="restore();">Restore Database</button>
    <button type="button" onclick="stopTask();">Stop Current Task</button>

    <label for="fileRestore">Select Restore File:</label>
    <input type="file" id="fileRestore" accept=".sql,.zip" />
</div>

<!-- Detailed Status Display -->
<div class="task-status">
    <table>
        <tr>
            <td>Task ID</td>
            <td><span id="labelTaskId">--</span></td>
        </tr>
        <tr>
            <td>Status</td>
            <td>
                <span id="labelTaskStatus">Ready</span>
                <span id="labelTaskMessage"></span>
            </td>
        </tr>
        <tr>
            <td>Time</td>
            <td>
                Start: <span id="labelTimeStart">--</span> |
                End: <span id="labelTimeEnd">--</span> |
                Duration: <span id="labelTimeElapsed">--</span>
            </td>
        </tr>
        <tr>
            <td>File</td>
            <td>
                <span id="labelSqlFilename">--</span><br>
                SHA256: <span id="labelSha256">--</span>
            </td>
        </tr>

        <!-- Backup-specific fields -->
        <tr class="backup-only">
            <td>Current Table</td>
            <td>
                <span id="labelCurrentTableName">--</span>
                (<span id="labelCurrentTableIndex">--</span> / <span id="labelTotalTables">--</span>)
            </td>
        </tr>
        <tr class="backup-only">
            <td>All Tables Progress</td>
            <td>
                <span id="labelCurrentRowsAllTables">--</span> / <span id="labelTotalRowsAllTables">--</span>
            </td>
        </tr>
        <tr class="backup-only">
            <td>Current Table Progress</td>
            <td>
                <span id="labelCurrentRowsCurrentTable">--</span> / <span id="labelTotalRowsCurrentTable">--</span>
            </td>
        </tr>

        <!-- Restore-specific fields -->
        <tr class="restore-only">
            <td>Progress</td>
            <td>
                <span id="labelCurrentBytes">--</span> / <span id="labelTotalBytes">--</span> bytes
            </td>
        </tr>
    </table>
</div>

JavaScript Implementation
Initialization and Variables

// Global variables
let taskId = 0;
let apiCallId = 0;
let intervalTimer = null;
let intervalMs = 1000;

// Cache DOM elements for better performance
const elements = {
    fileRestore: document.querySelector("#fileRestore"),
    progressBar: document.querySelector("#progress_bar_indicator"),
    labelPercent: document.querySelector("#labelPercent"),
    labelTaskId: document.querySelector("#labelTaskId"),
    labelTimeStart: document.querySelector("#labelTimeStart"),
    labelTimeEnd: document.querySelector("#labelTimeEnd"),
    labelTimeElapsed: document.querySelector("#labelTimeElapsed"),
    labelTaskStatus: document.querySelector("#labelTaskStatus"),
    labelTaskMessage: document.querySelector("#labelTaskMessage"),
    labelSqlFilename: document.querySelector("#labelSqlFilename"),
    labelSha256: document.querySelector("#labelSha256"),

    // Backup-specific elements
    labelCurrentTableName: document.querySelector("#labelCurrentTableName"),
    labelCurrentTableIndex: document.querySelector("#labelCurrentTableIndex"),
    labelTotalTables: document.querySelector("#labelTotalTables"),
    labelCurrentRowsAllTables: document.querySelector("#labelCurrentRowsAllTables"),
    labelTotalRowsAllTables: document.querySelector("#labelTotalRowsAllTables"),
    labelCurrentRowsCurrentTable: document.querySelector("#labelCurrentRowsCurrentTable"),
    labelTotalRowsCurrentTable: document.querySelector("#labelTotalRowsCurrentTable"),

    // Restore-specific elements
    labelCurrentBytes: document.querySelector("#labelCurrentBytes"),
    labelTotalBytes: document.querySelector("#labelTotalBytes")
};


Core Functions
Backup, Restore, Stop
async function backup() {
    resetUIValues();

    const formData = new FormData();
    formData.append('action', 'backup');

    const result = await fetchData(formData);

    if (result.ok) {
        taskId = result.thisTaskid;
        intervalMs = 1000;
        startIntervalTimer();
        showSuccessMessage("Backup Started", "Database backup has begun successfully");
    } else {
        showErrorMessage("Backup Failed", result.errMsg);
        stopIntervalTimer();
    }
}

async function restore() {
    resetUIValues();

    if (!elements.fileRestore.files || elements.fileRestore.files.length === 0) {
        showErrorMessage("No File Selected", "Please select a backup file to restore");
        return;
    }

    const formData = new FormData();
    formData.append('action', 'restore');
    formData.append('fileRestore', elements.fileRestore.files[0]);

    const result = await fetchData(formData);

    if (result.ok) {
        taskId = result.thisTaskid;
        intervalMs = 1000;
        startIntervalTimer();
        showSuccessMessage("Restore Started", "Database restore has begun successfully");
    } else {
        showErrorMessage("Restore Failed", result.errMsg);
        stopIntervalTimer();
    }
}

async function stopTask() {
    if (!taskId || taskId === 0) {
        showErrorMessage("No Active Task", "There is no running task to stop");
        return;
    }

    const formData = new FormData();
    formData.append("action", "stoptask");
    formData.append("taskid", taskId);

    const result = await fetchData(formData);

    if (result.ok) {
        showSuccessMessage("Stop Requested", "The task is being cancelled...");
    } else {
        showErrorMessage("Stop Failed", result.errMsg);
        stopIntervalTimer();
    }
}


Progress Monitoring
async function fetchTaskStatus() {
    apiCallId++;

    const formData = new FormData();
    formData.append('action', 'gettaskstatus');
    formData.append('taskid', taskId);
    formData.append('apicallid', apiCallId);

    const result = await fetchData(formData);

    if (result.ok) {
        // Ignore late responses
        if (result.jsonObject.ApiCallIndex !== apiCallId) {
            return;
        }
        updateUIValues(result.jsonObject);
    } else {
        showErrorMessage("Status Fetch Failed", result.errMsg);
        stopIntervalTimer();
    }
}

function updateUIValues(data) {
    // Optimize update frequency when task starts
    if (data.PercentCompleted > 0 && intervalMs === 1000) {

        // change the timer interval time
        intervalMs = 100;
        stopIntervalTimer();
        setTimeout(startIntervalTimer, 500);
    }

    // Stop monitoring when task completes
    if (data.IsCompleted || data.HasError || data.IsCancelled) {
        stopIntervalTimer();
    }

    // Update basic information
    elements.labelTaskId.textContent = data.TaskId || "--";
    elements.labelTimeStart.textContent = data.TimeStartDisplay || "--";
    elements.labelTimeEnd.textContent = data.TimeEndDisplay || "--";
    elements.labelTimeElapsed.textContent = data.TimeUsedDisplay || "--";

    // Update progress bar
    const percent = data.PercentCompleted || 0;
    elements.labelPercent.style.display = "block";
    elements.labelPercent.textContent = percent;
    elements.progressBar.style.width = percent + '%';

    // Update status with visual indicators
    const statusContainer = elements.labelTaskStatus.closest('td');

    if (data.HasError) {
        elements.labelTaskStatus.textContent = "Error";
        elements.labelTaskMessage.textContent = data.ErrorMsg || "";
        statusContainer.className = "status-error";
        showErrorMessage("Task Failed", data.ErrorMsg || "Unknown error occurred");
    } else if (data.IsCancelled) {
        elements.labelTaskStatus.textContent = "Cancelled";
        elements.labelTaskMessage.textContent = "";
        statusContainer.className = "status-cancelled";
        showWarningMessage("Task Cancelled", "The operation was cancelled by user request");
    } else if (data.IsCompleted) {
        elements.labelTaskStatus.textContent = "Completed";
        elements.labelTaskMessage.textContent = "";
        statusContainer.className = "status-complete";
        showSuccessMessage("Task Completed", "Operation finished successfully!");
    } else {
        elements.labelTaskStatus.textContent = "Running";
        elements.labelTaskMessage.textContent = "";
        statusContainer.className = "status-running";
    }

    // Update file information
    if (data.FileName && data.FileName.length > 0) {
        elements.labelSqlFilename.innerHTML =
            `File: <a href='${data.FileDownloadUrl}' class='download-link' target='_blank'>Download ${data.FileName}</a>`;
    } else {
        elements.labelSqlFilename.textContent = data.FileName || "--";
    }
    elements.labelSha256.textContent = data.SHA256 || "--";

    // Update backup-specific information
    if (data.TaskType === 1) {
        elements.labelCurrentTableName.textContent = data.CurrentTableName || "--";
        elements.labelCurrentTableIndex.textContent = data.CurrentTableIndex || "--";
        elements.labelTotalTables.textContent = data.TotalTables || "--";
        elements.labelCurrentRowsAllTables.textContent = data.CurrentRowIndex || "--";
        elements.labelTotalRowsAllTables.textContent = data.TotalRows || "--";
        elements.labelCurrentRowsCurrentTable.textContent = data.CurrentRowCurrentTable || "--";
        elements.labelTotalRowsCurrentTable.textContent = data.TotalRowsCurrentTable || "--";

        // Hide restore-specific fields
        elements.labelCurrentBytes.textContent = "--";
        elements.labelTotalBytes.textContent = "--";
    }

    // Update restore-specific information
    if (data.TaskType === 2) {
        elements.labelCurrentBytes.textContent = formatBytes(data.CurrentBytes) || "--";
        elements.labelTotalBytes.textContent = formatBytes(data.TotalBytes) || "--";

        // Hide backup-specific fields
        elements.labelCurrentTableName.textContent = "--";
        elements.labelCurrentTableIndex.textContent = "--";
        elements.labelTotalTables.textContent = "--";
        elements.labelCurrentRowsAllTables.textContent = "--";
        elements.labelTotalRowsAllTables.textContent = "--";
        elements.labelCurrentRowsCurrentTable.textContent = "--";
        elements.labelTotalRowsCurrentTable.textContent = "--";
    }
}
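The percent clamping used on the server side (in Mb_ExportProgressChanged and Mb_ImportProgressChanged) can be expressed as one small pure function; computePercent is a hypothetical name, a sketch rather than part of the original handlers:

```javascript
// Mirrors the server-side rule: 0 while totals are unknown, 100 once the
// current index reaches the total, integer truncation in between.
function computePercent(current, total) {
    if (current > 0 && total > 0) {
        if (current >= total) {
            return 100;
        }
        return Math.floor(current * 100 / total);
    }
    return 0;
}
```

The explicit clamp to 100 guards against progress counters that briefly overshoot the reported total, and the zero fallback avoids dividing by an unknown total.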


Utility Functions
// Centralized API handler
async function fetchData(formData) {
    try {
        let action = formData.get("action") || "";

        const response = await fetch('/apiProgressReport', {
            method: 'POST',
            body: formData,
            credentials: 'include'
        });

        if (response.ok) {
            const responseText = await response.text();

            if (responseText.startsWith("0|")) {
                let err = responseText.substring(2);
                return { ok: false, errMsg: err };
            }
            else {
                if (!responseText || responseText.trim() === '') {
                    let err = "Empty response from server";
                    return { ok: false, errMsg: err };
                }
                else {
                    // Success
                    if (action == "backup" || action == "restore") {
                        let _thisTaskid = parseInt(responseText);
                        if (isNaN(_thisTaskid)) {
                            let err = `Invalid Task ID: ${responseText}`;
                            return { ok: false, errMsg: err };
                        }
                        else {
                            return { ok: true, thisTaskid: _thisTaskid };
                        }
                    }
                    else if (action == "stoptask") {
                        if (responseText == "1") {
                            return { ok: true };
                        }
                        else {
                            let err = `Unable to stop task`;
                            return { ok: false, errMsg: err };
                        }
                    }
                    else if (action == "gettaskstatus") {
                        try {
                            let thisJsonObject = JSON.parse(responseText);
                            return { ok: true, jsonObject: thisJsonObject };
                        }
                        catch (err) {
                            return { ok: false, errMsg: err };
                        }
                    }
                }
            }
        }
        else {
            const err = await response.text();
            return { ok: false, errMsg: err };
        }
    }
    catch (err) {
        return { ok: false, errMsg: err };
    }
}

function resetUIValues() {
    // Reset all display elements
    Object.values(elements).forEach(element => {
        if (element && element.textContent !== undefined) {
            element.textContent = "--";
        }
    });

    // Reset progress bar
    elements.progressBar.style.width = '0%';
    elements.labelPercent.style.display = "none";
    elements.labelPercent.textContent = "0";

    // Reset status styling
    const statusContainer = elements.labelTaskStatus.closest('td');
    if (statusContainer) {
        statusContainer.className = "";
    }
}

function startIntervalTimer() {
    stopIntervalTimer();
    intervalTimer = setInterval(fetchTaskStatus, intervalMs);
}

function stopIntervalTimer() {
    if (intervalTimer) {
        clearInterval(intervalTimer);
        intervalTimer = null;
    }
}
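The interval-switching rule buried inside updateUIValues can be isolated as a pure function for clarity; computeNextIntervalMs is a hypothetical name sketching the same logic:

```javascript
// Poll slowly (1 s) while waiting for the task to report its first progress;
// once any progress arrives, tighten to 100 ms for a smoother progress bar.
function computeNextIntervalMs(percentCompleted, currentIntervalMs) {
    if (percentCompleted > 0 && currentIntervalMs === 1000) {
        return 100;
    }
    return currentIntervalMs;
}
```

Keeping the rule in one place makes it easy to tune both speeds without hunting through the UI-update code.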

function formatBytes(bytes) {
    if (!bytes || bytes === 0) return '0 Bytes';

    const sizes = ['Bytes', 'KB', 'MB', 'GB'];
    const i = Math.floor(Math.log(bytes) / Math.log(1024));

    return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i];
}
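A few sample conversions from formatBytes; the function is repeated here, unchanged, so the snippet runs standalone:

```javascript
// Scales a raw byte count to the largest unit that keeps the value >= 1,
// rounding to two decimal places.
function formatBytes(bytes) {
    if (!bytes || bytes === 0) return '0 Bytes';

    const sizes = ['Bytes', 'KB', 'MB', 'GB'];
    const i = Math.floor(Math.log(bytes) / Math.log(1024));

    return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i];
}

// formatBytes(512)      -> "512 Bytes"
// formatBytes(1536)     -> "1.5 KB"
// formatBytes(10485760) -> "10 MB"
```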

// Message display functions (implement based on your UI framework)
function showSuccessMessage(title, message) {
    console.log(` ${title}: ${message}`);
    // Implement your success notification here
}

function showErrorMessage(title, message) {
    console.error(` ${title}: ${message}`);
    // Implement your error notification here
}

function showWarningMessage(title, message) {
    console.warn(` ${title}: ${message}`);
    // Implement your warning notification here
}



European ASP.NET Ajax Hosting :: What is AJAX Security?

clock June 10, 2021 08:51 by author Peter

The advent of Web 2.0 brought about a new technique for building web applications: Asynchronous JavaScript and XML (AJAX). AJAX is a faster, more interactive technology that has found great favor among modern businesses. It combines JavaScript, HTML, CSS, and XML into one technique that makes web application interactivity faster and more affordable in terms of bandwidth consumption. This article describes AJAX and its security issues.
 
AJAX
Conventional websites were slower and consumed more bandwidth because of the way they connected to the server: a full page reload was needed to communicate with the server over a synchronous connection. This meant more bandwidth consumption and slower responses from web applications. AJAX, on the other hand, is a browser technique that communicates with the server asynchronously, so selected portions of a page can be updated without reloading the whole page.
 
A good example of AJAX in use is the Google create account page which recognizes a username in use soon after a user enters their suggested username. This means that in the background the page has communicated with the Google server to check if the name exists and show results without having to reload the entire page.
 
It is considered the most feasible Rich Internet Application (RIA) to date. AJAX makes use of Open Standards that include HTML and CSS for the presentation of data, XML for data storage and transfers to and from the server, XMLHttpRequest objects in the browser to fetch data from the server, and finally JavaScript for interactivity. AJAX can also transfer data in JSON or plain-text.
 
Security Issues with AJAX
AJAX applications only use a different technique to connect to the server. However, they use the same security schemes to connect to the server. This entails that you still have to include your authentication, authorization, and data protection methods in the web.xml file or program. AJAX applications bear the same vulnerabilities as ordinary or conventional web applications. In as much as people prefer the swiftness and the advanced interactivity of AJAX applications, some are misled to believe that AJAX web applications are more secure than ordinary web applications.
 
AJAX applications are known to have session management vulnerabilities and a lot of loopholes in the hidden URLs which carry AJAX requests to the server.
 
The AJAX engine makes use of JavaScript to transfer user requests/commands and transforms them into function calls. The AJAX engine sends these function calls in plain-text to the server that may be intercepted by attackers to reveal database information, variable names, or any other confidential user data that may be used by the attacker maliciously.
 
AJAX-based applications are also vulnerable to Cross-Site Request Forgery (CSRF) and Cross-Site Scripting (XSS). Although it is not that easy to exploit CSRF on AJAX applications because the requests are hidden, attackers may be able to create a script that steals a user's session token and thereby hijack the user's session remotely.
 
This can be avoided by creating random complex tokens for the AJAX requests which are not identified by the attackers. The server embeds the complex token on the page and checks for it each time the users make a request to the server and if it is any different the server does not process the request.
 
To ensure AJAX security against XSS, the application has to strictly sanitize user input and output. The use of JS functions such as ‘document.write()’, ‘innerHTML()’, ‘eval()’, ‘write()’ may make it possible for XSS attacks in AJAX web applications.
 
Conclusion
AJAX is a very fast and affordable browser technology but needs to be treated just like any other web application when it comes to security. Organizations need to do thorough scanning of their AJAX applications just like on conventional web applications to ensure absolute security from common vulnerabilities.

HostForLIFEASP.NET Ajax Hosting

 



European Entity Framework Core 1.0 Hosting - HostForLIFE.eu :: Entity Framework Code First Approach

clock October 14, 2020 08:57 by author Peter

Entity framework is an Object/Relational Mapping (O/RM) framework. It is an enhancement to ADO.NET that gives developers an automated mechanism for accessing & storing the data in the database. ORM framework automatically creates classes based on database tables and the opposite is also true; that is, it can also automatically generate necessary SQL to create database tables based on classes.

First we need to install Entity Framework, and the easiest way to do that is with the NuGet Package Manager.
The code first approach allows us to create our custom classes first; based on those classes, Entity Framework can generate the database automatically for us.
 
Step 1
Create New Project - Select Web Application - Select MVC Template - Click OK.
 
Step 2
Go to Tools - NuGet Package Manager - Manage NuGet Packages For Solution - Browse, type "Entity Framework", and click Install.
 
Step 3
Add the following classes in the Model Folder of your project.
    using System.Collections.Generic;
    using System.ComponentModel.DataAnnotations;
    using System.ComponentModel.DataAnnotations.Schema;

    namespace CodeFirstApproach.Models
    {
        [Table("tblClass")]
        public class Class
        {
            [Key]
            public int ClassID { get; set; }
            public string ClassName { get; set; }
            public List<Student> Students { get; set; }
            public List<Course> Courses { get; set; }
        }

        [Table("tblStudent")]
        public class Student
        {
            [Key]
            public int StudentID { get; set; }
            public string StudentName { get; set; }
            public Class Class { get; set; }
        }

        [Table("tblCourse")]
        public class Course
        {
            [Key]
            public int CourseID { get; set; }
            public string CourseName { get; set; }
            public List<Class> Classes { get; set; }
        }
    }


Step 4
Add another class as a Context class and add a connection string in web.config. In the class, add a DbSet property for each entity set; each DbSet maps to a table in the database. If we have a DbSet property named Students, Entity Framework will by default look for a Student table in the database, which in our case is tblStudent because we used the [Table("tblStudent")] annotation.
    // DbContext maps to a specific database whose schema the context understands.
    public class StudentContext : DbContext
    {
        public StudentContext() : base("name=conn_StudentDB") { }

        public DbSet<Class> Classes { get; set; }
        public DbSet<Student> Students { get; set; }
        public DbSet<Course> Courses { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            // Configure the many-to-many join table between Class and Course.
            modelBuilder.Entity<Class>()
                .HasMany(s => s.Courses)
                .WithMany(c => c.Classes)
                .Map(cs =>
                {
                    cs.MapLeftKey("ClassId");
                    cs.MapRightKey("CourseId");
                    cs.ToTable("tblClassCourse");
                });
        }
    }

Note
We could also say that the DbSet class represents an entity set used for create, read, update, and delete operations.
 
Web.Config
Add the connection string referenced by the context ("conn_StudentDB"):
    <connectionStrings>  
       <add name="conn_StudentDB" connectionString="data source=.;initial catalog=StudentDB;integrated security=True;MultipleActiveResultSets=True;" providerName="System.Data.SqlClient" />  
    </connectionStrings>  

Step 5
 
In the HomeController Index action, write the following code and run (press F5):

    public ActionResult Index()  
    {  
        StudentContext context = new StudentContext();  
         var x = (from z in context.Courses select z).ToList();  
        return View();  
    }  

Observation
 
It will create the database with all the tables.
 
Point to Start
 
Another approach is code-based migration. Drop the database and follow the steps below.
 
Step 1
Entity Framework Code First Migration
 
Go To Package Manager Console and type ,
 
PM> Enable-Migrations
 
Note
It will create a Configuration file in the Migration Folder.
 
Step 2
 
PM> Add-Migration -Name CreateDb
 
Note
It will create a Script in the Migration Folder with xxx_CreateDb.cs. It will consist of two methods,  Up() and Down().
    public partial class CreateDb: DbMigration {  
        public override void Up() {  
            //code to execute when you Update the DB.  
        }  
        public override void Down() {  
            //code to execute when you Rollback.   
        }  
    }  

Step 3 - PM> Update-Database
We will see our Database getting created once again just like the image above.
 
Currently we have no data. Insert a few records in the following table.
    insert into [dbo].[tblClass] values('Computer Science'),('Electronics And Communication'),('Mechanical')  
    insert into [dbo].[tblStudent]values('Peter',1),('Scott',2),('Daniel',3),('Raymond',1),('Anthony',2),('Greg',3)  
    insert into [dbo].[tblCourse] values('.Net'),('SQL'),('Java'),('VLSI'),('Analog Communication'),('Optical Fiber Communication'),('Thermodynamics'),('Measurement and Instructions'),('Advanced Solid Mechanics')  
    insert into [dbo].[tblClassCourse] values(1,1),(1,2),(1,3),(2,4),(2,5),(2,6),(2,3),(3,7),(3,8),(3,9),(3,3),(1,5),(3,5);  

Updating Table
 
Step 1
 
Now we will add a property in Student class PhoneNo
    [Table("tblStudent")]  
    public class Student
    {
        public int StudentID { get; set; }
        public string StudentName { get; set; }

        [Required]
        public string PhoneNo { get; set; }

        public Class Class { get; set; }
    }


Step 2
 
Go to the console and run Add-Migration again for the changes in our model, which will create xxx_AddedProperty_tblStudent.cs inside the Migrations folder.
 
PM> Add-Migration -Name AddedProperty_tblStudent
    public partial class AddedProperty_tblStudent : DbMigration
    {
        public override void Up()
        {
            AddColumn("dbo.tblStudent", "PhoneNo", c => c.String(nullable: false));
            // Setting a default value for a property:
            // AddColumn("dbo.tblStudent", "PhoneNo", c => c.String(nullable: true, defaultValue: "NULL")); [do not use the Required annotation then]
            // AddColumn("dbo.tblStudent", "JoinDate", c => c.DateTime(nullable: false, defaultValueSql: "GetDate()"));
        }

        public override void Down()
        {
            DropColumn("dbo.tblStudent", "PhoneNo");
        }
    }

Step 3
 
PM> Update-Database. Check the following changes in DB for tblStudent.
 
Rollback: PM> Update-Database -TargetMigration:"CreateDb"



SQL Server 2012 Hosting Belgium - HostForLIFE.eu :: How to Fix Distribution setup SQL Server Agent error: "RegCreateKeyEx() returned error 5, 'Access is denied.'" ?

clock October 15, 2019 12:15 by author Peter

With this short article, I will tell you about How to Fix Distribution setup SQL Server Agent error: "RegCreateKeyEx() returned error 5, 'Access is denied.'" on my SQL Server 2012 Hosting.

In the Configure Distribution Wizard, the step "Configuring SQL Server Agent to start automatically" errors with the following text:

TITLE: Configure Distribution Wizard
------------------------------
An error occurred configuring SQL Server Agent.
------------------------------
ADDITIONAL INFORMATION:
RegCreateKeyEx() returned error 5, 'Access is denied.' (Microsoft SQL Server, Error: 22002)

This is a very minor error and easy to work around. The wizard attempts to change the SQL Server Agent service Start Mode to Automatic. You can do this via SQL Server Configuration Manager instead.

In Sysinternals Process Monitor, you'll see: Operation: RegCreateKey Result: ACCESS DENIED Path: "HKLM\System\CurrentControlSet\Services\SQLAgent$SQL2012"

 

If you encounter this error, choose "No" on the "SQL Server Agent Start" page of the Configure Distribution Wizard (as shown below), then set your Agent service to Automatic Start Mode via SQL Server Configuration Manager.

The third step of the wizard, which failed before, will no longer run. I haven't figured out why the failure actually happens, and I am open to feedback; it seems like a vestigial wizard step that otherwise has no negative impact. Running SSMS with "Run as Administrator" doesn't fix this error either. I would like to know why this error happens in the first place.

HostForLIFE.eu SQL Server 2012 Hosting
HostForLIFE.eu is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes. We have customers from around the globe, spread across every continent. We serve the hosting needs of the business and professional, government and nonprofit, entertainment and personal use market segments.



SQL Server Hosting - HostForLIFE.eu :: Instead Of Triggers

clock September 4, 2019 12:41 by author Peter

INSTEAD OF triggers are used to replace DML commands. They fire when you execute an INSERT, UPDATE, or DELETE statement, but instead of executing that statement, SQL Server runs the trigger body in its place.
 

Example
    create table approved_emp ( eid int identity(1,1), ename varchar(30) ) 
    create table emp ( id int identity(1,1), ename varchar(30), AddedBy varchar(30) ) 
      
    create trigger instead_of on approved_emp 
    instead of insert 
    as 
    begin 
        -- use a set-based insert from the inserted pseudo-table so multi-row inserts are handled too
        insert into emp (ename, AddedBy) 
        select ename, USER from inserted 
    end 


So the trigger works like this: when we try to add a new record to the approved_emp table, instead of inserting the record there, the trigger adds ename to the emp table. No data appears in approved_emp, because the trigger fires every time data is inserted into that table.
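A quick, illustrative test (assuming the trigger redirects inserts into emp, as described above):

```sql
INSERT INTO approved_emp (ename) VALUES ('Peter');

SELECT * FROM approved_emp; -- should remain empty: the insert was intercepted
SELECT * FROM emp;          -- should now contain 'Peter' plus the inserting user name
```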
 
You can create INSTEAD OF triggers for UPDATE and DELETE as well.
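For example, a hypothetical INSTEAD OF DELETE trigger on the same table (a sketch, not part of the original example) could log attempted deletes instead of performing them:

```sql
CREATE TRIGGER instead_of_delete ON approved_emp
INSTEAD OF DELETE
AS
BEGIN
    -- record what would have been deleted; the actual DELETE never runs
    INSERT INTO emp (ename, AddedBy)
    SELECT ename, USER FROM deleted;
END
```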
 



SQL Server Hosting - HostForLIFE.eu :: Rename SQL Server Database

clock July 24, 2019 12:47 by author Peter

Database Administrators usually use the sp_renamedb system stored procedure to quickly rename a SQL Server database. However, the drawback of sp_renamedb is that it does not rename the logical and physical names of the underlying database files. It is a best practice to make sure the logical name and physical file name of the database are also renamed to reflect the actual name of the database, to avoid any confusion with backup, restore, or detach/attach operations.

Let's first create a new database named CoreDB using the T-SQL below:

USE master
GO
IF EXISTS (SELECT name FROM sys.databases WHERE name = N'CoreDB')
DROP DATABASE CoreDB
GO
USE master
GO
CREATE DATABASE [CoreDB]
ON PRIMARY
(
NAME = N'CoreDB',
FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008\MSSQL\DATA\CoreDB.mdf' ,
SIZE = 2048KB ,
FILEGROWTH = 1024KB
)
LOG ON
(
NAME = N'CoreDB_log',
FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008\MSSQL\DATA\CoreDB_log.ldf' ,
SIZE = 1024KB ,
FILEGROWTH = 10%
)
GO

Rename CoreDB Database Using sp_renamedb System Stored Procedure

Now let's rename the CoreDB database to ProductsDB by executing the below T-SQL code.

USE master
GO
ALTER DATABASE CoreDB
SET SINGLE_USER

WITH ROLLBACK IMMEDIATE
GO
EXEC master..sp_renamedb 'CoreDB','ProductsDB'
GO
ALTER DATABASE ProductsDB
SET MULTI_USER
GO

Once the above T-SQL has executed successfully the database name will change however the Logical Name and File Name will not change. You can verify this by executing the T-SQL below:

USE master
GO
/* Identify Database File Names */
SELECT

name AS [Logical Name],
physical_name AS [DB File Path],
type_desc AS [File Type],
state_desc AS [State]
FROM sys.master_files
WHERE database_id = DB_ID(N'ProductsDB')
GO

Your output should look something like this from the above query.

You can see in the above snippet that the Logical Name and File Name in the DB File Path column for ProductsDB are still reflecting the old name of CoreDB. This is not a good practice to follow in a Production Environment. Below you will see the steps which a DBA can follow to rename the database and its respective files.

Steps to Rename a SQL Server Database

DBAs should follow the below steps which will not only rename the database, but at the same time will also rename the Logical Name and File Name of the database.

This first set of commands put the database in single user mode and also modifies the logical names.


/* Set Database as a Single User */
ALTER DATABASE CoreDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE
/* Change Logical File Name */
ALTER DATABASE [CoreDB] MODIFY FILE (NAME=N'CoreDB', NEWNAME=N'ProductsDB')
GO
ALTER DATABASE [CoreDB] MODIFY FILE (NAME=N'CoreDB_log', NEWNAME=N'ProductsDB_log')
GO

This is the output from the above code.


Now we need to detach the database, so we can rename the physical files.  If the database files are open you will not be able to rename the files.

/* Detach Current Database */
USE [master]
GO
EXEC master.dbo.sp_detach_db @dbname = N'CoreDB'
GO

Once the CoreDB database is detached successfully, the next step is to rename the physical files. This can be done either manually or by using the xp_cmdshell system stored procedure. You can enable the xp_cmdshell feature using the sp_configure system stored procedure.

USE master
GO
sp_configure 'show advanced options'
GO
/* 0 = Disabled , 1 = Enabled */
sp_configure 'xp_cmdshell', 1
GO
RECONFIGURE WITH OVERRIDE
GO

Once xp_cmdshell is enabled you can use the below script to rename the physical files of the database.

/* Rename Physical Files */
USE [master]
GO
EXEC xp_cmdshell 'RENAME "C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008\MSSQL\DATA\CoreDB.mdf", "ProductsDB.mdf"'
GO
EXEC xp_cmdshell 'RENAME "C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008\MSSQL\DATA\CoreDB_log.ldf", "ProductsDB_log.ldf"'
GO

Once the above step has successfully executed then the next step will be to attach the database, this can be done by executing the T-SQL below:

/* Attach Renamed ProductsDB Database Online */
USE [master]
GO
CREATE DATABASE ProductsDB ON
( FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008\MSSQL\DATA\ProductsDB.mdf' ),
( FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008\MSSQL\DATA\ProductsDB_log.ldf' )
FOR ATTACH
GO

Once the above step has successfully executed then the final step will be to allow multi user access for the user database by executing the below T-SQL:

/* Set Database to Multi User*/
ALTER DATABASE ProductsDB SET MULTI_USER
GO

You can verify the Logical and File Names for the ProductsDB database by executing the T-SQL below:

USE master
GO
/* Identify Database File Names */
SELECT
name AS [Logical Name],
physical_name AS [DB File Path],
type_desc AS [File Type],
state_desc AS [State]
FROM sys.master_files
WHERE database_id = DB_ID(N'ProductsDB')

 




European SQL 2017 Hosting :: How To Call A Web Service From SQL Server?

clock April 24, 2019 11:27 by author Peter

In this blog, I have shown the process of calling web services through a stored procedure in SQL Server database. Also, I have explained how to call a stored procedure with a SOAP Envelope.

Step 1
Create a stored procedure in your SQL Server.
  CREATE proc [dbo].[spHTTPRequest]    
        @URI varchar(2000) = 'http://localhost:55253/',         
        @methodName varchar(50) = 'Get',    
        @requestBody varchar(8000) = '',    
        @SoapAction varchar(255),    
        @UserName nvarchar(100), -- Domain\UserName or UserName    
        @Password nvarchar(100),    
        @responseText varchar(8000) output   
  as   
  SET NOCOUNT ON   
  IF    @methodName = ''   
  BEGIN   
        select FailPoint = 'Method Name must be set'   
        return   
  END   
  set   @responseText = 'FAILED'   
  DECLARE @objectID int   
  DECLARE @hResult int   
  DECLARE @source varchar(255), @desc varchar(255)    
  EXEC @hResult = sp_OACreate 'MSXML2.ServerXMLHTTP', @objectID OUT   
  IF @hResult <> 0    
  BEGIN   
        EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT   
        SELECT      hResult = convert(varbinary(4), @hResult),    
                    source = @source,    
                    description = @desc,    
                    FailPoint = 'Create failed',    
                    MethodName = @methodName    
        goto destroy    
        return   
  END   
  -- open the destination URI with Specified method    
  EXEC @hResult = sp_OAMethod @objectID, 'open', null, @methodName, @URI, 'false', @UserName, @Password   
  IF @hResult <> 0    
  BEGIN   
        EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT   
        SELECT      hResult = convert(varbinary(4), @hResult),    
              source = @source,    
              description = @desc,    
              FailPoint = 'Open failed',    
              MethodName = @methodName    
        goto destroy    
        return   
  END   
  -- set request headers    
  EXEC @hResult = sp_OAMethod @objectID, 'setRequestHeader', null, 'Content-Type', 'text/xml;charset=UTF-8'   
  IF @hResult <> 0    
  BEGIN   
        EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT   
        SELECT      hResult = convert(varbinary(4), @hResult),    
              source = @source,    
              description = @desc,    
              FailPoint = 'SetRequestHeader failed',    
              MethodName = @methodName    
        goto destroy    
        return   
  END   
  -- set soap action    
  EXEC @hResult = sp_OAMethod @objectID, 'setRequestHeader', null, 'SOAPAction', @SoapAction    
  IF @hResult <> 0    
  BEGIN   
        EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT   
        SELECT      hResult = convert(varbinary(4), @hResult),    
              source = @source,    
              description = @desc,    
              FailPoint = 'SetRequestHeader failed',    
              MethodName = @methodName    
        goto destroy    
        return   
  END   
  declare @len int   
  set @len = len(@requestBody)    
  EXEC @hResult = sp_OAMethod @objectID, 'setRequestHeader', null, 'Content-Length', @len    
  IF @hResult <> 0    
  BEGIN   
        EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT   
        SELECT      hResult = convert(varbinary(4), @hResult),    
              source = @source,    
              description = @desc,    
              FailPoint = 'SetRequestHeader failed',    
              MethodName = @methodName    
        goto destroy    
        return   
  END   
  /*   
  -- if you have headers in a table called RequestHeader you can go through them with this   
  DECLARE @HeaderKey varchar(500), @HeaderValue varchar(500)   
  DECLARE RequestHeader CURSOR  
  LOCAL FAST_FORWARD   
  FOR  
        SELECT      HeaderKey, HeaderValue   
        FROM RequestHeaders   
        WHERE       Method = @methodName   
  OPEN RequestHeader   
  FETCH NEXT FROM RequestHeader   
  INTO @HeaderKey, @HeaderValue   
  WHILE @@FETCH_STATUS = 0   
  BEGIN  
        --select @HeaderKey, @HeaderValue, @methodName   
        EXEC @hResult = sp_OAMethod @objectID, 'setRequestHeader', null, @HeaderKey, @HeaderValue   
        IF @hResult <> 0   
        BEGIN  
              EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT  
              SELECT      hResult = convert(varbinary(4), @hResult),   
                    source = @source,   
                    description = @desc,   
                    FailPoint = 'SetRequestHeader failed',   
                    MethodName = @methodName   
              goto destroy   
              return  
        END  
        FETCH NEXT FROM RequestHeader   
        INTO @HeaderKey, @HeaderValue   
  END  
  CLOSE RequestHeader   
  DEALLOCATE RequestHeader   
  */    
  -- send the request    
  EXEC @hResult = sp_OAMethod @objectID, 'send', null, @requestBody    
  IF    @hResult <> 0    
  BEGIN   
        EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT   
        SELECT      hResult = convert(varbinary(4), @hResult),    
              source = @source,    
              description = @desc,    
              FailPoint = 'Send failed',    
              MethodName = @methodName    
        goto destroy    
        return   
  END   
  declare @statusText varchar(1000), @status varchar(1000)    
  -- Get status text    
  exec sp_OAGetProperty @objectID, 'StatusText', @statusText out   
  exec sp_OAGetProperty @objectID, 'Status', @status out   
  select @status, @statusText, @methodName    
  -- Get response text    
  exec sp_OAGetProperty @objectID, 'responseText', @responseText out   
  IF @hResult <> 0    
  BEGIN   
        EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT   
        SELECT      hResult = convert(varbinary(4), @hResult),    
              source = @source,    
              description = @desc,    
              FailPoint = 'ResponseText failed',    
              MethodName = @methodName    
        goto destroy    
        return   
  END   
  destroy:    
        exec sp_OADestroy @objectID    
  SET NOCOUNT OFF   
      
  GO   


The Stored Procedure takes the following parameters.
  @URI: the URI of the web service
  @MethodName: this would be ‘GET’ or ‘POST’
  @RequestBody: this is the SOAP xml that you want to send
  @SoapAction: this is the operation that you want to call on your service
  @UserName: NT UserName if your web service requires authentication
  @Password: the password if using NT Authentication on the web service
  @ResponseText: this is an out parameter that contains the response from the web service


Step 2
Enable the Ole Automation Procedures option in SQL Server:
  Use master 
  sp_configure 'show advanced options', 1  
   
  GO  
  RECONFIGURE;  
  GO  
  sp_configure 'Ole Automation Procedures', 1  
  GO  
  RECONFIGURE;  
  GO  
  sp_configure 'show advanced options', 1  
  GO  
  RECONFIGURE; 
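Ole Automation Procedures widens the attack surface of the server, so many DBAs turn the option back off once the web service calls are no longer needed. A sketch (not part of the original post):

```sql
-- Disable OLE Automation again when it is no longer needed
sp_configure 'Ole Automation Procedures', 0;
GO
RECONFIGURE;
GO
sp_configure 'show advanced options', 0;
GO
RECONFIGURE;
GO
```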


Step 3

Call the stored procedure (Here is a sample call to my service).
  declare @xmlOut varchar(8000) 
  Declare @RequestText as varchar(8000); 
  set @RequestText= 
  '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:tem="http://tempuri.org/"> 
     <soapenv:Header/> 
     <soapenv:Body> 
        <tem:CreateOrder> 
           <!--Optional:--> 
           <tem:OrderRequest> 
              <tem:OrderId>200</tem:OrderId> 
              <!--Optional:--> 
              <tem:OrderName>something</tem:OrderName> 
           </tem:OrderRequest> 
        </tem:CreateOrder> 
     </soapenv:Body> 
  </soapenv:Envelope>' 
  exec spHTTPRequest 
  'http://localhost/testwebservices/helloworldservice.asmx', 
  'POST', 
  @RequestText, 
  'http://tempuri.org/CreateOrderForMe',   -- this is your SOAPAction: 
  '', '', @xmlOut out 
  select @xmlOut  


Make sure your SOAP action is correct. Copy this action from your service; it is displayed when you run the service.

 



European SQL 2017 Hosting :: How to Create Registration Form in ASP.NET with SQL Server Database

clock March 19, 2019 11:25 by author Scott

In this post we will see how to create an ASP.NET registration form and save its data in a MS SQL database. It will be a quite simple form with fields like Employee Id, Employee Name, Date of Birth, Country, State, City, and more. We will also add a profile image of the user and save it to the ASP.NET project's folder to complete our registration form data. So let's see.

I'm also using the AJAX Control Toolkit to create this registration form in ASP.NET. You can add it to your project using Visual Studio's NuGet Package Manager Console.

First, create a new ASP.NET web project in Visual Studio. I'm using Visual Studio 2017 to create this registration form for ASP.NET.

Now add a new ASPX form with name Registration.aspx and edit it as below:

Registration.aspx:

<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Registration.aspx.cs"
Inherits="RegistrationForms.Registration" ValidateRequest="true"%>

<%@ Register Assembly="AjaxControlToolkit" Namespace="AjaxControlToolkit" TagPrefix="ajax" %>
<!DOCTYPE html>

<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
<link href="css/bootstrap.min.css" rel="stylesheet" />
<style>
.borderless td, .borderless th {
border: none;
border-color: Red;
}

.table-condensed > thead > tr > th, .table-condensed > tbody > tr > th, .table-condensed > tfoot > tr > th, .table-condensed > thead > tr > td, .table-condensed > tbody > tr > td, .table-condensed > tfoot > tr > td {
padding: 3px;
}

input, select {
border-radius: 3px;
padding: 1px;
border: 1px solid darkgray;
}

.btnCoral {
background-color: crimson;
color: #fff;
}

body {
/* Permalink - use to edit and share this gradient: http://colorzilla.com/gradient-editor/#1e5799+0,2989d8+50,207cca+51,7db9e8+100;Blue+Gloss+Default */
background: #1e5799; /* Old browsers */
background: -moz-linear-gradient(top, #1e5799 0%, #2989d8 50%, #207cca 51%, #7db9e8 100%); /* FF3.6-15 */
background: -webkit-linear-gradient(top, #1e5799 0%,#2989d8 50%,#207cca 51%,#7db9e8 100%); /* Chrome10-25,Safari5.1-6 */
background: linear-gradient(to bottom, #1e5799 0%,#2989d8 50%,#207cca 51%,#7db9e8 100%); /* W3C, IE10+, FF16+, Chrome26+, Opera12+, Safari7+ */
filter: progid:DXImageTransform.Microsoft.gradient( startColorstr='#1e5799', endColorstr='#7db9e8',GradientType=0 ); /* IE6-9 */
}

.parent-container {
background-color: black;
width:70%;
}

.container {
background-color: white;
margin:2px;
width:auto;
}

</style>
<title>Registration Form</title>
</head>
<body>
<form id="form1" runat="server">
<asp:ScriptManager runat="server" />
<div class="parent-container">
<div class="container">
<table class="table-condensed borderless">
<tr>
<td align="center" colspan="2">
<h2 style="background-color: black; color: White; padding: 5px;">REGISTRATION FORM</h2>
</td>
</tr>
<tr>
<td align="right">EMP Id:</td>
<td align="left">
<asp:TextBox runat="server" ID="txtEmpId" />
<asp:RequiredFieldValidator runat="server" ControlToValidate="txtEmpId" ErrorMessage="*" ForeColor="Red"
ValidateRequestMode="Enabled"/>
</td>
</tr>
<tr>
<td align="right">Name:</td>
<td align="left">
<asp:TextBox runat="server" ID="txtName" />
<asp:RequiredFieldValidator runat="server" ControlToValidate="txtName" ErrorMessage="*" ForeColor="Red" />
</td>
</tr>
<tr>
<td align="right">Date of Birth:</td>
<td align="left">
<asp:TextBox runat="server" ID="txtDOB" />
<ajax:CalendarExtender runat="server" Format="yyyy-MM-dd" TargetControlID="txtDOB" />

<asp:RequiredFieldValidator runat="server" ControlToValidate="txtDOB" ErrorMessage="*" ForeColor="Red" />
</td>
</tr>
<tr>
<td align="right">Address:</td>
<td align="left">
<asp:TextBox runat="server" ID="txtAddress" />
<asp:RequiredFieldValidator runat="server" ControlToValidate="txtAddress" ErrorMessage="*" ForeColor="Red" />
</td>
</tr>
<tr>
<td align="right">Country:</td>
<td align="left">
<asp:DropDownList runat="server" ID="ddCountry"
AutoPostBack="true" OnSelectedIndexChanged="ddCountry_SelectedIndexChanged" /></td>
</tr>
<tr>
<td align="right">State:</td>
<td align="left">
<asp:DropDownList runat="server" ID="ddState"
AutoPostBack="true" OnSelectedIndexChanged="ddState_SelectedIndexChanged" /></td>
</tr>
<tr>
<td align="right">City:</td>
<td align="left">
<asp:DropDownList runat="server" ID="ddCity" /></td>
</tr>
<tr>
<td align="right">Pincode:</td>
<td align="left">
<asp:TextBox runat="server" ID="txtPincode" />
<asp:RequiredFieldValidator runat="server" ControlToValidate="txtPincode" ErrorMessage="*" ForeColor="Red" />
</td>
</tr>
<tr>
<td align="right">Date of Joining:</td>
<td align="left">
<asp:TextBox runat="server" ID="txtDOJ" />
<ajax:CalendarExtender runat="server" Format="yyyy-MM-dd" TargetControlID="txtDOJ" />
<asp:RequiredFieldValidator runat="server" ControlToValidate="txtDOJ" ErrorMessage="*" ForeColor="Red" />
</td>
</tr>
<tr>
<td align="right">Department:</td>
<td align="left">
<asp:DropDownList runat="server" ID="ddDepartment" /></td>
</tr>
<tr>
<td align="right">Role:</td>
<td align="left">
<asp:DropDownList runat="server" ID="ddRole">

<asp:ListItem Text="USER" Value="USER" />
<asp:ListItem Text="ROLE" Value="ROLE" />

</asp:DropDownList></td>
</tr>
<tr>
<td align="right">Profile Pic:</td>
<td align="left" valign="middle">
<asp:FileUpload runat="server" ID="fileUpload" />
<asp:Button runat="server" Text="Upload" ID="btnUpload" OnClick="btnUpload_Click" />

<asp:Image ID="imgProfile" runat="server" Width="150px" Height="150px" />
</td>
</tr>

<tr>
<td align="right">Email ID:</td>
<td align="left">
<asp:TextBox runat="server" ID="txtEmail" />
<asp:RequiredFieldValidator runat="server" ControlToValidate="txtEmail" ErrorMessage="*" ForeColor="Red" />
<asp:RegularExpressionValidator runat="server" ValidationExpression="\w+([-+.]\w+)*@\w+([-.]\w+)*\.\w+([-.]\w+)*" ControlToValidate="txtEmail"
ErrorMessage="Email address invalid" />
</td>
</tr>
<tr>
<td align="right">Password:</td>
<td align="left">
<asp:TextBox runat="server" ID="txtpassword" TextMode="Password"/>
<asp:RequiredFieldValidator runat="server" ControlToValidate="txtpassword" ErrorMessage="*" ForeColor="Red" />
</td>
</tr>
<tr>
<td align="right">Confirm Password:</td>
<td align="left">
<asp:TextBox runat="server" ID="txtConfirmPassword" TextMode="Password"/></td>
</tr>
<tr>
<td colspan="20">
<asp:CompareValidator ID="comparePasswords"
runat="server"
ControlToCompare="txtpassword"
ControlToValidate="txtConfirmPassword"
ErrorMessage="Passwords do not match up."
ForeColor="Red" />
</td>
</tr>
<tr>
<td colspan="2" align="center">
<br />
<asp:Button runat="server" ID="btnSubmit" Text="Submit" OnClick="btnSubmit_Click"
CssClass="btn btnCoral" />
</td>
</tr>
<tr>
<td colspan="2">
<asp:Label runat="server" ID="lblInfo" />
</td>
</tr>
</table>
</div>

</div>
</form>
</body>
</html>

 

AJAX's date picker (the CalendarExtender) is used to set the Date of Birth and Date of Joining information of the employee to complete the registration process.

Profile picture will be saved inside “ProfileImages” folder.
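One caveat: FileUpload.SaveAs throws if the target folder does not exist. A defensive variant of the upload handler's core lines (an illustrative sketch; fileName comes from the btnUpload_Click handler shown below, and System.IO is already imported in the code-behind) would create the folder first:

```csharp
string dir = Server.MapPath("~/ProfileImages/");
if (!Directory.Exists(dir))
{
    // Create the folder on first upload so SaveAs does not throw
    Directory.CreateDirectory(dir);
}
fileUpload.PostedFile.SaveAs(Path.Combine(dir, fileName));
imgProfile.ImageUrl = "~/ProfileImages/" + fileName;
```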

 

And below is my code for the registration page:

using System;
using System.Data.SqlClient;
using System.Data;
using System.Configuration;
using System.IO;
using System.Web.UI;

namespace RegistrationForms
{
public partial class Registration : System.Web.UI.Page
{
SqlConnection con;
SqlCommand cmd;
SqlDataAdapter adapter;
SqlDataReader reader;
DataSet ds;
DataTable dt;

protected void Page_Load(object sender, EventArgs e)
{
if (!IsPostBack)
{
LoadCountries();
LoadDepartment();
}
}

public void LoadDepartment()
{
try
{
con = new SqlConnection(ConfigurationManager.ConnectionStrings["connectionstring"].ConnectionString);
cmd = new SqlCommand("Select * from tblDepartments", con);
con.Open();
adapter = new SqlDataAdapter(cmd);
dt = new DataTable();
adapter.Fill(dt);

if (dt.Rows.Count > 0)
{
DataRow dr = dt.NewRow();
dr["DeptId"] = 0;
dr["Department"] = "Please select Department";
dt.Rows.InsertAt(dr, 0);

ddDepartment.DataSource = dt;
ddDepartment.DataTextField = "Department";
ddDepartment.DataValueField = "DeptId";
ddDepartment.DataBind();
}
adapter.Dispose();
cmd.Dispose();
con.Close();
}
catch (Exception ex)
{
lblInfo.Text = ex.Message.ToString();
}
}
public void LoadCountries()
{
try
{
con = new SqlConnection(ConfigurationManager.ConnectionStrings["connectionstring"].ConnectionString);
cmd = new SqlCommand("Select * from tblCountries", con);
con.Open();
adapter = new SqlDataAdapter(cmd);
dt = new DataTable();
adapter.Fill(dt);

if (dt.Rows.Count > 0)
{
DataRow dr = dt.NewRow();
dr["CountryId"] = 0;
dr["Country"] = "Please select Country";
dt.Rows.InsertAt(dr, 0);


ddCountry.DataSource = dt;
ddCountry.DataTextField = "Country";
ddCountry.DataValueField = "CountryId";
ddCountry.SelectedIndex = 0;
ddCountry.DataBind();
}
adapter.Dispose();
cmd.Dispose();
con.Close();
}
catch (Exception ex)
{
lblInfo.Text = ex.Message.ToString();
}

}
protected void btnSubmit_Click(object sender, EventArgs e)
{
try
{
if (Page.IsValid)
{

con = new SqlConnection(ConfigurationManager.ConnectionStrings["connectionString"].ConnectionString);
cmd = new SqlCommand();
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandText = "sp_AddUsers";
cmd.Connection = con;
cmd.Parameters.AddWithValue("@UserId", txtEmpId.Text.ToString());
cmd.Parameters.AddWithValue("@Name", txtName.Text.ToString());
cmd.Parameters.AddWithValue("@DOB", txtDOB.Text.ToString());
cmd.Parameters.AddWithValue("@Address", txtAddress.Text.ToString());
cmd.Parameters.AddWithValue("@City", ddCity.SelectedValue.ToString());
cmd.Parameters.AddWithValue("@State", ddState.SelectedValue.ToString());
cmd.Parameters.AddWithValue("@Country", ddCountry.SelectedValue.ToString());
cmd.Parameters.AddWithValue("@Pincode", txtPincode.Text.ToString());
cmd.Parameters.AddWithValue("@Department", ddDepartment.SelectedValue.ToString());
cmd.Parameters.AddWithValue("@UserRole", ddRole.SelectedItem.Text.ToString());
cmd.Parameters.AddWithValue("@Passcode", txtpassword.Text.ToString());
cmd.Parameters.AddWithValue("@DOJ", txtDOJ.Text.ToString());
cmd.Parameters.AddWithValue("@ProfilePic", imgProfile.ImageUrl.ToString());
cmd.Parameters.AddWithValue("@EmailId", txtEmail.Text.ToString());
con.Open();
cmd.ExecuteNonQuery();
cmd.Dispose();
con.Close();
ShowAlert("Success", "Saved Successfully");
ClearFeilds();
}
else
ShowAlert("Error", "Please enter all fields");
}
catch (Exception ex)
{

}
}

public void ShowAlert(String header, String message)
{
ScriptManager.RegisterStartupScript(this, GetType(), header, "alert('" + message + "');", true);
}

public void ClearFeilds()
{
txtAddress.Text = "";
txtConfirmPassword.Text = "";
txtDOB.Text = "";
txtDOJ.Text = "";
txtEmail.Text = "";
txtEmpId.Text = "";
txtName.Text = "";
txtpassword.Text = "";
txtPincode.Text = "";
LoadCountries();
}
protected void ddCountry_SelectedIndexChanged(object sender, EventArgs e)
{
LoadState();
}

public void LoadState()
{
try
{
con = new SqlConnection(ConfigurationManager.ConnectionStrings["connectionstring"].ConnectionString);
cmd = new SqlCommand("Select * from tblStates where CountryId=@id", con);
cmd.Parameters.AddWithValue("@id", ddCountry.SelectedValue.ToString());
con.Open();
adapter = new SqlDataAdapter(cmd);
dt = new DataTable();
adapter.Fill(dt);

if (dt.Rows.Count > 0)
{
DataRow dr = dt.NewRow();
dr["StateId"] = 0;
dr["State"] = "Please select State";
dr["CountryId"] = "0";
dt.Rows.InsertAt(dr, 0);
ddState.Items.Clear();
ddState.ClearSelection();
ddState.DataSource = dt;
ddState.DataTextField = "State";
ddState.DataValueField = "StateId";
ddState.DataBind();
ddState.SelectedValue = null;
ddState.SelectedIndex = 0;
ddCity.Items.Clear();
ddCity.SelectedValue = null;
}
adapter.Dispose();
cmd.Dispose();
con.Close();
}
catch (Exception ex)
{
}
}

protected void ddState_SelectedIndexChanged(object sender, EventArgs e)
{
try
{
con = new SqlConnection(ConfigurationManager.ConnectionStrings["connectionstring"].ConnectionString);
cmd = new SqlCommand("Select * from tblCities where Stateid=@id", con);
cmd.Parameters.AddWithValue("@id", ddState.SelectedValue.ToString());
con.Open();
adapter = new SqlDataAdapter(cmd);
dt = new DataTable();
adapter.Fill(dt);

if (dt.Rows.Count > 0)
{
DataRow dr = dt.NewRow();
dr["CityId"] = 0;
dr["City"] = "Please select City";
dr["StateId"] = "0";
dt.Rows.InsertAt(dr, 0);
ddCity.DataSource = null;
ddCity.DataSource = dt;
ddCity.DataTextField = "City";
ddCity.DataValueField = "CityId";
ddCity.DataBind();
ddCity.SelectedValue = null;
ddCity.SelectedIndex = 0;
}
adapter.Dispose();
cmd.Dispose();
con.Close();
}
catch (Exception ex)
{
}
}

protected void btnUpload_Click(object sender, EventArgs e)
{
try
{
if (fileUpload.HasFile)
{
string fileName = Path.GetFileName(fileUpload.PostedFile.FileName);
fileUpload.PostedFile.SaveAs(Server.MapPath("~/ProfileImages/") + fileName);
imgProfile.ImageUrl = "~/ProfileImages/" + fileName;
}
}
catch (Exception ex)
{
lblInfo.Text = "Image upload: " + ex.Message.ToString();
}
}
}
}

Below is the Database script (MS SQL Database):

--Create database [ESource]
USE [ESource]

CREATE TABLE [dbo].[tblStates](
[StateId] [int] IDENTITY(1,1) NOT NULL,
[State] [nvarchar](50) NOT NULL,
[CountryId] [int] NOT NULL
)

CREATE TABLE [dbo].[tblCities](
[CityId] [int] IDENTITY(1,1) NOT NULL,
[City] [nvarchar](50) NOT NULL,
[StateId] [int] NOT NULL
)

CREATE TABLE [dbo].[tblCountries](
[CountryId] [int] IDENTITY(1,1) NOT NULL,
[Country] [nvarchar](50) NOT NULL
)

CREATE TABLE [dbo].[tblDepartments](
[DeptId] [int] IDENTITY(1,1) NOT NULL,
[Department] [nvarchar](50) NOT NULL
)

CREATE TABLE [dbo].[tblUserDetails](
[EId] [int] IDENTITY(1,1) NOT NULL,
[EmailId] [nvarchar](50) NOT NULL,
[UserId] [nvarchar](50) NULL,
[Passcode] [nvarchar](50) NOT NULL,
[Name] [nvarchar](50) NOT NULL,
[DOB] [datetime] NOT NULL,
[Address] [nvarchar](max) NOT NULL,
[City] [nvarchar](50) NOT NULL,
[Pincode] [int] NOT NULL,
[State] [nvarchar](50) NOT NULL,
[Country] [nvarchar](50) NOT NULL,
[Department] [nvarchar](50) NOT NULL,
[DOJ] [datetime] NOT NULL,
[ProfilePic] [nvarchar](max) NOT NULL,
[UserRole] [nvarchar](50)
)

CREATE procedure [dbo].[sp_AddUsers]
(
@UserId nvarchar(50),
@Passcode nvarchar(50),
@Name nvarchar(50),
@DOB datetime,
@Address nvarchar(max),
@City nvarchar(50),
@Pincode int,
@State nvarchar(50),
@Country nvarchar(50),
@Department nvarchar(50),
@DOJ datetime,
@ProfilePic nvarchar(max),
@UserRole nvarchar(50),
@EmailId nvarchar(50)
)
as begin
Insert into tblUserDetails
(UserId,Passcode,Name,DOB,Address,City,Pincode,State,Country,Department,DOJ,ProfilePic,UserRole,EmailId)
values
(@UserId,@Passcode,@Name,@DOB,@Address,@City,@Pincode,@State,@Country,@Department,@DOJ,@ProfilePic,@UserRole,@EmailId);
end
GO
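As an illustrative sketch (not part of the original script), sp_AddUsers can be invoked from the code-behind with ADO.NET. The connection string name follows the page code above; all parameter values are placeholders, and it is assumed the procedure's parameter types align with the tblUserDetails columns:

```csharp
// Hypothetical sketch: calling sp_AddUsers via ADO.NET (all values are placeholders).
// Requires System.Data, System.Data.SqlClient and System.Configuration.
using (var con = new SqlConnection(ConfigurationManager.ConnectionStrings["connectionstring"].ConnectionString))
using (var cmd = new SqlCommand("sp_AddUsers", con))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@UserId", "jdoe");
    cmd.Parameters.AddWithValue("@Passcode", "secret");
    cmd.Parameters.AddWithValue("@Name", "John Doe");
    cmd.Parameters.AddWithValue("@DOB", new DateTime(1990, 1, 1));
    cmd.Parameters.AddWithValue("@Address", "1 Main Street");
    cmd.Parameters.AddWithValue("@City", "London");
    cmd.Parameters.AddWithValue("@Pincode", 110001);
    cmd.Parameters.AddWithValue("@State", "Greater London");
    cmd.Parameters.AddWithValue("@Country", "UK");
    cmd.Parameters.AddWithValue("@Department", "IT");
    cmd.Parameters.AddWithValue("@DOJ", DateTime.Now);
    cmd.Parameters.AddWithValue("@ProfilePic", "~/ProfileImages/jdoe.jpg");
    cmd.Parameters.AddWithValue("@UserRole", "User");
    cmd.Parameters.AddWithValue("@EmailId", "jdoe@example.com");
    con.Open();
    cmd.ExecuteNonQuery();
}
```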



SQL Server 2014 Hosting - HostForLIFE.eu :: Transfer Database From SQL Server 2008 To 2014

clock September 27, 2018 11:52 by author Peter

There are lots of companies that use Microsoft services for creating and editing databases and table records. SQL Server is one of the most sought-after relational database management systems, and Microsoft keeps upgrading it to remove bugs and improve services. One may need to transfer a database from SQL Server 2008 to 2014 to keep up with the latest requirements, so it is best to fully understand the steps involved before performing the migration. There are several ways to perform this procedure without any data loss.

Different Ways to Transfer Database from SQL Server 2008 to 2014
Following is a snapshot of the methods one can opt for when moving databases:

Transfer Database using Backup and Restore

  • First, take a full backup of the database with all its instances.
  • Then, copy the backup to the target location.
  • Next, restore it on the destination server, specifying the WITH NORECOVERY option.
  • To migrate the SQL Server 2008 database to 2014 by overwriting a pre-existing database, use the WITH REPLACE option.
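The backup-and-restore steps above can be sketched in T-SQL; the database name and file paths below are placeholders:

```sql
-- Sketch: back up on the 2008 instance (path is a placeholder)
BACKUP DATABASE [ESource]
TO DISK = N'D:\Backup\ESource.bak' WITH INIT;

-- On the destination 2014 instance; add WITH MOVE clauses if the
-- data/log file paths differ, or WITH REPLACE to overwrite an existing database
RESTORE DATABASE [ESource]
FROM DISK = N'D:\Backup\ESource.bak'
WITH NORECOVERY;
```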

Move Database using Attach and Detach

  • First, detach the source Server by using the sp_detach_db stored procedure.
  • Then, copy the .mdf, .ldf and .ndf files to the destination computer.
  • Next, use the sp_attach_db stored procedure to attach the database to the target Server.
  • Browse to the location where the copied files are saved on the new machine.
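The detach/attach steps above look roughly like this in T-SQL; database name and file paths are placeholders:

```sql
-- Sketch: detach on the source server
EXEC sp_detach_db @dbname = N'ESource';

-- After copying the .mdf/.ldf (and any .ndf) files to the new machine:
EXEC sp_attach_db @dbname = N'ESource',
    @filename1 = N'D:\Data\ESource.mdf',
    @filename2 = N'D:\Data\ESource_log.ldf';
```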

Transfer using Import and Export Wizard
There is an inbuilt facility provided by Microsoft for SQL Server 2008 to 2014 migration: the Data Transformation Services Import and Export Data Wizard. It can transfer complete databases or selectively move objects to the destination database. It can be implemented by following the steps below:

  • First, go to SQL Server Management Studio on the source server and select the database to export.
  • Then, right-click on it and go to Tasks >> Copy Database Wizard.
  • Now, select the source and destination credentials and choose the appropriate settings.
  • Then, click Next, or schedule the SQL Server 2008 to 2014 migration for some other time.
  • Finally, click the Execute button to implement the changes.

Transfer SQL Server Scripts to Destination Server
  • First, launch SQL Server Management Studio on the source server.
  • Then, select the database and right-click on it.
  • Then, go to Tasks >> Generate Scripts Wizard (GSW).
  • Next, select the appropriate choice from the multiple options available.
  • Make sure that ‘script data = true’ is selected to move the data as well.
  • Then, select Next >> Next >> Finish.
  • Next, connect to the Database Server and create a new database in it.
  • Then, click the ‘New Query’ button on the navigation bar and paste the scripts generated by the GSW.
  • Finally, execute them on the destination database.

It is a smart decision to transfer a database from SQL Server 2008 to 2014, as it contributes to an organization's growth and technology upgrade needs. There are many ways to perform this migration, but they are not always easy to understand and implement without trouble; even technical professionals can use some help now and again. This post discussed all the manual means to migrate a SQL Server 2008 database to 2014. One can also go with SysTools SQL Server Database Migrator to transfer a SQL Server database from one server to another with minimal downtime in a few clicks.

HostForLIFE.eu SQL 2014 Hosting
HostForLIFE.eu revolutionized hosting with Plesk Control Panel, a Web-based interface that provides customers with 24x7 access to their server and site configuration tools. Plesk completes requests in seconds. It is included free with each hosting account. Renowned for its comprehensive functionality - beyond other hosting control panels - and ease of use, Plesk Control Panel is available only to HostForLIFE's customers. They offer a highly redundant, carrier-class architecture, designed around the needs of shared hosting customers.

 



European Entity Framework Core 1.0 Hosting - HostForLIFE.eu :: How to Access SQL from Entity Framework Core

clock January 12, 2017 08:33 by author Scott

ORMs like Entity Framework have always been shrouded in a bit of controversy from SQL purists who aren't in control of the types of queries that these tools are building behind the scenes. While this is a perfectly valid stance to take, these same tools can often assist in getting things done much faster than tinkering with an endless chain of JOINs, sub-queries, etc.

But what if you want to know exactly what is going on behind the scenes? What SQL is being generated by Entity Framework? And is it efficient enough to suit your needs, or do you need to handle writing some yourself?

This post will cover a quick tip that will allow you to see the SQL behind the scenes and judge for yourself using Entity Framework Core.

How Did This Work Prior to Entity Framework Core?

Previously, you could use Reflection to create an ObjectQuery object and then call the ToTraceString() method to retrieve the generated SQL, as seen below:

// Build a query using Entity Framework
var query = _context.Widgets.Where(w => w.IsReal && w.Id == 42); 
// Get the generated SQL
var sql = ((System.Data.Objects.ObjectQuery)query).ToTraceString(); 

And that's really it. The ToTraceString() call returns a string containing the entire SQL query to be executed.

Options for Entity Framework Core

The previous approach no longer works within the Entity Framework Core (EF7) world, so we have to resort to one of three options, which may vary depending on your needs:

  • Using Built-in or Custom Logging. Logging the executing query using your logger of choice or the built-in Logger in .NET Core as mentioned in this tutorial.
  • Using a Profiler. Using an SQL Profiler like MiniProfiler to monitor the executing query.
  • Using Crazy Reflection Code. You can implement some custom reflection code similar to the older approach to perform the same basic concept.
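As a brief sketch of the first option (EF Core 1.x era APIs; exact signatures vary between versions, so treat the calls below as assumptions rather than a definitive recipe):

```csharp
// Hypothetical sketch: attach a console logger to the context's internal service provider
// (requires Microsoft.EntityFrameworkCore.Infrastructure, Microsoft.Extensions.Logging
// and the Microsoft.Extensions.Logging.Console package)
var loggerFactory = _context.GetService<ILoggerFactory>();
loggerFactory.AddProvider(new ConsoleLoggerProvider((category, level) => true, true));
// Subsequent queries executed through _context will log their generated SQL to the console
```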

Since both of the first two options are fairly well documented, we will be focusing on the crazy reflection approach.

Getting Behind the Scenes in Entity Framework Core

Using the following snippets of code, which rely on Reflection to resolve information about the compiler, parser, database, and fields being targeted, we can use those things to reconstruct what is actually going on behind the scenes.

NOTE: Since this relies on Reflection, it may be subject to breaking in the future, especially with any API changes.

public static class IQueryableExtensions 
{
    private static readonly TypeInfo QueryCompilerTypeInfo =
    typeof(QueryCompiler).GetTypeInfo();

    private static readonly FieldInfo QueryCompilerField =
    typeof(EntityQueryProvider).GetTypeInfo().DeclaredFields.First
    (x => x.Name == "_queryCompiler");

    private static readonly PropertyInfo NodeTypeProviderField =
    QueryCompilerTypeInfo.DeclaredProperties.Single(x => x.Name == "NodeTypeProvider");

    private static readonly MethodInfo CreateQueryParserMethod =
    QueryCompilerTypeInfo.DeclaredMethods.First(x => x.Name == "CreateQueryParser");

    private static readonly FieldInfo DataBaseField =
    QueryCompilerTypeInfo.DeclaredFields.Single(x => x.Name == "_database");

    private static readonly FieldInfo QueryCompilationContextFactoryField =
    typeof(Database).GetTypeInfo().DeclaredFields.Single
    (x => x.Name == "_queryCompilationContextFactory");

    public static string ToSql<TEntity>
    (this IQueryable<TEntity> query) where TEntity : class
    {
        if (!(query is EntityQueryable<TEntity>)
        && !(query is InternalDbSet<TEntity>))
        {
            throw new ArgumentException("Invalid query");
        }

        var queryCompiler = (IQueryCompiler)QueryCompilerField.GetValue(query.Provider);
        var nodeTypeProvider =
        (INodeTypeProvider)NodeTypeProviderField.GetValue(queryCompiler);
        var parser = (IQueryParser)CreateQueryParserMethod.Invoke
        (queryCompiler, new object[] { nodeTypeProvider });
        var queryModel = parser.GetParsedQuery(query.Expression);
        var database = DataBaseField.GetValue(queryCompiler);
        var queryCompilationContextFactory =
        (IQueryCompilationContextFactory)QueryCompilationContextFactoryField.GetValue(database);
        var queryCompilationContext = queryCompilationContextFactory.Create(false);
        var modelVisitor =
        (RelationalQueryModelVisitor)queryCompilationContext.CreateQueryModelVisitor();
        modelVisitor.CreateQueryExecutor<TEntity>(queryModel);
        var sql = modelVisitor.Queries.First().ToString();

        return sql;
    }
}

And as far as actual usage goes, you would simply call the ToSql() method to return your SQL query string:

// Build a query using Entity Framework
var query = _context.Widgets.Where(w => w.IsReal && w.Id == 42); 
// Get the generated SQL
var sql = query.ToSql(); 



About HostForLIFE

HostForLIFE is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We have offered the latest Windows 2019 Hosting, ASP.NET 5 Hosting, ASP.NET MVC 6 Hosting and SQL 2019 Hosting.

