European Windows 2019 Hosting BLOG

BLOG about Windows 2019 Hosting and SQL 2019 Hosting - Dedicated to European Windows Hosting Customers

SQL Server Hosting - HostForLIFE :: Simple SELECT, INSERT, UPDATE and DELETE Using LINQ to SQL

clock April 27, 2021 08:57 by author Peter

Language-Integrated Query (LINQ) is a Microsoft .NET Framework component that adds native data-querying capabilities to .NET languages. In other words, LINQ can query any source of data (collections of objects, database tables, or XML files). We can easily retrieve data from any object that implements the IEnumerable<T> interface and from any provider that implements the IQueryable<T> interface.

Microsoft basically divides LINQ into the following three areas:

  • LINQ to Objects : queries performed against in-memory data.
  • LINQ to ADO.NET, which in turn covers:
      • LINQ to SQL (formerly DLinq) : queries performed against a relational database; only Microsoft SQL Server is supported.
      • LINQ to DataSet : supports queries using ADO.NET data sets and data tables.
      • LINQ to Entities : the Microsoft ORM solution.
  • LINQ to XML (formerly XLinq) : queries performed against an XML source.


LINQ to SQL
LINQ to SQL translates our actions to SQL and submits the changes to the database. Here we will perform Select, Insert, Update and Delete operations on a COURSE table.

Step 1: Create a COURSE Table in the database
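The exact table definition is not shown in the article; a minimal sketch, assuming an identity primary key plus the columns used by the code below (course_name, course_desc, modified_date), might look like this:

    CREATE TABLE COURSE
    (
        course_id INT IDENTITY(1,1) PRIMARY KEY,  -- assumed surrogate key
        course_name VARCHAR(100) NOT NULL,
        course_desc VARCHAR(500) NULL,
        modified_date DATETIME NULL
    )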

Step 2: Create a ContextData file using the Object Relational Designer:
Create a new item, select the LINQ to SQL classes (as shown in the following figure) and name it Operation.dbml.

After clicking the Add button the ContextData file is created. Now we should drag all the tables onto the left-hand side of the designer and save (as shown in the following figure). This will create all the mappings and settings for each table and their entities.


For .dbml files the database connection string is defined in the web.config file as:
    <connectionStrings>  
      <add name="DevelopmentConnectionString"  
           connectionString="Data Source=sandeepss-PC;Initial Catalog=Development;User ID=sa;Password=*******"  
           providerName="System.Data.SqlClient" />  
    </connectionStrings>


To create a DataContext object, we can either let it pick up the connection string from the web.config file or pass a connection string as a parameter to the DataContext class constructor.
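As a sketch of both options (assuming the DevelopmentConnectionString entry shown above; the generated DataContext class also exposes a constructor that accepts a connection string):

    // Option 1: the parameterless constructor uses the connection stored in the designer settings
    OperationDataContext context = new OperationDataContext();

    // Option 2: pass the connection string explicitly, e.g. read from web.config
    string conStr = System.Configuration.ConfigurationManager
        .ConnectionStrings["DevelopmentConnectionString"].ConnectionString;
    OperationDataContext contextWithConStr = new OperationDataContext(conStr);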

The SELECT Operation
    private void GetCourses()  
    {  
          //create DataContext object  
          OperationDataContext OdContext = new OperationDataContext();  
          var courseTable = from course in OdContext.GetTable<COURSE>() select course;  
          //grdCourse is gridview id  
          grdCourse.DataSource = courseTable;  
          grdCourse.DataBind();  
    }


The INSERT Operation
    private void AddNewCourse()  
    {  
          // DataContext object that maps to our database
          OperationDataContext OdContext = new OperationDataContext();  
          COURSE objCourse = new COURSE();  
          objCourse.course_name = "B.Tech";  
          objCourse.course_desc = "Bachelor Of Technology";  
          objCourse.modified_date = DateTime.Now;  
          // Adds the entity in a pending-insert state to the System.Data.Linq.Table<TEntity>; the parameter is the entity to be added
          OdContext.COURSEs.InsertOnSubmit(objCourse);  
          // executes the appropriate commands to implement the changes to the database  
          OdContext.SubmitChanges();  
    }

The UPDATE Operation
    private void UpdateCourse()  
    {  
          OperationDataContext OdContext = new OperationDataContext();  
          // Get the single course that needs to be updated  
          COURSE objCourse = OdContext.COURSEs.Single(course => course.course_name == "B.Tech");  
          // Field that will be updated
          objCourse.course_desc = "Bachelor of Technology";  
          // executes the appropriate commands to implement the changes to the database  
          OdContext.SubmitChanges();  
    }

The DELETE Operation
    private void DeleteCourse()  
    {  
          OperationDataContext OdContext = new OperationDataContext();  
          // Get the single course that needs to be deleted  
          COURSE objCourse = OdContext.COURSEs.Single(course => course.course_name == "B.Tech");  
          // Puts the entity into a pending-delete state; the parameter is the entity to be deleted
          OdContext.COURSEs.DeleteOnSubmit(objCourse);  
          // executes the appropriate commands to implement the changes to the database  
          OdContext.SubmitChanges();  
    }

To perform the select, insert, update and delete operations, we created a table and a data context class, in other words a .dbml file. In the designer view of this file we dragged and dropped the COURSE table from the Server Explorer. The data context class provides the object-to-table mapping: we perform the operations on the objects, and the database is updated accordingly when we call the SubmitChanges() method.




SQL Server Hosting - HostForLIFE :: SQL Server Performance Tuning Tips

clock April 19, 2021 07:18 by author Peter

In this article, we will learn about SQL Server performance tuning tips with examples.
 
Database
The database is the most important and powerful part of any application. If your database is not working properly and takes a long time to return results, something is wrong and the database needs tuning; otherwise, the performance of the whole application will degrade.

I know a lot of articles have already been published on this topic, but in this article I have tried to provide a list of database tune-up tips that cover all aspects of the database. Database tuning is a critical and finicky process. It is true that database tuning is a database administrator's task, but we should still have a basic level of knowledge, because if we are working on a project with no dedicated admin, it is our responsibility to maintain the performance of the database. If database performance degrades, it affects the whole system.
 
In this article, I will explain some basic database tuning tips that I learned from my own experience and from friends who work as database administrators. Using these tips, you can maintain or improve the performance of your database system. These tips are written for SQL Server, but most of them also apply to other databases, such as Oracle and MySQL. Please read them carefully and, at the end of the article, let me know if you find something wrong or incorrect.
 
Avoid NULL values in fixed-length fields
We should avoid NULL values in fixed-length fields because a NULL stored in a fixed-length field takes the same amount of space as a real value. If a field must allow NULL, use a variable-length field instead, which takes less space for NULL. Heavy use of NULLs can also reduce performance, especially in WHERE clauses. For example, prefer varchar or nvarchar over char and nchar when NULLs are expected.
Never use a SELECT * statement

When we need all the columns of a table, we usually write a "SELECT *" statement. This is not a good approach, because SQL Server has to expand the * into the full column list before executing the query, which takes extra time and effort, and it returns columns the application may not even need. So, always list the required column names in the query instead of using "SELECT *".
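For example (the Employee table and column names here are only illustrative):

    -- Avoid: returns every column, whether you need it or not
    SELECT * FROM dbo.Employee;

    -- Prefer: list only the columns the caller actually needs
    SELECT Emp_IId, First_Name, Last_Name FROM dbo.Employee;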
 
Normalize tables in a database
Normalized and well-managed tables increase the performance of a database. So, always try to normalize at least to third normal form (3NF). Not every table requires 3NF, but a table that is in 3NF can generally be called well structured.
 
Keep Clustered Index Small
A clustered index defines the physical order in which the table's rows are stored. If the clustered index key is very large, it can reduce performance, because the clustered key is also stored in every non-clustered index, so a large clustered key on a table with many rows increases the overall index size significantly. Also avoid indexing frequently changed columns, because every change to the table also has to modify the index, which can degrade performance.
 
Use Appropriate Datatype
Selecting an appropriate data type saves space and improves performance; an inappropriate one has the opposite effect. So, select the data type according to the requirement. SQL Server has many data types that can store the same kind of data, but each has its own limitations and advantages over the others, so choose carefully.
 
Store image path instead of the image itself
I have found that many developers store images in the database instead of the image path. It may be a requirement of the application to store images in the database, but in general we should store only the image path, because storing images in the database increases the database size and reduces performance.
 
USE Common Table Expressions (CTEs) instead of Temp table
We should prefer a CTE over a temp table because temp tables are stored physically in tempdb and live until they are dropped or the session ends, while a CTE is just a named query expression that exists only for the duration of the statement that uses it. A CTE is lightweight and is often faster than an equivalent temp table.
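As a sketch, the same intermediate result can be written both ways (the Employee and Orders tables are assumed for illustration):

    -- Temp table version: written to tempdb and must be dropped
    SELECT EmpId, COUNT(*) AS OrderCount
    INTO #EmpOrders
    FROM dbo.Orders
    GROUP BY EmpId;

    SELECT e.First_Name, t.OrderCount
    FROM dbo.Employee e JOIN #EmpOrders t ON e.EmpId = t.EmpId;

    DROP TABLE #EmpOrders;

    -- CTE version: a named query expression, valid only for the statement that follows it
    WITH EmpOrders AS
    (
        SELECT EmpId, COUNT(*) AS OrderCount
        FROM dbo.Orders
        GROUP BY EmpId
    )
    SELECT e.First_Name, c.OrderCount
    FROM dbo.Employee e JOIN EmpOrders c ON e.EmpId = c.EmpId;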
 
Use Appropriate Naming Convention
The main goal of adopting a naming convention for database objects is to make it easy for users to identify the type and purpose of every object in the database. A good naming convention decreases the time required to search for an object, and a good name clearly indicates what the object is and what it does.
    * tblEmployees // Name of table  
    * vw_ProductDetails // Name of View  
    * PK_Employees // Name of Primary Key  


Use UNION ALL instead of UNION
We should prefer UNION ALL to UNION because UNION always removes duplicate rows, which requires a sort and therefore extra time. Also, UNION cannot work with the text data type because text does not support sorting, so in that case UNION cannot be used at all. So, whenever duplicates are acceptable, prefer UNION ALL.
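For example (Customer and Supplier are illustrative tables):

    -- UNION removes duplicate rows, which requires an extra sort/distinct step
    SELECT City FROM dbo.Customer
    UNION
    SELECT City FROM dbo.Supplier;

    -- UNION ALL simply appends the two result sets, so it is cheaper when duplicates are acceptable
    SELECT City FROM dbo.Customer
    UNION ALL
    SELECT City FROM dbo.Supplier;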
 
Use Small data type for Index
It is very important to use a small data type for an index key, because a bigger data type reduces index performance. For example, nvarchar(10) uses 20 bytes of data while varchar(10) uses 10 bytes, so an index on the varchar column works better. Another example is datetime versus int: datetime takes 8 bytes and int takes 4 bytes. A smaller data type means less I/O overhead, which increases the performance of the index.
Use COUNT(1) instead of COUNT(*) and COUNT(Column_Name)

There is no real performance difference between COUNT(1) and COUNT(*); note, however, that COUNT(Column_Name) ignores NULL values in that column and can return a different result. The last two expressions are generally not considered good practice, so use COUNT(1) to get the number of records in a table.
 
Use Stored Procedure
Instead of sending raw (ad hoc) queries, we should use stored procedures, because stored procedures are faster, easier to maintain, and better for security, especially for large queries.
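A minimal sketch of wrapping a query in a stored procedure (the procedure, table, and column names are illustrative):

    CREATE PROCEDURE dbo.usp_GetEmployeeById
        @EmpId INT
    AS
    BEGIN
        SELECT EmpId, First_Name, Last_Name
        FROM dbo.Employee
        WHERE EmpId = @EmpId;
    END

The application then calls EXEC dbo.usp_GetEmployeeById @EmpId = 101 instead of sending the raw SELECT, so the plan can be reused and permissions can be granted on the procedure alone.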
 
Use Between instead of In
If BETWEEN can be used instead of IN, always prefer BETWEEN. For example, suppose you are searching for an employee whose id is either 101, 102, 103, or 104. You can write the query using the IN operator like this:
    Select * From Employee Where EmpId In (101,102,103,104)  

You can also use Between operator for the same query.
    Select * from Employee Where EmpId Between 101 And 104  

Use If Exists to determine the record
 
Developers often use "Select Count(*)" to check whether a record exists. For example:
    Declare @Count int;  
    Set @Count=(Select Count(*) From Employee Where EmpName Like '%Pan%')  
    If @Count>0  
    Begin  
        --Statements  
    End  

But this is not the proper way to write such a query, because it performs a complete table scan to count every matching row. You can use IF EXISTS instead, which stops as soon as the first match is found and therefore improves the performance of the query, as below.
    IF Exists(Select Emp_Name From Employee Where EmpName Like '%Pan%')  
    Begin  
    --Statements  
    End  


Never use the "sp_" prefix for user-defined stored procedures
Many programmers use the "sp_" prefix for user-defined stored procedures. I suggest never doing this, because in SQL Server the master database contains system stored procedures with the "sp_" prefix. When we create a stored procedure with the "sp_" prefix, SQL Server always looks in the master database first and only then in the user database, which takes some extra time.
 
Practice to use Schema Name
A schema is an organization or structure for a database: a collection of database objects that are owned by a single principal and form a single namespace. Specifying the schema name helps SQL Server find the object directly in that schema, which speeds up query execution. For example, use [dbo].TableName instead of just the table name.
 
Avoid Cursors
A cursor is a temporary work area created in system memory when a SQL statement is executed; it is a set of rows together with a pointer that identifies the current row, used to retrieve data from a result set one row at a time. However, using a cursor is usually a bad idea because row-by-row processing takes a long time. In many cases a cursor can be replaced with a set-based query, a temporary table, or a WHILE loop.
 
SET NOCOUNT ON
When an INSERT, UPDATE, DELETE, or SELECT command is executed, SQL Server returns a message with the number of rows affected by the query. Returning this message for every statement adds unnecessary network traffic, so we can turn it off with SET NOCOUNT ON.
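A minimal sketch (the procedure body is illustrative):

    CREATE PROCEDURE dbo.usp_UpdateCourseDescription
        @CourseName VARCHAR(100),
        @CourseDesc VARCHAR(500)
    AS
    BEGIN
        SET NOCOUNT ON;   -- suppresses the "(n rows affected)" messages
        UPDATE COURSE
        SET course_desc = @CourseDesc
        WHERE course_name = @CourseName;
    END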
 
Use Try–Catch
In T-SQL, a TRY-CATCH block is very important for exception handling. Proper use of TRY-CATCH in SQL can save our data from undesired changes: we put the T-SQL statements in the TRY block and the exception-handling code in the CATCH block.
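A minimal sketch (table and column names are illustrative):

    BEGIN TRY
        BEGIN TRANSACTION;
        UPDATE dbo.Employee SET Last_Name = 'Smith' WHERE EmpId = 101;
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;   -- undo the change if anything failed
        SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
    END CATCH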
 
Remove Unused Index
Remove all unused indexes, because every index must be maintained whenever the table is modified, even if no query ever uses it.
 
Always create an index on the table
An index is a data structure for fast data retrieval. Indexes are special lookup structures that the database engine can use to speed up data retrieval; in simple terms, an index is a pointer to data in a table. So always try to keep at least one index on each table, either clustered or non-clustered.
 
Use Foreign Key with the appropriate action
A foreign key is a column or combination of columns whose values match a primary key in a different table. Foreign keys are used to define a relationship and enforce integrity between two tables. Besides protecting the integrity of our data, FK constraints also document the relationships between tables within the database itself. Also define an action rule for the DELETE and UPDATE commands; you can choose among NO ACTION, SET NULL, CASCADE, and SET DEFAULT.
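For example, a sketch of a foreign key with an explicit delete rule (Orders and Customer are illustrative tables):

    ALTER TABLE dbo.Orders
    ADD CONSTRAINT FK_Orders_Customer
        FOREIGN KEY (CustomerId) REFERENCES dbo.Customer (CustomerId)
        ON DELETE CASCADE      -- deleting a customer also removes its orders
        ON UPDATE NO ACTION;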
 
Use Alias Name
An alias temporarily renames a table or a column within a specific SQL statement. Using aliases, we can give a short name to a long one, which makes queries shorter and easier to read and write.
 
Use Transaction Management
A transaction is a unit of work performed against the database: a set of T-SQL statements that execute together as a single unit in a specific logical order. If all the statements execute successfully, the transaction is committed and the data is saved in the database permanently. If any single statement fails, the entire transaction fails and is rolled back.
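A minimal sketch of two statements that must succeed or fail together (the Account table is illustrative; SET XACT_ABORT ON makes SQL Server roll the whole transaction back on a run-time error):

    SET XACT_ABORT ON;

    BEGIN TRANSACTION;
        UPDATE dbo.Account SET Balance = Balance - 100 WHERE AccountId = 1;
        UPDATE dbo.Account SET Balance = Balance + 100 WHERE AccountId = 2;
    COMMIT TRANSACTION;   -- both updates become permanent only here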
 
Use Index Name in Query
Although in most cases the query optimizer will pick the appropriate index for a specific table based on statistics, sometimes it is better to specify the index name in your SELECT query.
 
Example
    SELECT  
    e.Emp_IId,  
    e.First_Name,  
    e.Last_Name  
    FROM dbo.EMPLOYEE e  
    WITH (INDEX (Clus_Index))  
    WHERE e.Emp_IId > 5  

Select Limited Data
We should retrieve only the required data and ignore the unimportant data. The less data retrieved, the faster the query will run. Rather than filtering on the client, push as much filtering as possible to the server side; this results in less data being sent over the wire and you will see results much faster.
 
Drop Index before Bulk Insertion of Data

We should drop indexes before inserting a large amount of data. This makes the INSERT statements run faster. Once the inserts are complete, you can recreate the indexes.
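A sketch of the pattern (index and table names are illustrative):

    -- Drop the non-clustered index before the bulk load
    DROP INDEX IX_Employee_LastName ON dbo.Employee;

    -- ...perform the bulk insert here (BULK INSERT / INSERT ... SELECT)...

    -- Recreate the index once the load is finished
    CREATE NONCLUSTERED INDEX IX_Employee_LastName
        ON dbo.Employee (Last_Name);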
 
Use Unique Constraint and Check Constraint
A Check constraint checks for a specific condition before inserting data into a table. If the data passes all the Check constraints then the data will be inserted into the table otherwise the data for insertion will be discarded. The CHECK constraint ensures that all values in a column satisfy certain conditions.
 
A unique constraint ensures that each row has a unique value in that column. It is like a primary key, but it can accept one NULL value, and a table can have unique constraints on one or more columns. So we should use check constraints and unique constraints because they maintain the integrity of the database.
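For example, both constraints declared inline (the table definition is illustrative):

    CREATE TABLE dbo.ProductCatalog
    (
        ProductId   INT IDENTITY(1,1) PRIMARY KEY,
        ProductCode VARCHAR(20) NOT NULL UNIQUE,            -- unique constraint
        ProductCost DECIMAL(10,2) CHECK (ProductCost >= 0)  -- check constraint
    );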
 
Importance of Column Order in index
If we are creating a non-clustered index on more than one column, we should consider the sequence of the columns. The order or position of a column in the index key plays a vital role in improving SQL query performance: an index can help a query only if the query's criteria match the leftmost columns of the index key. So we should place the most selective column at the leftmost position of a non-clustered index.
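For example (index, table, and column names are illustrative):

    -- Most selective / most frequently filtered column first
    CREATE NONCLUSTERED INDEX IX_Employee_LastName_FirstName
        ON dbo.Employee (Last_Name, First_Name);

    -- This query can seek on the index because it filters on the leading column
    SELECT First_Name, Last_Name
    FROM dbo.Employee
    WHERE Last_Name = 'Smith';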
 
Recompiling Stored Procedures
We all know that stored procedures execute a set of T-SQL statements faster than the same statements executed individually, because the execution plan of a stored procedure is compiled once and then cached for reuse. Recompiling a stored procedure therefore costs performance, but in some cases recompilation is required, for example:

    Dropping or altering a column, index, or trigger of a table referenced by the procedure.
    Updating the statistics used by the execution plan of the stored procedure.
    Altering the procedure itself, which causes SQL Server to create a new execution plan.
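When a fresh plan really is needed, it can be requested explicitly; a sketch, reusing the illustrative procedure name from the earlier example:

    -- Mark an existing procedure so its plan is rebuilt on the next execution
    EXEC sp_recompile 'dbo.usp_GetEmployeeById';

    -- Or force a new plan on every execution (use sparingly)
    ALTER PROCEDURE dbo.usp_GetEmployeeById
        @EmpId INT
    WITH RECOMPILE
    AS
    BEGIN
        SELECT EmpId, First_Name, Last_Name
        FROM dbo.Employee
        WHERE EmpId = @EmpId;
    END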
 
Use Sparse Column
Sparse columns provide optimized storage for NULL values. If a column will contain a large proportion of NULLs, prefer a sparse column over a regular column (one declared without the SPARSE keyword): a sparse column takes no space for NULL values, at the cost of a little extra overhead for non-NULL values.
 
Example
    Create Table Table_Name  
    (  
        Id int,              -- regular column  
        Group_Id int Sparse  -- sparse column  
    )  


Avoid Loops In Coding
Suppose you want to insert 10 records into a table. Instead of using a loop that inserts the rows one by one, you can insert all the data with a single INSERT statement.
    declare @int int;  
    set @int=1;  
    while @int<=10  
    begin  
        Insert Into Tab values(@int,'Value'+CAST(@int AS varchar(10)));  
        set @int=@int+1;  
    end  

The above method is not a good approach for inserting multiple records; instead, you can use a single multi-row insert like the one below.
    Insert Into Tab values(1,'Value1'),(2,'Value2'),(3,'Value3'),(4,'Value4'),(5,'Value5'),(6,'Value6'),(7,'Value7'),(8,'Value8'),(9,'Value9'),(10,'Value10');  

Avoid Correlated Queries
In a correlated subquery, the inner query takes input from the outer (parent) query, so the inner query runs once for every row of the outer query, which reduces the performance of the database.
    Select cs.Name, cs.City,  
           (Select co.Company_Name  
            from Company co  
            where co.CustomerId = cs.CustomerId) As Company_Name  
    from Customer cs  


A better approach is to use a join instead of the correlated subquery, as below.
    Select cs.Name, cs.City, co.Company_Name  
    from Customer cs  
    Join Company co on cs.CustomerId = co.CustomerId  


Avoid index and join hints
In some cases an index or join hint may increase the performance of a query, but once you provide a hint the server will always try to use it, even when a better execution plan exists, so this approach can reduce database performance. Use a join or index hint only if you are confident there is no better execution plan; if you have any doubt, let the server choose the execution plan.
 
Avoid Use of Temp table
Avoid temp tables as much as you can, because a temp table is created in tempdb just like a regular table and must be dropped when the task is complete, which adds load on the server. In many cases you can use a table variable instead.
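A minimal sketch of a table variable standing in for a temp table (the Orders table is illustrative):

    DECLARE @RecentOrders TABLE
    (
        OrderId    INT,
        CustomerId INT,
        OrderDate  DATETIME
    );

    INSERT INTO @RecentOrders (OrderId, CustomerId, OrderDate)
    SELECT OrderId, CustomerId, OrderDate
    FROM dbo.Orders
    WHERE OrderDate >= DATEADD(DAY, -7, GETDATE());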
 
Use Index for required columns
Indexes should be considered for the columns used in WHERE, GROUP BY, ORDER BY, TOP, and DISTINCT clauses.
 
When not to use an index
It is true that an index speeds up data retrieval, but not always. In some cases an index does not improve query performance, and we can avoid creating one:

    When the size of the table is very small.
    When the index is not used by the query optimizer.
    When DML operations (INSERT, UPDATE, DELETE) on the table are very frequent.
    When the column contains TEXT or NTEXT data.

Use View for complex queries
If you join two or more tables and the result of the query is needed frequently, it is better to create a View that contains the result of the complex query. You can then use this View multiple times without having to write and execute the query again to get the same result.
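A sketch, reusing the illustrative Customer/Company join from the correlated-query example above:

    CREATE VIEW dbo.vw_CustomerCompany
    AS
    SELECT cs.Name, cs.City, co.Company_Name
    FROM dbo.Customer cs
    JOIN dbo.Company co ON cs.CustomerId = co.CustomerId;
    GO

    -- Reuse it like a table
    SELECT Name, Company_Name
    FROM dbo.vw_CustomerCompany
    WHERE City = 'Berlin';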
 
Make Transaction short
It is better to keep transactions as short as possible, because a long transaction keeps tables locked longer and reduces database concurrency. So, always try to make transactions shorter.
 
Use Full-text Index
If your query performs wildcard text searches using LIKE ('%...%'), a full-text index can increase performance. Full-text queries can include simple words and phrases or multiple forms of a word or phrase. A full-text query returns any document that contains at least one match (also known as a hit); a match occurs when a target document contains all the terms specified in the full-text query and meets any other search conditions, such as the distance between the matching terms.
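As a sketch, assuming a Document table that already has a full-text index on its Body column:

    -- LIKE with a leading wildcard cannot use a regular index
    SELECT DocId, Title FROM dbo.Document WHERE Body LIKE '%performance%';

    -- With a full-text index in place, CONTAINS can answer the same kind of search much faster
    SELECT DocId, Title FROM dbo.Document WHERE CONTAINS(Body, 'performance');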
 
Thanks for reading the article. As I said at the start, if you have any doubts or if I wrote something incorrect, let me know in the comments section.


 



SQL Server Hosting - HostForLIFE :: How To Insert SharePoint List Item To MS SQL Server Using PowerShell?

clock April 15, 2021 08:49 by author Peter

This article demonstrates how to insert SharePoint Online list items into an on-premises Microsoft SQL table using PowerShell. It starts with an introduction to the Get-PnPListItem command, available in the SharePointPnPPowerShellOnline module, which reads list items from a SharePoint list. It then shows how to build the SQL query and use Invoke-Sqlcmd to create a row in the SQL table for each item. Finally, the article discusses updating the MovedToSQL column of each SharePoint list item so that the next run of the PowerShell script does not insert the same items into the SQL table again.
 
This PowerShell script can be used either as a one-time activity to insert SharePoint list items into a SQL table, or as a scheduled job that keeps inserting new list items into the Microsoft SQL table.
 
Pre-requisites (Environment Details)
    Windows PowerShell
    SharePointPnpPowerShellOnline Module

Please install the SharePointPnpPowerShellOnline module if it’s not already present using the below command.

    Install-Module SharePointPnPPowerShellOnline  

SharePoint List sample columns with data


Microsoft SQL Server table without data


Variable Explanations in this Article
    $MSSQLServerInstanceName="ContosoServer\InstanceName" - on-premise SQL Server Instance
    $DatabaseName="DBTest" - on-premise SQL Database Name
    $TableName="Product" - SQL Table Name
    $O365ServiceAccount="[email protected]" - Your Service Account Name
    $O365ServiceAccountPwd="abc@2020" - Your Service Account password
    $siteURL="https://abc.sharepoint.com/sites/test" - SharePoint Site URL from where the item will be read
    $List="Product" - List Name


Here you can see that we have provided the password in plain text, which is needed if you want this PowerShell script to run automatically through a timer job or scheduler.
 
For manual execution, please use the Get-Credential command to read the user name and password from user input.
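For example, a short sketch of the interactive alternative:

    # Prompt for the user name and password instead of hard-coding them
    $PSCredentials = Get-Credential
    Connect-PnPOnline -Url $siteURL -Credentials $PSCredentials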
 
Read SharePoint List Items
The SharePointPnPPowerShellOnline module of PowerShell has made developers' lives easier. To read all the list items from a list, use the code snippet below.
 
Please refer to the documentation for all the parameters associated with Get-PnPListItem.

    $Items=Get-PnPListItem -List $List  

Here the $Items variable will hold all the items from the list.
 
Before reading the list, you should connect to the SharePoint site using the snippet below:

    [SecureString]$SecurePass = ConvertTo-SecureString $O365ServiceAccountPwd -AsPlainText -Force    
    [System.Management.Automation.PSCredential]$PSCredentials = New-Object System.Management.Automation.PSCredential($O365ServiceAccount, $SecurePass)    
    Connect-PnPOnline -Url $siteURL -Credentials $PSCredentials     

Create SQL Query and Invoke-SQLcmd
 
Once the PowerShell script has read all the items of the list and stored them in a variable, the next step is to build an INSERT query from each list item's column values and insert the rows into the SQL table one by one in a foreach loop using Invoke-Sqlcmd.
 
If Invoke-Sqlcmd is not available in your PowerShell session, please install the SqlServer PowerShell module using the command below:

    Install-Module -Name SqlServer  

The snippet below reads the items from the list item collection variable $Items and builds an INSERT statement for each item that has not yet been moved to SQL.
    #Loop through the items  
    foreach($Item in $Items) {  
        $MovedToSQL = $Item["MovedToSQL"]  
        #Check if the item has already been moved to SQL  
        if ($MovedToSQL -ne "Yes") {  
            #Read the column values of the current item  
            $productName = $Item["ProductName"]  
            $productDescription = $Item["ProductDescription"]  
            $productCost = $Item["ProductCost"]  
            $productMake = $Item["ProductMake"]  
            #Build the insert query for the SQL table (dbo schema assumed)  
            $insertquery = "  
            INSERT INTO [$DatabaseName].[dbo].[$TableName]  
                ([ProductName], [ProductDescription], [ProductCost], [ProductMake])  
            VALUES ('$productName', '$productDescription', '$productCost', '$productMake')"  
            #(the Invoke-Sqlcmd call and the closing braces are shown in the complete script below)  


Then call Invoke-Sqlcmd to insert the item into the SQL table using the query generated in the snippet above:
    Invoke-SQLcmd -ServerInstance $MSSQLServerInstanceName -query $insertquery -Database $DatabaseName  

Update MovedToSQL Column of the SharePoint List
 
This step is necessary to avoid inserting the same data into the SQL table twice. Once an item has been inserted into the SQL table, update the "MovedToSQL" column of the SharePoint list item to "Yes" using the Set-PnPListItem command.
 
Below is the code snippet to update the list item
    Set-PnPListItem -List $List -Identity $Item.Id -Values @{ "MovedToSQL"="Yes"}  

Complete PowerShell script
    #This script pulls the SharePoint Online list data and moves it into SQL  
    #created by Vinit Kumar  
    #SQL database information - variables - please change  
    $MSSQLServerInstanceName = "ContosoServer\InstanceName"  
    $DatabaseName = "DBTest"  
    $TableName = "Product"  
    #SharePoint variables - please change  
    $O365ServiceAccount = "[email protected]"  
    $O365ServiceAccountPwd = "abc@2020"  
    $siteURL = "https://abc.sharepoint.com/sites/test"  
    $List = "Product"  
    #Connect to SharePoint Online  
    [SecureString]$SecurePass = ConvertTo-SecureString $O365ServiceAccountPwd -AsPlainText -Force  
    [System.Management.Automation.PSCredential]$PSCredentials = New-Object System.Management.Automation.PSCredential($O365ServiceAccount, $SecurePass)  
    Connect-PnPOnline -Url $siteURL -Credentials $PSCredentials  
    #Get SharePoint list items  
    $Items = Get-PnPListItem -List $List  
    #Loop through the items  
    foreach($Item in $Items) {  
        $MovedToSQL = $Item["MovedToSQL"]  
        #Check if the item has already been moved to SQL  
        if ($MovedToSQL -ne "Yes") {  
            #Read the column values of the current item  
            $productName = $Item["ProductName"]  
            $productDescription = $Item["ProductDescription"]  
            $productCost = $Item["ProductCost"]  
            $productMake = $Item["ProductMake"]  
            #Build the insert query for the SQL table (dbo schema assumed)  
            $insertquery = "  
            INSERT INTO [$DatabaseName].[dbo].[$TableName]  
                ([ProductName], [ProductDescription], [ProductCost], [ProductMake])  
            VALUES ('$productName', '$productDescription', '$productCost', '$productMake')"  
            #Invoke-Sqlcmd to insert the item into the SQL table  
            Invoke-Sqlcmd -ServerInstance $MSSQLServerInstanceName -Query $insertquery -Database $DatabaseName  
            #Update the SharePoint list once the item is moved: set the column "MovedToSQL" = "Yes"  
            Set-PnPListItem -List $List -Identity $Item.Id -Values @{ "MovedToSQL" = "Yes" }  
        }  
    }  


Result after execution of the script
Once the PowerShell script has executed, the SharePoint list and the SQL table are updated as shown below:

SharePoint List Output

SQL Table Output





