European Windows 2019 Hosting BLOG

BLOG about Windows 2019 Hosting and SQL 2019 Hosting - Dedicated to European Windows Hosting Customers

European SQL Server 2022 Hosting :: How to Secure Remote Access to SQL Server?

clock March 6, 2024 07:38 by author Peter

Securing remote access to SQL Server is crucial for maintaining data security, integrity, and availability. With the rise of remote work and the growing reliance on cloud-based services, safeguarding SQL Server against unauthorized access is more important than ever. Implementing strong security measures helps protect sensitive data and prevent cyber attacks.

In this post, we will look at numerous approaches for securing remote access to SQL Server, including examples.

Why is it necessary to secure SQL Server remote access?
SQL Server stores critical information for businesses, making it a prime target for cyber attacks. Here are some reasons why safeguarding remote access is important:

  • Data Protection: Unauthorized access can result in data breaches, exposing sensitive information.
  • Compliance Requirements: Many sectors have unique compliance rules (such as GDPR and HIPAA) that require data protection procedures.
  • Preventing Attacks: Securing remote access helps to protect against SQL injection, ransomware, and other cyber attacks.

1. Using a Virtual Private Network (VPN): A VPN creates an encrypted tunnel between remote clients and the network hosting SQL Server, so the database is never exposed directly to the internet. Example:

  • Require remote users and applications to connect through the corporate VPN before they can reach the SQL Server instance.

2. Implementing Firewall Rules: Firewalls act as a barrier between the internet and your SQL Server, allowing only authorized traffic to reach specific ports. Example:

  • Set up firewall rules to restrict access to SQL Server ports (usually TCP port 1433 for the Database Engine and UDP port 1434 for the SQL Server Browser service).
  • Whitelist specific IP addresses or IP ranges that are permitted to access SQL Server remotely.

3. Enabling Encryption: Enabling encryption ensures that data transmitted between the client and server remains secure and cannot be easily intercepted. Example:

  • Configure SQL Server to use SSL/TLS encryption for connections.
  • Obtain and install SSL certificates to encrypt data transmission.
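As a quick check after enabling encryption, the following query (a minimal sketch using the sys.dm_exec_connections view) shows whether the current connections are actually encrypted:

-- List current connections and whether their traffic is encrypted
SELECT session_id,
       net_transport,
       auth_scheme,
       encrypt_option   -- TRUE when the connection is using SSL/TLS
FROM sys.dm_exec_connections;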

4. Strong Authentication and Authorization: Implement robust authentication mechanisms and limit user access to only necessary databases and functions. Example:

  • Use Windows Authentication mode or implement strong SQL Server authentication.
  • Grant least privilege access to users, assigning permissions based on their roles or tasks.
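For illustration, here is a minimal sketch of granting least-privilege access; the login, password, and database names are hypothetical and should be replaced with your own:

-- Hypothetical login and database names; adjust to your environment
CREATE LOGIN ReportingUser WITH PASSWORD = 'Use-A-Strong-Password-Here!1';
GO
USE SalesDB;
GO
CREATE USER ReportingUser FOR LOGIN ReportingUser;
-- Grant only what the role or task requires (read-only access in this sketch)
ALTER ROLE db_datareader ADD MEMBER ReportingUser;
GO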

5. Multi-Factor Authentication (MFA): MFA adds an extra layer of security by requiring users to provide multiple forms of verification before accessing SQL Server remotely. Example:

  • Integrate MFA solutions like Azure MFA, Google Authenticator, or DUO Security with SQL Server authentication.

6. Regular Software Updates and Patch Management: Keeping the SQL Server software up-to-date with the latest security patches is critical to address known vulnerabilities and security flaws. Example:

  • Establish a routine for checking and applying updates released by Microsoft for SQL Server. This can be automated using tools like Windows Server Update Services (WSUS) to ensure timely application of patches and updates.
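To verify which build and servicing level an instance is currently running (for example, after applying a patch), a quick check along these lines can be used:

-- Check the version, servicing level, and edition of the instance
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,
       SERVERPROPERTY('Edition')        AS Edition;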

Conclusion
Securing remote access to SQL Server involves implementing a combination of measures to fortify the server against potential threats. By utilizing VPNs, firewalls, encryption, robust authentication, and authorization mechanisms, organizations can mitigate risks and ensure the safety of their critical data. Regularly updating and reviewing security protocols is crucial in maintaining a strong defense against evolving cyber threats.

HostForLIFEASP.NET SQL Server 2022 Hosting

 



European SQL Server 2022 Hosting :: How to Open, Read, and View MS SQL Error Log Files?

clock March 1, 2024 06:07 by author Peter

The Microsoft SQL Server application is a dominant force in the field of relational database management systems, capable of managing large databases in a structured manner. Nonetheless, some users face difficulties while trying to view, read, or open SQL Server log files. Undoubtedly, the MS SQL Server database is praised by both novices and seasoned specialists for its powerful features.

Where is the SQL Server Error Log File located?
The SQL Server error log is usually kept in the log directory of the SQL Server instance's installation folder. The error log's default location changes according to the SQL Server version and installation configuration.

For most installations, the SQL Server error log is located in one of the following default locations:

SQL Server 2019

C:\Program Files\Microsoft SQL Server\MSSQL15.MSSQLSERVER\MSSQL\Log\ERRORLOG

Replace MSSQL15.MSSQLSERVER with the appropriate version and instance folder (for example, MSSQL16.MSSQLSERVER for SQL Server 2022 or MSSQL14.MSSQLSERVER for SQL Server 2017) if you have named instances or if the default instance name differs.

SQL Server 2016

C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Log\ERRORLOG
Replace MSSQL13.MSSQLSERVER with the appropriate instance name if needed.

SQL Server 2014
C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Log\ERRORLOG

SQL Server 2012
C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\ERRORLOG

SQL Server 2008 and 2008 R2
C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Log\ERRORLOG
For SQL Server 2008 R2, the version folder is MSSQL10_50.MSSQLSERVER rather than MSSQL10.MSSQLSERVER.

If SQL Server was installed in a custom location, the error log could be in a different directory. To find out where the error log is, review the SQL Server configuration or installation settings. You can also inspect the error log contents using SQL Server Management Studio (SSMS) or T-SQL queries, which allows for easier access to the log data without having to locate the physical log file.

Best Methods to Open, Read, or View MS SQL Error Log File
Opening, reading, and viewing the Microsoft SQL Server error log file can be done using various methods. Here are some of the best methods:

1. Using SQL Server Management Studio (SSMS)
SQL Server Management Studio provides a graphical interface for managing SQL Server instances, including viewing error logs. Here's how to access the error logs in SSMS:

  • Open SQL Server Management Studio and connect to the SQL Server instance.
  • In the Object Explorer, expand the server node, then expand Management, and finally expand SQL Server Logs.
  • You will see a list of error logs. Double-click on an error log to view its contents.

2. Using SQL Server Configuration Manager
SQL Server Configuration Manager is a tool provided by Microsoft to manage SQL Server services and configurations. You can also use it to view error logs:

  • Open SQL Server Configuration Manager.
  • Click on SQL Server Services in the left pane to display a list of SQL Server instances.
  • Right-click on a SQL Server instance and select View Error Logs from the context menu.


3. Using Windows Event Viewer
SQL Server error logs are also written to the Windows Event Viewer. You can access the Event Viewer by:

  • Pressing Win + R to open the Run dialog.
  • Typing eventvwr.msc and pressing Enter.
  • In the Event Viewer, navigate to Windows Logs > Application, and look for events with the source "MSSQLSERVER".

4. Using the Aryson SQL Log Analyzer Tool
The Aryson SQL Log Analyzer allows you to read, analyze, and recover data from SQL Server transaction logs. It supports reading online and offline transaction logs, auditing changes, and rolling back transactions.

5. Using T-SQL Queries

You can also read the SQL Server error log using Transact-SQL (T-SQL). The log is exposed through the system stored procedure sp_readerrorlog (a wrapper around the extended procedure xp_readerrorlog). Here's an example query:
EXEC sp_readerrorlog

This will display the current SQL Server error log. You can specify additional parameters to retrieve error logs from specific dates or with specific search criteria.
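For instance, the following call (the parameter values here are illustrative) reads the current error log and filters it for entries containing a search string; the first parameter is the log number (0 = current), the second is the log type (1 = SQL Server error log, 2 = SQL Server Agent log), and the third is the search text:

EXEC sp_readerrorlog 0, 1, 'Login failed';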

6. Using PowerShell

You can use PowerShell to read SQL Server error logs as well. Here's an example command:
Get-Content "C:\Program Files\Microsoft SQL Server\MSSQL15.MSSQLSERVER\MSSQL\Log\ERRORLOG"

Replace the file path with the actual path to your SQL Server error log file. Choose the method that best suits your preferences and requirements for accessing and viewing SQL Server error logs; each method offers a different level of flexibility and ease of use.

HostForLIFEASP.NET SQL Server 2022 Hosting

 



European SQL Server 2022 Hosting :: How Can I Find Blocked Deadlocks in SQL?

clock February 23, 2024 06:20 by author Peter

SQL Query
SELECT
    r.session_id AS BlockedSessionID,
    r.blocking_session_id AS BlockingSessionID,
    DB_NAME(r.database_id) AS DatabaseName,
    st.text AS BlockedSQLText,
    r.command AS BlockedCommand,
    r.wait_type AS WaitType,
    r.wait_time AS WaitTimeMs,
    r.start_time AS BlockedRequestStartTime,
    CASE WHEN r.transaction_id IS NULL THEN 'Not In Transaction' ELSE 'In Transaction' END AS BlockedInTransaction,
    es.login_name AS BlockedUser,
    es.host_name AS BlockedHostName,
    es.program_name AS BlockedProgram,
    ec.client_net_address AS BlockedClientAddress,
    bs.login_name AS BlockingUser,
    bs.host_name AS BlockingHostName,
    bs.program_name AS BlockingProgram
FROM
    sys.dm_exec_requests AS r
JOIN
    sys.dm_exec_sessions AS es ON r.session_id = es.session_id
JOIN
    sys.dm_exec_connections AS ec ON es.session_id = ec.session_id
LEFT JOIN
    sys.dm_exec_sessions AS bs ON r.blocking_session_id = bs.session_id
CROSS APPLY
    sys.dm_exec_sql_text(r.sql_handle) AS st
WHERE
    r.session_id != @@SPID
    AND r.blocking_session_id != 0
ORDER BY
    r.blocking_session_id;

Let's walk through the dynamic management views and functions this query relies on and how each one contributes.

sys.dm_exec_requests

sys.dm_exec_requests is not a table in the traditional sense; rather, it's a Dynamic Management View (DMV) provided by Microsoft SQL Server.

  • It is a useful tool for database administrators and developers to monitor and manage the current requests or sessions on the SQL Server instance.
  • This dynamic management view is particularly useful for monitoring and debugging database performance, recognizing long-running queries, discovering blocking issues, and determining server resource use. It provides significant insights into the present activity on the SQL Server instance, helping database administrators and developers to manage and modify database processes efficiently.

Overall, sys.dm_exec_requests is an essential tool for real-time performance monitoring and troubleshooting in SQL Server. It enables database administrators and developers to monitor server activity and take proactive steps to improve performance and stability.

sys.dm_exec_sessions
sys.dm_exec_sessions is another Dynamic Management View (DMV) offered by Microsoft SQL Server.

It contains information about all active sessions on the SQL Server instance, including user sessions, system sessions, and internal background processes.
The goal of sys.dm_exec_sessions is to offer a full picture of the sessions currently connected to the SQL Server instance, together with the attributes associated with each one.
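A simple query such as the following sketch lists the active user sessions along with the login, host, and application that opened them:

-- Active user sessions and where they come from
SELECT session_id,
       login_name,
       host_name,
       program_name,
       status
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;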

Overall, sys.dm_exec_sessions is a useful tool for monitoring and managing SQL Server sessions. Querying this DMV provides admins with insights into session activity, resource use, blocking circumstances, and transactional behavior, allowing them to efficiently optimize server performance and handle issues.

sys.dm_exec_connections
Microsoft SQL Server has a Dynamic Management View (DMV) called sys.dm_exec_connections.

It holds information on the current connections to the SQL Server instance, such as client connections, protocols, and network address.

The purpose of sys.dm_exec_connections is to give administrators and developers insight into the active connections to a SQL Server instance.
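As a small sketch, the following query lists each connection together with its client address and network transport:

-- Client connections, their network protocol, and the address they connect from
SELECT session_id,
       client_net_address,
       net_transport,
       connect_time
FROM sys.dm_exec_connections;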
Overall, sys.dm_exec_connections is an effective tool for monitoring and controlling client connections to SQL Server instances. Querying this DMV provides administrators with insights on connection attributes, network protocols, resource use, and session associations, allowing them to effectively diagnose connectivity issues and enhance server performance.

sys.dm_exec_sql_text
sys.dm_exec_sql_text is a Dynamic Management Function (DMF) that Microsoft SQL Server provides. It retrieves the text of SQL statements that are presently being executed or have been executed recently. This feature is especially useful for monitoring and troubleshooting purposes.
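Because sys.dm_exec_sql_text takes a sql_handle as input, it is typically combined with another DMV via CROSS APPLY, as in this small sketch that shows the SQL text of every currently executing request:

SELECT r.session_id,
       st.text AS executing_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS st
WHERE r.session_id <> @@SPID;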

Overall, sys.dm_exec_sql_text is an effective monitoring and troubleshooting tool for SQL Server instances. It gives information about the SQL statements that are being executed, allowing administrators to identify performance issues, optimize queries, and assure the database's security and reliability.

Attention

Dynamic management views (DMVs) and dynamic management functions (DMFs) provide server state information that can be used to monitor a server instance's health, identify problems, and optimize performance.

Deadlock Monitor 

Each dynamic management view or function is listed below with its description and its role in finding blocking and deadlocks:

  • sys.dm_exec_requests: Provides information about each request that is currently executing or waiting for execution in SQL Server. Analyzing the blocking_session_id column identifies the sessions involved in the blocking or deadlock chain.
  • sys.dm_exec_sessions: Returns one row per authenticated session on SQL Server, including session ID, login name, hostname, and program name. Useful for obtaining details about the sessions involved, so the users or applications causing the problem can be identified.
  • sys.dm_exec_connections: Provides information about the connections established to SQL Server, including session ID and client IP address. Helpful for identifying the client connections associated with the affected sessions and tracing the source of the deadlock.
  • sys.dm_exec_sql_text: Returns the text of SQL statements, given a sql_handle. Used to retrieve the SQL text of the queries in the blocking chain so the statements causing the problem can be analyzed and resolved.

HostForLIFEASP.NET SQL Server 2022 Hosting


 



European SQL Server 2022 Hosting :: Mastering T-SQL Rank Functions

clock February 7, 2024 07:31 by author Peter

SQL window functions are strong instruments for complex data manipulation and analysis. The four primary window functions in T-SQL are ROW_NUMBER, RANK, DENSE_RANK, and NTILE. Within result sets, these functions aid in ranking, sorting, and organizing data. Let's examine each function in more detail, learning about its purpose and practical applications.

To comprehend the ideas, let's look at the table below.

CREATE TABLE StudentMarks (
    StudentID INT,
    Subject VARCHAR(50),
    Marks INT
);

Adding a few sample records in the above table.

Student ID Subject Marks
1 Mathematics 90
2 Science 85
3 History 75
4 Mathematics 90
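For reference, the sample rows shown above can be inserted with statements like these (values taken directly from the table):

INSERT INTO StudentMarks VALUES (1, 'Mathematics', 90);
INSERT INTO StudentMarks VALUES (2, 'Science', 85);
INSERT INTO StudentMarks VALUES (3, 'History', 75);
INSERT INTO StudentMarks VALUES (4, 'Mathematics', 90);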

1. ROW_NUMBER

ROW_NUMBER is a window function that assigns a unique sequential number to each row within a partition of a result set, based on the specified ordering. Numbering starts at 1 and increases by one for every row, without gaps.

Use Cases: ROW_NUMBER is frequently used to filter the top N rows in a partition, discover duplicates, and create pagination.

Example

SELECT *,
       ROW_NUMBER() OVER (ORDER BY Marks DESC) AS RowNumber
FROM StudentMarks;


It will return the output.

StudentID Subject Marks RowNumber
4 Mathematics 90 1
1 Mathematics 90 2
2 Science 85 3
3 History 75 4

2. RANK

RANK is another window function that assigns a rank to each row within a partition of a result set, based on a specified ordering. Rows with equal values receive the same rank, and gaps are left in the ranking sequence after ties.

Use Cases: RANK is often used when ties are acceptable in a ranking, such as grading students according to exam scores.

Example:

SELECT *,
       RANK() OVER (ORDER BY Marks DESC) AS Rank
FROM StudentMarks;

It will return the output.

StudentID Subject Marks Rank
4 Mathematics 90 1
1 Mathematics 90 1
2 Science 85 3
3 History 75 4

3. DENSE_RANK
DENSE_RANK is similar to RANK but differs in that it assigns consecutive ranks to rows with equal values, without leaving gaps in the ranking sequence. It ensures that ranks are assigned in a continuous, sequential manner.

Use Cases: DENSE_RANK is preferred when consecutive ranking without gaps is desired, such as ranking products by sales performance.

Example
SELECT *,
       DENSE_RANK() OVER (ORDER BY Marks DESC) AS DenseRank
FROM StudentMarks;

It will return the output.

StudentID Subject Marks DenseRank
4 Mathematics 90 1
1 Mathematics 90 1
2 Science 85 2
3 History 75 3

4. NTILE
NTILE is a window function that divides the result set into a specified number of roughly equal-sized buckets or partitions, assigning each row to one of these buckets. The function ensures that the size difference between buckets is minimized.

Use Cases: NTILE is commonly used for data segmentation and percentile calculations, such as dividing customers into groups based on their income.

Example
SELECT *,
       NTILE(4) OVER (PARTITION BY SUBJECT ORDER BY Marks DESC) AS Student_Group
FROM StudentMarks;

It will return the output.

StudentID Subject Marks Student_Group
4 Mathematics 90 1
1 Mathematics 90 2
2 Science 85 1
3 History 75 1

In summary
With SQL Server, window functions like ROW_NUMBER, RANK, DENSE_RANK, and NTILE give analysts and developers strong capabilities for data analysis, ranking, and partitioning. Through the appropriate utilization of these functionalities, users can accomplish intricate analytical tasks, acquire a deeper understanding of their data, and retrieve important information from their databases. By being aware of the subtleties and functionalities of each window function, SQL practitioners can fully utilize T-SQL for complex data manipulation and analysis activities.

HostForLIFEASP.NET SQL Server 2022 Hosting

 



European SQL Server 2022 Hosting :: Enhance the Performance of SQL Databases

clock February 2, 2024 07:30 by author Peter

In the dynamic realm of database management, making sure that SQL performance is optimized is essential to the smooth and effective operation of applications. Delivering a responsive and scalable application depends heavily on performance optimization, regardless of the size of the database or the system in question. We'll look at important tactics and industry best practices for SQL database performance optimization in this post.

1. Recognizing Performance Optimization's Significance
Providing a seamless user experience requires good database performance. Application responsiveness and scalability may be impacted by bottlenecks caused by slow queries and wasteful database design. SQL database efficiency can be greatly increased by developers by devoting work to performance enhancement.

2. Determining Performance Bottlenecks via Profiling

Finding performance constraints is essential before implementing optimization approaches. To examine query execution times, resource consumption, and index statistics, use SQL profiling tools. This diagnostic process aids in identifying areas that require improvement.

3. Query Enhancement
Indexing Strategies: An essential component of query optimization is indexing. Examine various indexing techniques, such as non-clustered and clustered indexes. For quicker query execution, identify the columns that are commonly utilized in JOIN conditions or WHERE clauses and think about indexing them.

Indexing Strategies, for instance:

-- Creating a Non-Clustered Index
CREATE INDEX IX_Employee_LastName
ON Employee (LastName);

-- Query using the index
SELECT * FROM Employee WHERE LastName = 'Smith';

Rewriting Complex Queries: Assess and rework intricate queries to increase their effectiveness. JOIN optimizations, subquery removal, and appropriate index usage are a few strategies that can help speed up query processing.
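As an illustration (the Orders table and its columns are hypothetical), a subquery-based filter can often be rewritten as a join so the optimizer can process the tables as a set:

-- Before: filtering through an IN subquery (hypothetical Orders table)
SELECT e.LastName
FROM Employee AS e
WHERE e.EmployeeID IN (SELECT o.EmployeeID FROM Orders AS o WHERE o.Total > 1000);

-- After: the equivalent set-based join
SELECT DISTINCT e.LastName
FROM Employee AS e
JOIN Orders AS o ON o.EmployeeID = e.EmployeeID
WHERE o.Total > 1000;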

4. Considerations for Database Design

  • Normalization: Aim for the best possible level of normalization for your database. Optimized normalized databases frequently yield higher performance. Finding a balance is necessary though, since over-normalization can also cause problems with performance.
  • Partitioning: Take into account dividing up data in large tables according to particular standards, like important values or date ranges. This can minimize the quantity of data that needs to be scanned, which can greatly improve query performance.
    Imagine you have a large Sales table where most queries target data from a given range of dates. Partitioning the table by transaction date lets those queries read only the relevant partitions, which reduces the amount of data scanned and makes retrieval faster and more efficient.
-- Creating a Partition Function
CREATE PARTITION FUNCTION Pf_EmployeeByDate (DATE)
AS RANGE RIGHT FOR VALUES ('2022-01-01', '2023-01-01', '2024-01-01');

-- Creating a Partition Scheme
CREATE PARTITION SCHEME Ps_EmployeeByDate
AS PARTITION Pf_EmployeeByDate ALL TO ([PRIMARY]);
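To actually place rows on those partitions, a table is then created on the partition scheme, with the partitioning column passed to it; the table and column names below are a hypothetical sketch:

-- Hypothetical partitioned table stored on the scheme defined above
CREATE TABLE SalesPartitioned
(
    SaleID INT NOT NULL,
    SaleDate DATE NOT NULL,
    Amount DECIMAL(10, 2) NOT NULL
) ON Ps_EmployeeByDate (SaleDate);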

5. Performance Monitoring and Maintenance
  • Frequent Index Maintenance: Indexes fragment over time as data is inserted, updated, and deleted, and fragmentation can hurt query performance. Schedule routine maintenance to reorganize or rebuild indexes and keep them in an optimal state.
-- Reorganizing Indexes
ALTER INDEX IX_Employee_LastName ON Employee REORGANIZE;

-- Rebuilding Indexes
ALTER INDEX IX_Employee_LastName ON Employee REBUILD;

  • Query Execution Plans: Analyze and understand query execution plans to find areas that need improvement. Use tools such as SQL Server Management Studio to visualize the execution plan of complex queries.
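Alongside the graphical plan, I/O and timing statistics can be turned on for the session to measure a query before and after tuning; a small sketch using the Employee table from the earlier example:

-- Report logical reads and CPU/elapsed time for the statements that follow
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT * FROM Employee WHERE LastName = 'Smith';

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;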

6. Methods of Caching
  • Query Result Caching: Use caching techniques for queries that are run frequently and are generally static. By providing cached results when appropriate, this lessens the strain on the database and speeds up response times.
  • Application-Level Caching: To store and retrieve frequently requested data without accessing the database, use application-level caching. This is especially useful for applications that require a lot of reading.
7. Hardware and Resource Efficiency
Optimize server configuration parameters, including disk I/O, parallelism, and memory allocation. Adapt these values to the device specifications and workload.
-- Configuring Maximum Degree of Parallelism (MAXDOP)
-- 'max degree of parallelism' is an advanced option, so enable advanced options first
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 4; RECONFIGURE;

Storage Optimization: To improve disk I/O speed, think about using fast storage options like SSDs. To enable parallel access, split up database files among several disks.

8. Frequent Evaluation of Performance

Load Testing: To mimic high traffic situations and spot possible performance bottlenecks, do routine load tests. This proactive strategy enables changes to be made before problems affect end users.

Conclusion

Proactive maintenance, intelligent database architecture, and query optimization are all necessary for maintaining optimal SQL performance. Developers can improve the performance and user experience of their applications by putting the techniques in this tutorial to work to make their SQL databases more efficient. Remember that different database systems could have different optimization strategies, so adjust your strategy according to the SQL platform you're using.

HostForLIFEASP.NET SQL Server 2022 Hosting



European SQL Server 2022 Hosting :: Union and Union All

clock January 29, 2024 07:16 by author Peter

Make a Table By running the query and adding the information
CREATE TABLE employeeslist
(
    id INT IDENTITY(1,1) PRIMARY KEY,
    [Name] VARCHAR(100) NOT NULL,
    Gender VARCHAR(100) NOT NULL,
    city VARCHAR(100) NOT NULL,
    salary INT NOT NULL
);

INSERT INTO employeeslist VALUES ('Peter', 'M', 'London', 30000);
INSERT INTO employeeslist VALUES ('Laura', 'F', 'London', 35000);
INSERT INTO employeeslist VALUES ('Anna', 'F', 'London', 33000);
INSERT INTO employeeslist VALUES ('Tom', 'M', 'London', 38000);
INSERT INTO employeeslist VALUES ('Maria', 'F', 'Liverpool', 36000);
INSERT INTO employeeslist VALUES ('Catherine', 'F', 'Liverpool', 32000);
INSERT INTO employeeslist VALUES ('Lily', 'F', 'Liverpool', 35000);
INSERT INTO employeeslist VALUES ('Mia', 'F', 'Liverpool', 31000);
INSERT INTO employeeslist VALUES ('Grace', 'F', 'Leeds', 37000);
INSERT INTO employeeslist VALUES ('Arthur', 'M', 'Leeds', 35000);


CREATE TABLE employeeslist1
(
    id INT IDENTITY(1,1) PRIMARY KEY,
    [Name] VARCHAR(100) NOT NULL,
    Gender VARCHAR(100) NOT NULL,
    city VARCHAR(100) NOT NULL,
    salary INT NOT NULL
);

INSERT INTO employeeslist1 VALUES ('Peter', 'M', 'London', 30000);
INSERT INTO employeeslist1 VALUES ('Laura', 'F', 'London', 35000);
INSERT INTO employeeslist1 VALUES ('Grace', 'F', 'Liverpool', 12000);
INSERT INTO employeeslist1 VALUES ('Sophia', 'F', 'Leeds', 37000);
INSERT INTO employeeslist1 VALUES ('Alfie', 'M', 'York', 38000);


Questions
1. Fetch a resultant table that includes all records from both tables as one table.
Answer
select * from employeeslist
union all
select * from employeeslist1


2. Fetch a resultant table that includes all records from both tables as one table but no duplicate values.
Answer
select * from employeeslist
union
select * from employeeslist1


3. Fetch a resultant table that has the id, name, and city from the first table and the id and name from the second table in the same order.
Answer
select id, [name], city from employeeslist
union
select id, [name] from employeeslist1

* This will result in an error because the number of columns must be the same in both SELECT statements.

4. Fetch a resultant table that has id and city from the first table and name and id from the second table in the same order.
Answer
select id, city from employeeslist
union
select [name], id from employeeslist1
* This will result in an error because the corresponding columns have different data types (id is an INT while name is a VARCHAR, and city is a VARCHAR while id is an INT), so the values cannot be combined.


5. Fetch a resultant table that has the id, name, and city from the first table and the id, city, and name from the second table in the same order.
Answer
select id, [name], city from employeeslist
union
select id, city, [name] from employeeslist1
This will not result in an error because the corresponding columns have compatible data types (name and city are both VARCHAR), but the name and city values end up mixed in the same columns, so the result is not meaningful.

Union vs Union All
  • UNION eliminates duplicate rows, but UNION ALL does not.
  • UNION ALL runs faster than UNION, because UNION has to perform a distinct sort to eliminate the duplicates.

Note: The SELECT statements must have the same number of columns, compatible data types, and the same column order.

Summary
UNION and UNION ALL both combine the result sets of two or more SELECT queries into a single result set; UNION removes duplicate rows, while UNION ALL keeps them.

HostForLIFEASP.NET SQL Server 2022 Hosting

 



European SQL Server 2022 Hosting :: How to Use SQL Server to Check Your Most Resource-Intensive Queries?

clock January 9, 2024 07:13 by author Peter

After seeing that my online application was frequently experiencing SQL Server Request Timeouts, I examined the server and discovered that the CPU was nearly entirely utilized.

SQL Server used up most of the resources.

I knew there was a problem with my database queries, but I wasn't sure which specific queries from my online application were to blame. So I ran the following query in SQL Server Management Studio to obtain the top ten database queries in terms of resource consumption.

SELECT TOP 10
    qs.total_logical_reads + qs.total_logical_writes AS total_io,
    qs.execution_count,
    qs.total_worker_time AS total_cpu_time,
    qs.total_elapsed_time,
    st.text AS query_text,
    qp.query_plan
FROM
    sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY
    total_io DESC;

The query gives me a result like the one below.

You can examine your most costly queries by looking at the 'query_text' column, or you can look for ways to optimize your query by looking at the query_plan column.

In my situation, I've come to the conclusion that improper indexing is the main reason behind one of my tables, which has millions of rows in it.
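As a follow-up check, SQL Server's own missing-index suggestions can be listed with the query below (a sketch using the missing-index DMVs; these suggestions only reflect activity since the last restart and should be reviewed before creating any index):

SELECT TOP 10
    mid.statement AS table_name,
    mid.equality_columns,
    mid.inequality_columns,
    mid.included_columns,
    migs.user_seeks,
    migs.avg_user_impact
FROM sys.dm_db_missing_index_details AS mid
JOIN sys.dm_db_missing_index_groups AS mig ON mid.index_handle = mig.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs ON mig.index_group_handle = migs.group_handle
ORDER BY migs.user_seeks * migs.avg_user_impact DESC;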

Additionally, I was able to spot a few queries that had string manipulation functions like SUBSTRING, REVERSE, CHARINDEX, and others but did not employ an appropriate join statement. In order to make it more effective and less resource-intensive, I must rework a few queries.

HostForLIFEASP.NET SQL Server 2022 Hosting




 



European SQL Server 2022 Hosting :: In MSSQL, Truncate All Table Data

clock January 3, 2024 06:36 by author Peter

Dynamic SQL can be used to truncate every table in a database. When performing this kind of operation, use caution because there is no way to undo the deletion of all data across all tables.

This is an illustration of how to write a query in a SQL Server database to truncate every table.

DECLARE @SchemaName NVARCHAR(128)
DECLARE @TableName NVARCHAR(128)
DECLARE @TruncateQuery NVARCHAR(MAX)

DECLARE tableCursor CURSOR FOR
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'

OPEN tableCursor

FETCH NEXT FROM tableCursor INTO @SchemaName, @TableName

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Schema-qualify and quote the name so non-dbo schemas and unusual names work
    -- Note: TRUNCATE TABLE fails for tables referenced by foreign key constraints
    SET @TruncateQuery = 'TRUNCATE TABLE ' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@TableName)
    EXEC sp_executesql @TruncateQuery

    FETCH NEXT FROM tableCursor INTO @SchemaName, @TableName
END

CLOSE tableCursor
DEALLOCATE tableCursor

Kindly take note of

  • This operation carries a risk: When tables are truncated, all data is lost and cannot be recovered.
  • Make a backup of your data: It's imperative to have a backup of your data in case the original is needed before running any queries that alter data in this manner.
  • Examine and make sure it passes: Before running such queries in a production environment, always properly evaluate and test in a secure setting.

Before carrying out such operations, it's also imperative to take permissions and the effect of truncating tables in a live system into account.

HostForLIFEASP.NET SQL Server 2022 Hosting

 



European SQL Server 2022 Hosting :: Finding and Dropping a User-Defined Store Procedure Query

clock December 21, 2023 09:23 by author Peter

This little piece of code iterates through every user-defined stored procedure (system procedures excluded) and drops them one by one. Note: You should not run the attached code snippet in production environments, as it is a risky script. Every user-defined stored procedure in the database is iterated through and dropped one at a time. Serious repercussions may result from this, particularly if your database contains mission-critical procedures.

declare @procName nvarchar(512)
declare cur cursor

for
-- User-defined procedures only: type 'P', not shipped by Microsoft
select quotename(schema_name(schema_id)) + '.' + quotename([name])
from sys.objects
where type = 'P' and is_ms_shipped = 0

open cur
fetch next from cur into @procName
while @@fetch_status = 0
begin
    exec('drop procedure ' + @procName)
    fetch next from cur into @procName
end
close cur
deallocate cur

Understanding the Code
The provided code demonstrates dropping user-defined stored procedures in a database. Here's a breakdown:

Variable declaration:

  • @procName: String variable to store the name of each procedure.
  • cur: Cursor object to iterate through existing procedures.

Cursor definition:

  • Uses sys.objects system table to find user-defined procedures (type = 'p') excluding system procedures (is_ms_shipped = 0).

Looping through procedures:

  • fetch next retrieves the next procedure's name into @procName.
  • while @@fetch_status = 0 repeats the loop until no more procedures are found.

Dropping procedures:

  • Constructs a dynamic SQL statement (exec) to drop the current procedure (@procName).
  • Executes the dynamic statement.

Cleanup:

  • Closes and deallocates the cursor to free resources.

HostForLIFEASP.NET SQL Server 2022 Hosting



European SQL Server 2022 Hosting :: What Are the ROW_NUMBER(), RANK(), and DENSE_RANK() SQL Window Functions?

clock December 12, 2023 07:15 by author Peter

Within the ever-changing SQL Server environment, window functions are particularly potent instruments for analytical processing. They provide a unique method for carrying out calculations over a certain range of rows that are connected to the current row. In this blog, we'll explore window functions through a number of scenarios, a basic table example, and the use of normalizing techniques.

Understanding How Window Functions Work
Window functions operate over a defined window of rows related to the current row, without collapsing the result set. They provide SQL queries access to new levels of analytical power. Let's begin by creating a straightforward table and adding data to it.

-- Create a simple table
CREATE TABLE Sales (
    SaleID INT PRIMARY KEY,
    ProductID INT,
    SaleDate DATE,
    Amount DECIMAL(10, 2)
);

-- Insert sample data
INSERT INTO Sales VALUES (1, 101, '2023-01-01', 1500.00);
INSERT INTO Sales VALUES (2, 102, '2023-01-02', 2000.00);
INSERT INTO Sales VALUES (3, 101, '2023-01-03', 1200.00);
INSERT INTO Sales VALUES (4, 102, '2023-01-04', 1800.00);

ROW_NUMBER() - Enumerating Rows

The ROW_NUMBER() function assigns a unique number to each row based on a specified order. This can be useful for pagination or ranking purposes.

-- Enumerate rows based on SaleDate
SELECT SaleID, SaleDate, Amount,
       ROW_NUMBER() OVER (ORDER BY SaleDate) AS RowNum
FROM Sales;

RANK() - Ranking Rows
RANK() assigns a unique rank to each distinct row, leaving gaps for tied values.
-- Rank rows based on Amount
SELECT SaleID, SaleDate, Amount,
       RANK() OVER (ORDER BY Amount DESC) AS SalesRank
FROM Sales;

DENSE_RANK() - Dense Ranking Rows
DENSE_RANK() is similar to RANK(), but without gaps for tied values.

-- Dense rank rows based on Amount
SELECT SaleID, SaleDate, Amount,
       DENSE_RANK() OVER (ORDER BY Amount DESC) AS DenseSalesRank
FROM Sales;

In this specific dataset, since there are no tied values in the Amount column, the rankings provided by both RANK() and DENSE_RANK() are identical.
Here's a brief explanation of the differences (a short example after the lists below introduces a tie to show the effect):
RANK()

  • Assigns a unique rank to each distinct row.
  • Leaves gaps in the ranking for tied values. If two rows have the same value, they both get the same rank, and the next rank is skipped.


DENSE_RANK()

  • Similar to RANK().
  • Does not leave gaps for tied values. If two rows have the same value, they both get the same rank, and the next rank is not skipped.
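To see the difference on this table, a quick sketch: insert one extra row whose Amount ties with an existing sale and run both ranking functions side by side.

-- Add a sale whose Amount ties with SaleID 2 (2000.00)
INSERT INTO Sales VALUES (5, 103, '2023-01-05', 2000.00);

-- RANK() returns 1, 1, 3, 4, 5 (a gap after the tie),
-- while DENSE_RANK() returns 1, 1, 2, 3, 4 (no gap)
SELECT SaleID, Amount,
       RANK()       OVER (ORDER BY Amount DESC) AS SalesRank,
       DENSE_RANK() OVER (ORDER BY Amount DESC) AS DenseSalesRank
FROM Sales;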

SUM() - Cumulative Sum
SUM() as a window function enables the calculation of cumulative sums.

-- Calculate cumulative sum of Amount
SELECT SaleID, SaleDate, Amount,
       SUM(Amount) OVER (ORDER BY SaleDate) AS CumulativeSum
FROM Sales;

AVG() - Moving Average
AVG() as a window function calculates a moving average over a specified window of rows.
-- Calculate 3-day moving average of Amount
SELECT SaleID, SaleDate, Amount,
       AVG(Amount) OVER (ORDER BY SaleDate ROWS BETWEEN 1 PRECEDING AND 1 FOLLOWING) AS MovingAverage
FROM Sales;

LEAD() and LAG() - Accessing Adjacent Rows
LEAD() and LAG() provide access to subsequent and preceding rows, respectively.
-- Access next and previous SaleDate
SELECT SaleID, SaleDate, Amount,
       LEAD(SaleDate) OVER (ORDER BY SaleDate) AS NextSaleDate,
       LAG(SaleDate) OVER (ORDER BY SaleDate) AS PreviousSaleDate
FROM Sales;


Normalizing Data Using Window Functions
Normalization is a key database design principle. Let's use Window Functions to normalize data by creating a new table to store aggregated information.
-- Create a normalized table
CREATE TABLE ProductSales (
    ProductID INT PRIMARY KEY,
    TotalSales DECIMAL(10, 2),
    AverageSaleAmount DECIMAL(10, 2)
);

-- Insert normalized data using Window Functions
INSERT INTO ProductSales
SELECT DISTINCT ProductID,
       SUM(Amount) OVER (PARTITION BY ProductID) AS TotalSales,
       AVG(Amount) OVER (PARTITION BY ProductID) AS AverageSaleAmount
FROM Sales;


ProductSales

The ProductSales table captures the total sales and average sale amount for each product, so those aggregates do not have to be recomputed from the detail rows every time they are needed.

SQL Server window functions are a developer's and data analyst's gold mine. These functions improve your analytical capabilities, from numbering and ranking rows to computing cumulative sums and moving averages. Gaining proficiency with window functions will enable you to extract meaningful insights from your data and optimize the performance of your SQL queries. They can also be useful in normalizing data, as shown in the examples above, which helps create an effective and well-designed database structure. Happy querying!

HostForLIFEASP.NET SQL Server 2022 Hosting



About HostForLIFE

HostForLIFE is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We offer the latest Windows 2019 Hosting, ASP.NET 5 Hosting, ASP.NET MVC 6 Hosting, and SQL 2019 Hosting.

