Friday, March 23, 2012
Login failed for user (null)
I am using MS SQL 2000 Personal Edition on my Windows XP Professional machine. It works fine with Enterprise Manager and SQL Query Analyzer.
To use the database in my web application I configured a system DSN with the "SQL Server" driver. I used the "sa" user with my password and tested the connection; it said it was OK. But when I use it in my web page, it throws this exception:
ODBC Error Code = 28000 (Invalid authorization specification)
[Microsoft][ODBC SQL Server Driver][SQL Server]Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection.
Could anyone help me figure out this problem?|||Can you post the connection string you are using? To the best of my knowledge, if you are using a DSN connection, you still need to supply the user name and password.
Regards,
Hugh Scott
Originally posted by karthiknataraj
Hi,
I am using MS SQL 2000 Personal Edition on my Windows XP Professional machine. It works fine with Enterprise Manager and SQL Query Analyzer.
To use the database in my web application I configured a system DSN with the "SQL Server" driver. I used the "sa" user with my password and tested the connection; it said it was OK. But when I use it in my web page, it throws this exception:
ODBC Error Code = 28000 (Invalid authorization specification)
[Microsoft][ODBC SQL Server Driver][SQL Server]Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection.
Could anyone help me figure out this problem?|||I use ColdFusion, so I created a system DSN, and in my <cfquery> tag I simply give the DSN name. Another site of ours which is live uses SQL Server, and I checked the code; it is the same.
<CFQUERY NAME="Q1" DATASOURCE="Test">
Select * from emp
</CFQUERY>|||I don't have any experience with ColdFusion, and I try to stay away from DSN connections as much as possible. I am digging through the cobwebs of my memory to try to come up with an answer.
I would:
1. Re-verify that the password is saved in your DSN config
2. Try specifying the UID and password in the CFQUERY; your production web server may be using a Windows Trusted connection to your SQL Server.
3. See if Cold Fusion can make DSN-less connections to source databases.
HTH,
Hugh Scott
Originally posted by karthiknataraj
I use ColdFusion, so I created a system DSN, and in my <cfquery> tag I simply give the DSN name. Another site of ours which is live uses SQL Server, and I checked the code; it is the same.
<CFQUERY NAME="Q1" DATASOURCE="Test">
Select * from emp
</CFQUERY>|||I had the same problem. Microsoft had me change the AllowInProcess registry key to 0 for the provider type I was having issues with (in my case MSDASQL). This moves those connections out of the MemToLeave area that SQL Server uses. However, you will have to use SQL authentication after this.|||Even if you use a predefined DSN for your connection, you still HAVE to supply the user "sa" and the password. I don't know how you can do this with ColdFusion, but here's a guess:
"DATASOURCE=DSN2;USER=SA;PASSWORD="
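For what it's worth, classic CFML also lets you supply the SQL login on the query tag itself rather than in a connection string. A minimal sketch, assuming an older ColdFusion release that still honors per-query credentials (the password value is a placeholder):

```cfml
<!--- Sketch: override the DSN's credentials with explicit SQL
      authentication so the ODBC driver does not fall back to a
      trusted (Windows) connection. "Test" is the system DSN from
      the thread above; the password shown is a placeholder. --->
<CFQUERY NAME="Q1" DATASOURCE="Test" USERNAME="sa" PASSWORD="yourPassword">
    SELECT * FROM emp
</CFQUERY>
```

If your ColdFusion version manages credentials only in the administrator console, set the user name and password on the data source there instead.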
Monday, March 12, 2012
Login as sa
> How do I log in to Query Analyser or EM as system administrator (sa)?
Enter "sa" as the user name
Enter the sa password
Press OK.
If your server is not set up for Mixed-mode security, you may need to
enable that option to use the "sa" login.
If you've lost the "sa" password, you can log into the server using any
other administrator account and change the password for the "sa" user.
What trouble are you running into?
David Gugick
Imceda Software
www.imceda.com|||I installed SQL Server 2000 in Windows Authentication mode; I just wonder if there is an sa login.
"David Gugick" <davidg-nospam@.imceda.com> wrote in message
news:%23Cdsx1uGFHA.4084@.TK2MSFTNGP14.phx.gbl...
> Alan wrote:
> > How do I login to Query Analyser or EM as system administrator (sa) ?
> Enter "sa" as the user name
> Enter the sa password
> Press OK.
> If your server is not set up for Mixed-mode security, you may need to
> enable that option to use the "sa" login.
> If you've lost the "sa" password, you can log into the server using any
> other administrator account and change the password for the "sa" user.
> What trouble are you running into?
>
> --
> David Gugick
> Imceda Software
> www.imceda.com
>
|||It exists, but the server will return an error that the connection is not trusted and will refuse the login if you try it. However, give the account A VERY STRONG PASSWORD, as it is very easy to switch SQL Server's authentication mode.
Sincerely,
Anthony Thomas
"Alan" <NOSPAMalan_pltse@.yahoo.com.au> wrote in message
news:eRk4SKgHFHA.3624@.tk2msftngp13.phx.gbl...
I installed the SQL Server 2000 as Windows Authentication mode, just wonder
if there is a sa login.
"David Gugick" <davidg-nospam@.imceda.com> wrote in message
news:%23Cdsx1uGFHA.4084@.TK2MSFTNGP14.phx.gbl...
> Alan wrote:
> > How do I login to Query Analyser or EM as system administrator (sa) ?
> Enter "sa" as the user name
> Enter the sa password
> Press OK.
> If your server is not set up for Mixed-mode security, you may need to
> enable that option to use the "sa" login.
> If you've lost the "sa" password, you can log into the server using any
> other administrator account and change the password for the "sa" user.
> What trouble are you running into?
>
> --
> David Gugick
> Imceda Software
> www.imceda.com
>
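The two steps the replies above describe (enable Mixed Mode, then reset a lost "sa" password from another sysadmin login) can be sketched in T-SQL. This is a sketch, not a definitive recipe: the registry path applies to a default instance of SQL Server 2000 only, the new password is a placeholder, and the service must be restarted for the mode change to take effect:

```sql
-- Run while connected as a member of sysadmin
-- (e.g. a Windows administrator account).

-- Switch the instance to Mixed Mode so the "sa" login can be used
-- (2 = Mixed Mode, 1 = Windows-only). Default-instance path shown;
-- a service restart is required afterwards.
EXEC master..xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'SOFTWARE\Microsoft\MSSQLServer\MSSQLServer',
    N'LoginMode', N'REG_DWORD', 2

-- Reset the lost "sa" password (SQL Server 2000 syntax; the old
-- password may be passed as NULL when run by a sysadmin):
EXEC sp_password NULL, 'aVeryStrongPassword!', 'sa'
```

This is also why the advice above stresses a very strong sa password: anyone with local administrator rights can flip the authentication mode the same way.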
Wednesday, March 7, 2012
Logical reads to tune sql
I was wondering if somebody could critique my query tuning technique.
I am primarily a developer and only a part-time DBA. I was trained by a great DBA, but that was back in the SQL 6.5 and 7.0 days. I concentrate pretty heavily on logical reads, and I am wondering if my technique is still valid or if I should be concentrating elsewhere?
We use SQL Profiler to find queries that are taking longer than 15 seconds to run. I take the SQL, run it in Query Analyzer with STATISTICS IO on, and look at the logical reads. I keep tuning the SQL and try to get logical reads down. My goal is to get them under 10,000 logical reads. But if I get a query down from 4,000,000 reads to 200,000, I believe I have made a good improvement.
Some of our code calls multiple SQL statements in sequence, so I analyze the code and have a pretty good idea which tables will probably be cached. If I see that the tables will be cached because of previous SQL statements, then I can live with a query that has higher logical reads.
I pretty much don't understand the graphical Execution Plan. Am I crazy ignoring this? Or is my above technique pretty good?|||The goal is always to get logical reads to a minimum. But there is no way to tell whether a particular number is as low as it can be without looking at the query plan to some degree. I don't know what kind of queries you are doing, but if you're only looking for queries over 15 seconds and think 10K reads is great, you guys must have a lot of time on your hands to wait for results <g>. Usually anything over 150 ms is too long for me. Hopefully these are some type of reporting queries and the number of reads is justified.
--
Andrew J. Kelly SQL MVP
"Bill Ray" <rinfo@.mail.com> wrote in message
news:63b44ea.0407020726.156a0c50@.posting.google.com...
> I was wondering if somebody could critique my query tuning technique.
> I am primarily a developer and only a part time dba. I was trained by
> a great dba, but it was back in the sql 6.5 and 7.0 days. I
> concentrate pretty heavily on logical reads, and I am wondering if my
> technique is still valid or if I should be concentrating elsewhere?
> We use sql profiler and find queries that are taking longer than 15
> seconds to run. I get the sql and run it in query analyzer and turn
> statistics io on and look at logical reads. I keep tuning the sql and
> try to get logical reads down. My goal is to get them to less than
> 10,000 logical reads. But if I get a query from 4,000,000 read to
> 200,000, I believe I have made a good improvement.
> Some of our code calls multiple sql statements in sequence, so I
> analyze the code and have a pretty good idea what tables will probably
> be cached. If I see that the tables will be cached because of previous
> sql statements, then I can live with a query that has higher logical
> reads.
> I pretty much don't understand the graphical Execution Plan. Am I
> crazy ignoring this? Or is my above technique pretty good?
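The measure-and-tune loop described in this thread can be sketched in T-SQL. The table and predicate below are placeholders, not from the original posts:

```sql
-- Measure logical reads for a candidate query in Query Analyzer.
SET STATISTICS IO ON
GO

-- Placeholder query under tune; "emp" and "dept_id" are illustrative.
SELECT *
FROM emp
WHERE dept_id = 42
GO

-- The Messages pane then reports a line per table, along the lines of:
--   Table 'emp'. Scan count 1, logical reads 12345, physical reads 87, ...
-- Tune (add indexes, rewrite predicates) and re-run until logical reads drop.
SET STATISTICS IO OFF
GO
```

Logical reads count buffer-cache page accesses, so unlike elapsed time they are stable across warm and cold caches, which is what makes them a useful tuning metric here.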
Logical read and DBCC dropcleanbuffers
Hi! I posted a question last week but didn't get any responses, so I'm trying again.
Basically, I am in the process of tuning a query, and the number of logical reads changes after I run DBCC DROPCLEANBUFFERS.
I understand that physical reads will be different after the cache is flushed out, but why do logical reads change after flushing the cache, with the exact same execution plan?
SQL 2000 SP4|||Hi James
The Reads reported in Profiler also include reads performed in the procedure cache, which varies depending on activity levels and DBCC commands.
Profiler's reads data is useful for zeroing in on problems, but you should use SET STATISTICS IO ON for a more accurate report on the logical reads performed by an individual query in Query Analyser / Management Studio.
Regards,
Greg Linwood
SQL Server MVP
http://blogs.sqlserver.org.au/blogs/greg_linwood
"james" <kush@.brandes.com> wrote in message
news:u51QEfm5GHA.3560@.TK2MSFTNGP03.phx.gbl...
> Hi! I post a question last week but didn't get any responses, so trying it
> again.
> Basically, I am in a process of tuning a query and number of logical reads
> gets change after I run dbcc dropcleanbuffers.
> I understand that physical reads will be different after cache is flushed
> out but why logical read get changed after flushing the cache but with
> exact same execution plan.
> Sql 200 sp4
>
|||Hi Greg,
Thanks for the reply. I am already using SET STATISTICS IO ON to get the logical reads; they're not from Profiler.
Can you think of any reason why logical reads would differ for the same query and the same dataset, with the exact same execution plan, when the only difference is that I flushed the cache with DBCC DROPCLEANBUFFERS?
Any input is greatly appreciated.
"Greg Linwood" <g_linwood@.hotmail.com> wrote in message
news:OB$Xvum5GHA.2536@.TK2MSFTNGP06.phx.gbl...
> Hi James
> The Reads reported in Profiler also includes reads performed in the
> procedure cache, which is variable depending on activity levels & dbcc
> commands.
> Profiler's reads data is useful for zero'ing on problems, but you should
> use SET STATISTICS IO ON for more accurate report on the logical reads
> performed by an individual query in the Query Analyser / Management
> Studio.
> Regards,
> Greg Linwood
> SQL Server MVP
> http://blogs.sqlserver.org.au/blogs/greg_linwood
> "james" <kush@.brandes.com> wrote in message
> news:u51QEfm5GHA.3560@.TK2MSFTNGP03.phx.gbl...
>
|||Hi James
Not off the top of my head, unfortunately. How much difference are you seeing? Only small differences, or large ones?
Is your scenario reproducible via a script that you could post? If so, I'd be happy to spend some time looking into it. If not, any chance you could post the SET STATISTICS IO output and the Profiler Showplan All event's BinaryData output for each call?
Regards,
Greg Linwood
SQL Server MVP
http://blogs.sqlserver.org.au/blogs/greg_linwood
"james" <kush@.brandes.com> wrote in message
news:enD3PXz5GHA.1860@.TK2MSFTNGP04.phx.gbl...
> Hi Greg,
> Thanks for the reply. I am already using Set statistics io on to get the
> logical reads, its not from Profiler.
> Do you think of any reason why logical read would be different for same
> qyery, same dataset with exact same execution plan and only difference is
> I flushed the cache with DBCC dropcleanbuffer.
> Any input is greatly appreciated.
> "Greg Linwood" <g_linwood@.hotmail.com> wrote in message
> news:OB$Xvum5GHA.2536@.TK2MSFTNGP06.phx.gbl...
>
|||Hi Greg,
I have replied to your email address with the actual query, the STATISTICS IO output, and the Profiler trace output that you asked for. Forgive me for sending it to your email rather than the entire group; with all the attachments I thought it would be better to send it directly. Let me know if that's a problem. Thanks once again.
"Greg Linwood" <g_linwood@.hotmail.com> wrote in message
news:u5NBAjz5GHA.4112@.TK2MSFTNGP04.phx.gbl...
> Hi James
> Not off the top unfortunately. How much difference are you seeing? Are you
> seeing only small differences or large differences/
> Is your scenario reproducable via a script that you could post? If so, I'd
> be happy to spend some time looking into it. If not, any chance you could
> post the set statistics io output & profiler showplan all event's
> BinaryData output for each call?
> Regards,
> Greg Linwood
> SQL Server MVP
> http://blogs.sqlserver.org.au/blogs/greg_linwood
> "james" <kush@.brandes.com> wrote in message
> news:enD3PXz5GHA.1860@.TK2MSFTNGP04.phx.gbl...
>
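The experiment being debated in this thread can be reproduced with a sketch like the following (any nontrivial query will do; the table name is a placeholder):

```sql
-- Compare STATISTICS IO output before and after flushing the buffer cache.
SET STATISTICS IO ON
GO

SELECT COUNT(*) FROM emp      -- placeholder query; note the
GO                            -- "logical reads" figure reported

CHECKPOINT                    -- write dirty pages to disk first, so the
GO                            -- drop below can discard the clean buffers
DBCC DROPCLEANBUFFERS         -- empty the data cache
GO

SELECT COUNT(*) FROM emp      -- same query, same plan: physical reads
GO                            -- rise, while logical reads would normally
                              -- be expected to match the first run
SET STATISTICS IO OFF
GO
```

If the logical-read counts still differ between the two runs, that is the anomaly James describes; capturing both STATISTICS IO outputs alongside the Showplan output, as suggested above, is what would pin it down.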
Logical read and DBCC dropcleanbuffers
again.
Basically, I am in a process of tuning a query and number of logical reads
gets change after I run dbcc dropcleanbuffers.
I understand that physical reads will be different after cache is flushed
out but why logical read get changed after flushing the cache but with exact
same execution plan.
Sql 200 sp4Hi James
The Reads reported in Profiler also includes reads performed in the
procedure cache, which is variable depending on activity levels & dbcc
commands.
Profiler's reads data is useful for zero'ing on problems, but you should use
SET STATISTICS IO ON for more accurate report on the logical reads
performed by an individual query in the Query Analyser / Management Studio.
Regards,
Greg Linwood
SQL Server MVP
http://blogs.sqlserver.org.au/blogs/greg_linwood
"james" <kush@.brandes.com> wrote in message
news:u51QEfm5GHA.3560@.TK2MSFTNGP03.phx.gbl...
> Hi! I post a question last week but didn't get any responses, so trying it
> again.
> Basically, I am in a process of tuning a query and number of logical reads
> gets change after I run dbcc dropcleanbuffers.
> I understand that physical reads will be different after cache is flushed
> out but why logical read get changed after flushing the cache but with
> exact same execution plan.
> Sql 200 sp4
>|||Hi Greg,
Thanks for the reply. I am already using Set statistics io on to get the
logical reads, its not from Profiler.
Do you think of any reason why logical read would be different for same
qyery, same dataset with exact same execution plan and only difference is I
flushed the cache with DBCC dropcleanbuffer.
Any input is greatly appreciated.
"Greg Linwood" <g_linwood@.hotmail.com> wrote in message
news:OB$Xvum5GHA.2536@.TK2MSFTNGP06.phx.gbl...
> Hi James
> The Reads reported in Profiler also includes reads performed in the
> procedure cache, which is variable depending on activity levels & dbcc
> commands.
> Profiler's reads data is useful for zero'ing on problems, but you should
> use SET STATISTICS IO ON for more accurate report on the logical reads
> performed by an individual query in the Query Analyser / Management
> Studio.
> Regards,
> Greg Linwood
> SQL Server MVP
> http://blogs.sqlserver.org.au/blogs/greg_linwood
> "james" <kush@.brandes.com> wrote in message
> news:u51QEfm5GHA.3560@.TK2MSFTNGP03.phx.gbl...
>|||Hi James
Not off the top unfortunately. How much difference are you seeing? Are you
seeing only small differences or large differences/
Is your scenario reproducable via a script that you could post? If so, I'd
be happy to spend some time looking into it. If not, any chance you could
post the set statistics io output & profiler showplan all event's BinaryData
output for each call?
Regards,
Greg Linwood
SQL Server MVP
http://blogs.sqlserver.org.au/blogs/greg_linwood
"james" <kush@.brandes.com> wrote in message
news:enD3PXz5GHA.1860@.TK2MSFTNGP04.phx.gbl...
> Hi Greg,
> Thanks for the reply. I am already using Set statistics io on to get the
> logical reads, its not from Profiler.
> Do you think of any reason why logical read would be different for same
> qyery, same dataset with exact same execution plan and only difference is
> I flushed the cache with DBCC dropcleanbuffer.
> Any input is greatly appreciated.
> "Greg Linwood" <g_linwood@.hotmail.com> wrote in message
> news:OB$Xvum5GHA.2536@.TK2MSFTNGP06.phx.gbl...
>|||Hi Greg,
I have replied to your email address the actual query, statistics io output
and profiler trace output that you asked for. Forgive me for sending to your
email rather than entire group. I thought with all the attachment and stuff
it would be better send it to directly. Let me know if any problem with
that. Thanks once again.
"Greg Linwood" <g_linwood@.hotmail.com> wrote in message
news:u5NBAjz5GHA.4112@.TK2MSFTNGP04.phx.gbl...
> Hi James
> Not off the top unfortunately. How much difference are you seeing? Are you
> seeing only small differences or large differences/
> Is your scenario reproducable via a script that you could post? If so, I'd
> be happy to spend some time looking into it. If not, any chance you could
> post the set statistics io output & profiler showplan all event's
> BinaryData output for each call?
> Regards,
> Greg Linwood
> SQL Server MVP
> http://blogs.sqlserver.org.au/blogs/greg_linwood
> "james" <kush@.brandes.com> wrote in message
> news:enD3PXz5GHA.1860@.TK2MSFTNGP04.phx.gbl...
>
Logical read and DBCC dropcleanbuffers
again.
Basically, I am in a process of tuning a query and number of logical reads
gets change after I run dbcc dropcleanbuffers.
I understand that physical reads will be different after cache is flushed
out but why logical read get changed after flushing the cache but with exact
same execution plan.
Sql 200 sp4Hi James
The Reads reported in Profiler also includes reads performed in the
procedure cache, which is variable depending on activity levels & dbcc
commands.
Profiler's reads data is useful for zero'ing on problems, but you should use
SET STATISTICS IO ON for more accurate report on the logical reads
performed by an individual query in the Query Analyser / Management Studio.
Regards,
Greg Linwood
SQL Server MVP
http://blogs.sqlserver.org.au/blogs/greg_linwood
"james" <kush@.brandes.com> wrote in message
news:u51QEfm5GHA.3560@.TK2MSFTNGP03.phx.gbl...
> Hi! I post a question last week but didn't get any responses, so trying it
> again.
> Basically, I am in a process of tuning a query and number of logical reads
> gets change after I run dbcc dropcleanbuffers.
> I understand that physical reads will be different after cache is flushed
> out but why logical read get changed after flushing the cache but with
> exact same execution plan.
> Sql 200 sp4
>|||Hi Greg,
Thanks for the reply. I am already using Set statistics io on to get the
logical reads, its not from Profiler.
Do you think of any reason why logical read would be different for same
qyery, same dataset with exact same execution plan and only difference is I
flushed the cache with DBCC dropcleanbuffer.
Any input is greatly appreciated.
"Greg Linwood" <g_linwood@.hotmail.com> wrote in message
news:OB$Xvum5GHA.2536@.TK2MSFTNGP06.phx.gbl...
> Hi James
> The Reads reported in Profiler also includes reads performed in the
> procedure cache, which is variable depending on activity levels & dbcc
> commands.
> Profiler's reads data is useful for zero'ing on problems, but you should
> use SET STATISTICS IO ON for more accurate report on the logical reads
> performed by an individual query in the Query Analyser / Management
> Studio.
> Regards,
> Greg Linwood
> SQL Server MVP
> http://blogs.sqlserver.org.au/blogs/greg_linwood
> "james" <kush@.brandes.com> wrote in message
> news:u51QEfm5GHA.3560@.TK2MSFTNGP03.phx.gbl...
>> Hi! I post a question last week but didn't get any responses, so trying
>> it again.
>> Basically, I am in a process of tuning a query and number of logical
>> reads gets change after I run dbcc dropcleanbuffers.
>> I understand that physical reads will be different after cache is flushed
>> out but why logical read get changed after flushing the cache but with
>> exact same execution plan.
>> Sql 200 sp4
>|||Hi James
Not off the top unfortunately. How much difference are you seeing? Are you
seeing only small differences or large differences/
Is your scenario reproducable via a script that you could post? If so, I'd
be happy to spend some time looking into it. If not, any chance you could
post the set statistics io output & profiler showplan all event's BinaryData
output for each call?
Regards,
Greg Linwood
SQL Server MVP
http://blogs.sqlserver.org.au/blogs/greg_linwood
"james" <kush@.brandes.com> wrote in message
news:enD3PXz5GHA.1860@.TK2MSFTNGP04.phx.gbl...
> Hi Greg,
> Thanks for the reply. I am already using Set statistics io on to get the
> logical reads, its not from Profiler.
> Do you think of any reason why logical read would be different for same
> qyery, same dataset with exact same execution plan and only difference is
> I flushed the cache with DBCC dropcleanbuffer.
> Any input is greatly appreciated.
> "Greg Linwood" <g_linwood@.hotmail.com> wrote in message
> news:OB$Xvum5GHA.2536@.TK2MSFTNGP06.phx.gbl...
>> Hi James
>> The Reads reported in Profiler also includes reads performed in the
>> procedure cache, which is variable depending on activity levels & dbcc
>> commands.
>> Profiler's reads data is useful for zero'ing on problems, but you should
>> use SET STATISTICS IO ON for more accurate report on the logical reads
>> performed by an individual query in the Query Analyser / Management
>> Studio.
>> Regards,
>> Greg Linwood
>> SQL Server MVP
>> http://blogs.sqlserver.org.au/blogs/greg_linwood
>> "james" <kush@.brandes.com> wrote in message
>> news:u51QEfm5GHA.3560@.TK2MSFTNGP03.phx.gbl...
>> Hi! I post a question last week but didn't get any responses, so trying
>> it again.
>> Basically, I am in a process of tuning a query and number of logical
>> reads gets change after I run dbcc dropcleanbuffers.
>> I understand that physical reads will be different after cache is
>> flushed
>> out but why logical read get changed after flushing the cache but with
>> exact same execution plan.
>> Sql 200 sp4
>>
>|||Hi Greg,
I have replied to your email address the actual query, statistics io output
and profiler trace output that you asked for. Forgive me for sending it to your
email rather than to the entire group; with all the attachments and such I
thought it would be better to send it directly. Let me know if there is any
problem with that. Thanks once again.
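For anyone following along, the comparison being discussed can be reproduced with a pattern like this (a sketch; the table and query are placeholders, not the actual query from the thread):

```sql
-- Measure logical reads for the same query before and after flushing the buffer cache.
SET STATISTICS IO ON;

SELECT COUNT(*) FROM dbo.SomeTable;   -- first run: pages may already be cached

CHECKPOINT;                           -- write dirty pages so DROPCLEANBUFFERS clears everything
DBCC DROPCLEANBUFFERS;                -- empty the buffer cache (dev/test servers only)

SELECT COUNT(*) FROM dbo.SomeTable;   -- second run: physical reads expected; compare the logical reads

SET STATISTICS IO OFF;
```

Comparing the two "logical reads" figures in the Messages output is exactly the measurement being debated here.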
"Greg Linwood" <g_linwood@.hotmail.com> wrote in message
news:u5NBAjz5GHA.4112@.TK2MSFTNGP04.phx.gbl...
> Hi James
> Not off the top unfortunately. How much difference are you seeing? Are you
> seeing only small differences or large differences?
> Is your scenario reproducible via a script that you could post? If so, I'd
> be happy to spend some time looking into it. If not, any chance you could
> post the set statistics io output & profiler showplan all event's
> BinaryData output for each call?
> Regards,
> Greg Linwood
> SQL Server MVP
> http://blogs.sqlserver.org.au/blogs/greg_linwood
logical read
Hi
I am trying to tune a SQL query. The query returns about 300 rows of uniqueidentifier values, but in the STATISTICS IO output I see 7,228 logical reads, which does not seem right. I tried rebuilding the index, but the logical reads did not change. However, when I set up the database on another computer, the IO came down to 10. Is there something wrong with the computer showing 7,228 logical reads?
thanks
Pauli
There may not be enough memory available to SQL Server, and the data cache is being flushed.
What is the memory/load differences between the two computers?
|||Are the indexes the same in both databases? It really sounds like you are missing an index in the first database. Even on a machine with memory pressure, that much of a difference in logical IOs points at an index or statistics problem. Are you seeing physical reads when you run the query in the first database?
logical inserted table and after triggers
For some reason my CREATE TRIGGER statement fails because SQL Server cannot resolve the term INSERTED. It won't even parse without errors. This is what my T-SQL looks like:
CREATE TRIGGER UserTypeTrig ON GTData.dbo.GTUserType
AFTER Insert, Update
AS
SET NOCOUNT ON
IF EXISTS(SELECT *
FROM GTData.dbo.GTUserType G
JOIN INSERTED I
ON G.TypeID != I.TypeID AND LOWER(G.UserType) = LOWER(I.UserType);
BEGIN;
RAISERROR('cannot insert duplicate userType', 16, 1)
ROLLBACK TRANSACTION
RETURN
END;
appreciate all the help i can get.
Your syntax is not correct. You are using statement terminators in the wrong place. See the modified code:
CREATE TRIGGER UserTypeTrig ON GTData.dbo.GTUserType
AFTER Insert, Update
AS
SET NOCOUNT ON;
IF EXISTS(SELECT *
FROM GTData.dbo.GTUserType G
JOIN INSERTED I
ON G.TypeID != I.TypeID AND LOWER(G.UserType) = LOWER(I.UserType)
)
BEGIN
RAISERROR('cannot insert duplicate userType', 16, 1);
ROLLBACK TRANSACTION;
RETURN;
END;
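As an aside, if the goal is simply to prevent duplicate user types regardless of case, a unique index may be simpler than a trigger. On a default (case-insensitive) collation, a sketch like this would enforce the same rule on the table from the question:

```sql
-- Enforce unique UserType values; with a case-insensitive collation,
-- 'Admin' and 'admin' are treated as duplicates automatically.
CREATE UNIQUE NONCLUSTERED INDEX UQ_GTUserType_UserType
    ON GTData.dbo.GTUserType (UserType);
```

This assumes the column's collation is case-insensitive; if it is case-sensitive, the trigger approach (or a computed column over LOWER(UserType)) is still needed.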
|||
Umachandar Jayachandran
Thanks a lot!
Friday, February 24, 2012
Logic statement using select query
I'd like to make a logic statement that takes as its arguments the result of a SQL SELECT query. In more detail: I would like to create a local bool variable that would be false if some value is NULL in the table (or SELECT query).
Query example:
select taskID from Users where Login=@.username
Which classes/methods should I use to solve this problem? I use SqlDataSource to get access to the database, and I think I should use something like SqlDataSource.UpdateCommand and SqlDataSource.UpdateParameters, but I don't know how to build a logic statement from this.
Thanks in advance
If you are using the SqlDataSource, in its Selected event, you can check the RowsAffected property. If it is > 0, set your bool to true:
bool Check = false;
protected void SqlDataSource1_Selected(object sender, SqlDataSourceStatusEventArgs e)
{
int RecordCount = e.AffectedRows;
if (RecordCount > 0)
{ Check = true; }
}
Actually I tried to do it in the Page_Load(object sender, EventArgs e) event. Does that change anything?
I'm wondering if there is a way to get at the SQL query results from C# code. Something like:
[for each row from select query]
bool present;
if ( [row.taskID] == null )
{
present = false;
}
|||You will not get rows containing null from the query you showed in your example. You will only get rows that contain the TaskIDs that belong to the specified user. If there are no TaskIDs for that user, you will get no rows at all.|||I think I will get a row for a user even if he has NULL in the taskID field. I checked this, and I got the row I wanted with an empty value in the taskID field.
But even if there were, e.g., -1 instead, the problem would still exist, and I have no idea how to solve it or even where to look for a solution.
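One option is to push the null check into the query itself, so the page only has to read a single flag; a sketch against the table from the question (column and parameter names assumed from the earlier query):

```sql
-- Returns 1 when the user's taskID is set, 0 when it is NULL.
SELECT CASE WHEN taskID IS NULL THEN 0 ELSE 1 END AS HasTask
FROM Users
WHERE Login = @username;
```

The C# side then only needs to compare the returned value to 1 instead of inspecting each row for DBNull.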
|||I beg your pardon. You are right. I didn't read your query correctly. You can access values from a datasource as described here: http://www.mikesdotnetting.com/Article.aspx?ArticleID=45, and check each value, setting your bool accordingly.|||Thanks again, it works fine now. You are gosu ;)
|||
DCVer:
You are gosu ;)
Is that a good thing?
Yes, you are master, the king etc
Logic on UPDATE query
I used an update query that looks like this:
UPDATE Acct_table Set Acct_table.Score =
(Select Score_tbl.Score from Score_tbl
Where Acct_table.Acctnb = Score_tbl.Acctnb)
This process has been running for over an hour and a half and is building a large log file. I am curious to know if there is a better command that I can use in order to join the tables and then just drop the column from one to the other. Both tables are indexed on Acctnb.
Any insight would truly help.
Thanks!
|||UPDATE A SET A.Score = S.Score
from Acct_table A
JOIN Score_tbl S
ON A.Acctnb = S.Acctnb
|||Does anyone else hear an echo?
|||Does anyone else hear an echo?
No, not a thing. Why?
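Since the original complaint was log growth on a long-running update, the join can also be run in batches so each chunk commits separately; a sketch using SET ROWCOUNT (which still limits UPDATE on SQL Server 2000, the version era of this thread — the WHERE clause that marks "not yet updated" rows is an assumption about the data):

```sql
-- Update in chunks of 10,000 rows so the transaction log
-- can be truncated/backed up between batches.
SET ROWCOUNT 10000;

WHILE 1 = 1
BEGIN
    UPDATE A SET A.Score = S.Score
    FROM Acct_table A
    JOIN Score_tbl S ON A.Acctnb = S.Acctnb
    WHERE A.Score IS NULL OR A.Score <> S.Score;  -- skip rows already updated

    IF @@ROWCOUNT = 0 BREAK;
END;

SET ROWCOUNT 0;  -- reset the limit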
Yes, I deleted Thrasy's duplicated post
-PatP
logic of sum() with joins and using query hint
I have a question about the SUM() function. When I join two tables, one of
which is the main table used in the FROM clause, the SUM I compute over the
joined table comes out incorrect (it is multiplied by the number of matching
rows in the other joined table). How can I eliminate the problem? Do I have
to use a query hint or something? If so, how?
The query is the basis of a FIFO report. I didn't want to use a cursor, so I
wrote such a query. I solved the problem with UDFs, but I want to understand
the logic and how to use a query hint for the SUM function.
id is primary key for both tables
CREATE TABLE [order] (
[id] [int] NULL ,
[date_order] [datetime] NULL ,
[product] [char] (10) ,
[quantity] [int] NULL
)
go
CREATE TABLE [distribute] (
[id] [int] NULL ,
[date_distribute] [datetime] NULL ,
[product] [char] (10),
[quantity] [int] NULL
)
--sample data
insert into distribute (id,date_distribute,product,quantity) values
(51,'2005-01-01','aaa',10)
insert into distribute (id,date_distribute,product,quantity) values
(52,'2005-01-04','aaa',13)
insert into distribute (id,date_distribute,product,quantity) values
(53,'2005-01-05','aaa',3)
insert into distribute (id,date_distribute,product,quantity) values
(54,'2005-01-06','aaa',-2)
insert into distribute (id,date_distribute,product,quantity) values
(55,'2005-01-07','aaa',8)
insert into distribute (id,date_distribute,product,quantity) values
(56,'2005-01-08','aaa',45)
insert into distribute (id,date_distribute,product,quantity) values
(57,'2005-01-10','aaa',10)
insert into [order] (id,date_order,product,quantity) values
(11,'2005-01-01','aaa',10)
insert into [order] (id,date_order,product,quantity) values
(12,'2005-01-03','aaa',20)
insert into [order] (id,date_order,product,quantity) values
(13,'2005-01-05','aaa',30)
insert into [order] (id,date_order,product,quantity) values
(14,'2005-01-08','aaa',15)
insert into [order] (id,date_order,product,quantity) values
(15,'2005-01-09','aaa',10)
--query
select o1.id,o1.date_order,o1.product, o1.quantity,d1.id as Distribute_id ,
isnull(sum(d2.quantity),0)as forobservingdist,
isnull(sum(o2.quantity),0)as forobservingord
from [order] o1
left join [order] o2 on o1.id>=o2.id and o1.product=o2.product
left join Distribute d1 on o1.date_order<=d1.date_distribute and
o1.product=d1.product
left join Distribute d2 on d2.id<=d1.id and d2.product=d1.product
group by o1.id, o1.date_order,o1.product, o1.quantity,d1.id
,d1.date_distribute,d1.quantity,d1.product
The query below has the same logic as the query above; I used UDFs for the
sum functions.
When you run the queries, the forobservingord column should be the same as
the forobservingord column in the results of the query below.
CREATE FUNCTION getorderdogan
(@.id int,@.product nvarchar(50))
RETURNS int
AS
BEGIN
DECLARE @.sum AS int
select @.sum = sum(o2.quantity) from [order] o2 where o2.id<=@.id and
o2.product=@.product
RETURN @.sum
END
go
CREATE FUNCTION getdistributedogan
(@.id int,@.product nvarchar(50))
RETURNS int
AS
BEGIN
DECLARE @.sum AS int
select @.sum = sum(d2.quantity) from Distribute d2 where d2.id<=@.id and
d2.product=@.product
RETURN isnull(@.sum,0)
END
select o1.id,o1.date_order,o1.product, o1.quantity,d1.id as Distribute_id ,
d1.date_distribute,
dbo.getdistributedogan(d1.id,d1.product)as forobservingdist,
dbo.getorderdogan(o1.id,o1.product)as forobservingord
from dbo.[order] o1
left outer join Distribute d1 on o1.date_order<=d1.date_distribute and
o1.product=d1.product
group by o1.id, o1.date_order,o1.product, o1.quantity,d1.id
,d1.date_distribute,d1.quantity,d1.product
|||You can achieve the same results (with better performance) by using
subqueries instead of UDF-s:
select o1.id, o1.date_order, o1.product,o1.quantity,
d1.id as Distribute_id, d1.date_distribute, (
select sum(d2.quantity) from Distribute d2
where d2.id<=d1.id and d2.product=d1.product
) as forobservingdist, (
select sum(o2.quantity) from [order] o2
where o2.id<=o1.id and o2.product=o1.product
) as forobservingord
from dbo.[order] o1
left outer join Distribute d1
on o1.date_order<=d1.date_distribute and o1.product=d1.product
group by o1.id, o1.date_order, o1.product, o1.quantity,
d1.id, d1.date_distribute, d1.quantity, d1.product
Razvan
|||Thanks Razvan, sure, I didn't think of this :))
Do you have any info about using a query hint that works like that?
Also, the last form of the query is like this:
select o1.id, o1.date_order, o1.product, o1.quantity,
d1.id as Distribute_id, d1.date_distribute,
case when
0>(dbo.getorderdogan(o1.id,o1.product)-dbo.getdistributedogan(d1.id,d1.product))
then
isnull(d1.quantity,0)+(dbo.getorderdogan(o1.id,o1.product)-dbo.getdistributedogan(d1.id,d1.product))
when
d1.quantity>o1.quantity-(dbo.getorderdogan(o1.id,o1.product)-dbo.getdistributedogan(d1.id,d1.product))
then
(o1.quantity-(dbo.getorderdogan(o1.id,o1.product)-dbo.getdistributedogan(d1.id,d1.product)))
else
isnull(d1.quantity,0)
end as distributed_quantity,
case when
(dbo.getorderdogan(o1.id,o1.product)-dbo.getdistributedogan(d1.id,d1.product))>o1.quantity
then
o1.quantity
when
(dbo.getorderdogan(o1.id,o1.product)-dbo.getdistributedogan(d1.id,d1.product))<0
then
0
else
(dbo.getorderdogan(o1.id,o1.product)-dbo.getdistributedogan(d1.id,d1.product))
end as remainingorder,
dbo.getdistributedogan(d1.id,d1.product) as forobservingdist,
dbo.getorderdogan(o1.id,o1.product) as forobservingord
from dbo.[order] o1
left outer join Distribute d1 on o1.date_order<=d1.date_distribute and
o1.product=d1.product
group by o1.id, o1.date_order, o1.product, o1.quantity, d1.id,
d1.date_distribute, d1.quantity, d1.product
having
((dbo.getorderdogan(o1.id,o1.product)-dbo.getdistributedogan(d1.id,d1.product))+isnull(d1.quantity,0)>0
and
o1.quantity>(dbo.getorderdogan(o1.id,o1.product)-dbo.getdistributedogan(d1.id,d1.product)))
or isnull(d1.quantity,0)=0
I changed the UDFs to subqueries and it's working. I looked at the query
execution plan, and it looks simpler with the UDFs. Are you sure this will
work with better performance? Probably you are :))
Also, the plan of the query with subqueries shows many hash matches. Can't I
use a query hint making hash matches with a left join?
Thanks again
select o1.id, o1.date_order, o1.product, o1.quantity, d1.id as Distribute_id
, d1.date_distribute,
case when
0>((select sum(o2.quantity) from [order] o2 where o2.id<=o1.id and
o2.product=o1.product)-(select sum(d2.quantity) from Distribute d2 where
d2.id<=d1.id and d2.product=d1.product))
then
isnull(d1.quantity,0)+((select sum(o2.quantity) from [order] o2 where
o2.id<=o1.id and o2.product=o1.product)-(select sum(d2.quantity) from
Distribute d2 where d2.id<=d1.id and d2.product=d1.product))
when
d1.quantity>o1.quantity-((select sum(o2.quantity) from [order] o2 where
o2.id<=o1.id and o2.product=o1.product)-(select sum(d2.quantity) from
Distribute d2 where d2.id<=d1.id and d2.product=d1.product))
then
(o1.quantity-((select sum(o2.quantity) from [order] o2 where o2.id<=o1.id
and o2.product=o1.product)-(select sum(d2.quantity) from Distribute d2 where
d2.id<=d1.id and d2.product=d1.product)))
else
isnull(d1.quantity,0)
end as distributed_quantity,
case when
((select sum(o2.quantity) from [order] o2 where o2.id<=o1.id and
o2.product=o1.product)-(select sum(d2.quantity) from Distribute d2 where
d2.id<=d1.id and d2.product=d1.product))>o1.quantity
then
o1.quantity
when
((select sum(o2.quantity) from [order] o2 where o2.id<=o1.id and
o2.product=o1.product)-(select sum(d2.quantity) from Distribute d2 where
d2.id<=d1.id and d2.product=d1.product))<0
then
0
else
((select sum(o2.quantity) from [order] o2 where o2.id<=o1.id and
o2.product=o1.product)-(select sum(d2.quantity) from Distribute d2 where
d2.id<=d1.id and d2.product=d1.product))
end as remainingorder,
(select sum(d2.quantity) from Distribute d2 where d2.id<=d1.id and
d2.product=d1.product)as forobservingdist,
(select sum(o2.quantity) from [order] o2 where o2.id<=o1.id and
o2.product=o1.product)as forobservingord
from dbo.[order] o1
left outer join Distribute d1 on o1.date_order<=d1.date_distribute and
o1.product=d1.product
group by o1.id, o1.date_order, o1.product, o1.quantity, d1.id ,
d1.date_distribute, d1.quantity, d1.product
having (((select sum(o2.quantity) from [order] o2 where o2.id<=o1.id and
o2.product=o1.product)-(select sum(d2.quantity) from Distribute d2 where
d2.id<=d1.id and d2.product=d1.product))+isnull(d1.quantity,0)>0
and o1.quantity>((select sum(o2.quantity) from [order] o2 where
o2.id<=o1.id and o2.product=o1.product)-(select sum(d2.quantity) from
Distribute d2 where d2.id<=d1.id and d2.product=d1.product)))
or isnull(d1.quantity,0)=0
|||> Do you have any info about a query hint that works like that?
There are no query hints that modify the results; the hints are used
only for optimizations. For information about hints, see the "query
hints" topic in Books Online:
http://msdn.microsoft.com/library/e..._qd_03_8upf.asp
> I looked at the query execution plan and it looks simpler with the UDFs.
The execution plan for a query that calls multi-statement UDFs (scalar
or table-valued) does not contain the cost of the statements contained in
the UDFs. Only in-line table-valued UDFs are expanded in the execution
plan of the calling query.
> are you sure this will work with better performance? probably you are:))
To be sure, test it yourself using Profiler or using something like
this:
DECLARE @.t datetime
SET @.t=GETDATE()
SELECT ...
PRINT CONVERT(varchar(10),DATEDIFF(ms,@.t,GETDATE()))+' ms'
> also plan of the query with subqueries shows many hash matches.
> Can't I use query hint making hash matches with left join?
SQL Server can execute joins in one of three ways: nested loops, merge
or hash. These ways can be used for inner joins, as well as for outer
joins (left joins, right joins or full outer joins). The query
optimizer automatically selects the best way to execute a join (nested
loops, merge or hash), based on the number of rows and the available
indexes on the joined columns. If the optimizer used a hash join,
that's because this is probably the best way to execute the query in
this particular case. Adding a join hint will force SQL Server to use
nested loops joins or merge joins, but in most cases that would have an
inferior performance. A better idea would be to add indexes to the
columns that are used in the join (in this case: product and id) and
let the query optimizer choose the way it executes the query (the query
optimizer may realize that it's better not to use the index on the id
column, and use only the index on the product column, for example).
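Razvan's index suggestion might look like this on the sample schema from earlier in the thread (a sketch; the index names are made up):

```sql
-- Support the (product, id) predicates used by the correlated subqueries.
CREATE INDEX IX_order_product_id      ON [order]    (product, id);
CREATE INDEX IX_distribute_product_id ON distribute (product, id);
```

With these in place, the optimizer is free to pick the access path itself, as described above.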
For more information about the ways a join can be executed, see the
"Advanced Query Tuning Concepts" topic in Books Online:
http://msdn.microsoft.com/library/e..._tun_1_8pv7.asp
Razvan
|||Thank you very much for the easy performance test code :))
I've tried it; both queries sometimes give 10 ms and sometimes 20 ms results.
I know I should test it with much more data. Thanks again.
I've tried to add an index as you suggested.
I also tried LEFT HASH JOIN, which returns a warning about changing the plan
and didn't change anything in the result set.
Then I tried some unconscious syntax, but these instinctive tries didn't
change the result :)
So I give up :))
thanks
Monday, February 20, 2012
Logging SQL Queries
Hi,
I'm currently playing around with ASP.NET.
Is there a way to log all queries that are sent to the SQL Server? Something like the query.log of MySQL.
You can write your own logging system (I have done that when working with dynamic TSQL), however you can just run a SQL Trace from within SQL Server.
Thanks for your reply.
A custom logging system won't help if I use, for example, a SqlDataSource for a GridView; it would bypass the logging class.
I've searched the online help, and it seems that a SQL Trace is a bit overpowered for a simple query log.
>I've searched the online help, and it seems that a SQL Trace is a bit overpowered for a simple query log.
You would need to tune the standard event selection; otherwise the volume of log data would be overwhelming.
Logging Sql queries
In SQL Server 2000, I am sure there was a way that you configure the
database to log to a file each query that gets run against it.
However, I cannot seem to find where to set this up in Enterprise
Manager.
"Mystery Man" <PromisedOyster@.hotmail.com> wrote in message
news:87c81238.0405130309.7c0013fb@.posting.google.c om...
> In SQLServer 2000, I am sure there was a way that you configure the
> database to log to a file each query that gets run against it.
> However, I cannot seem to find where to set this up in Enterprise
> Manager.
I don't know any way to do exactly what you want, but there are options:
If you're going through ODBC, you can enable ODBC logging.
You can also enable profiler to capture all the traffic.
Another option is to invest in
http://www.lumigent.com/products/le_sql/le_sql.htm|||Mystery Man (PromisedOyster@.hotmail.com) writes:
> In SQLServer 2000, I am sure there was a way that you configure the
> database to log to a file each query that gets run against it.
> However, I cannot seem to find where to set this up in Enterprise
> Manager.
The Profiler is the tool you should use, or at least where you should
start looking. If you seriously want to log everything that happens
on the server, you should set up a server-side trace with the help of
the sp_trace procedures.
Furthermore, if you cannot accept anything being left unlogged because the
trace file fills up the disk, set the C2 auditing configuration option,
which shuts the server down in that situation.
--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se
Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp
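A minimal server-side trace along the lines Erland describes might look like this (a sketch; the file path is a placeholder, and the numeric event/column IDs — 12 = SQL:BatchCompleted, 1 = TextData, 11 = LoginName, 13 = Duration — come from the sp_trace_setevent documentation):

```sql
DECLARE @traceid int, @maxfilesize bigint, @on bit;
SET @maxfilesize = 50;   -- megabytes per trace file
SET @on = 1;

-- Create the trace (.trc extension is appended to the path automatically)
EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\trace\querylog', @maxfilesize, NULL;

-- Capture SQL:BatchCompleted with the query text, login, and duration
EXEC sp_trace_setevent @traceid, 12, 1,  @on;
EXEC sp_trace_setevent @traceid, 12, 11, @on;
EXEC sp_trace_setevent @traceid, 12, 13, @on;

-- Start the trace
EXEC sp_trace_setstatus @traceid, 1;
```

The resulting .trc file can be opened in Profiler or read with fn_trace_gettable.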
Logging Query Messages From ExecuteSQL Task
The ExecuteSQL task makes it easy to capture the resultset of a query or stored proc in a variable. But what about the messages that would normally appear in the Messages tab when running queries in Management Studio or Query Analyzer? There must be a way to display and log rowcounts and other messages that a query normally produces.
Various package/task logging configurations haven't helped. I've read discussions about logging rowcounts in data flow tasks, but what about capturing messages in ExecuteSQL and other control flow tasks that run T-SQL?
I only have enough time to dabble in this technology so I may be missing something obvious or approaching this incorrectly. Any suggestion, reference, constructive criticism or reponse would be appreciated.
It is possible to capture the values of output parameters when using stored procedures with the Execute SQL Task. These values can be stored in SSIS variables to be further examined.
I hope this helps.
|||Duane Douglas wrote:
It is possible to capture the values of output parameters when using stored procedures with the Execute SQL Task. These values can be stored in SSIS variables to be further examined.
I hope this helps.
Except the OP is looking for diagnostic messages... For instance, in an update statement that only updates one row, you get "1 rows affected" or whatever the message says... The OP wants to be able to capture that output.|||
Phil Brammer wrote:
Duane Douglas wrote: It is possible to capture the values of output parameters when using stored procedures with the Execute SQL Task. These values can be stored in SSIS variables to be further examined.
I hope this helps.
Except the OP is looking for diagnostic messages... For instance, in an update statement that only updates one row, you get "1 rows affected" or whatever the message says... The OP wants to be able to capture that output.
Perhaps I don't completely understand what you're trying to accomplish. "Rows affected" is captured by the @.@.ROWCOUNT t-sql variable. As far as I know, the only other messages displayed in the query results window are errors which can also be captured by t-sql.|||
Duane Douglas wrote:
Phil Brammer wrote: Duane Douglas wrote: It is possible to capture the values of output parameters when using stored procedures with the Execute SQL Task. These values can be stored in SSIS variables to be further examined.
I hope this helps.
Except the OP is looking for diagnostic messages... For instance, in an update statement that only updates one row, you get "1 rows affected" or whatever the message says... The OP wants to be able to capture that output.
Perhaps I don't completely understand what you're trying to accomplish. "Rows affected" is captured by the @.@.ROWCOUNT t-sql variable. As far as I know, the only other messages displayed in the query results window are errors which can also be captured by t-sql.
Duane is correct. As a matter of course I generally finish off my Execute SQL Tasks with:
SELECT @.@.ROWCOUNT AS RowCnt
and store it in a package-scoped variable.
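In practice the pattern looks something like this (a sketch; the table and statement are made up, and the Execute SQL Task's ResultSet property would be set to "Single row" with RowCnt mapped to the package variable):

```sql
-- The statement whose impact we want to log
UPDATE dbo.Customers
SET Status = 'Active'
WHERE LastOrderDate >= '20061201';

-- Return the affected-row count as a single-row result set
SELECT @@ROWCOUNT AS RowCnt;
```

The SELECT must immediately follow the statement being measured, since any intervening statement resets @@ROWCOUNT.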
-Jamie
What about other messages thrown by SQL? I don't know what they are, but "Command(s) completed successfully." is a valid message. Can it be captured somehow? What about other messages that might be generated? (I don't know what they are, as I'm just speaking for the original poster.)|||
Phil Brammer wrote:
What about other messages thrown by SQL? I don't know what they are, but "Command(s) completed successfully." is a valid message. Can it be captured somehow? What about other messages that might be generated? (I don't know what they are, as I'm just speaking for the original poster.)
I may be completely wrong about this but messages such as "Command(s) completed successfully." are messages generated by SSMS, not the database engine. Hence, they wouldn't be accessible from SSIS.
-Jamie
|||
The answers and additional questions from all of you (Jamie, Phil, and Duane), along with some additional time experimenting with SSIS logging, have been very helpful. I was indeed looking for SSMS or Query Analyzer message functionality to be duplicated in SSIS. Now I understand there are a few techniques that, combined, should provide the same messages.
It would be nice if such basic information were included in otherwise excellent references such as BOL or Kirk Haselden's Integration Services book. (Maybe this is mentioned somewhere and I missed it...). Anyway, thanks again.
|||
M.Glenn wrote:
The answers and additional questions from all of you--Jamie, Phil and Duane along with some additional time experimenting with SSIS logging has been very helpful. I was indeed looking for SSMS or Query Analyzer message functionality to be duplicated in SSIS. Now I understand there are a few techniques that, combined, should provide the same messages.
It would be nice if such basic information were included in otherwise excellent references such as BOL or Kirk Haselden's Integration Services book. (Maybe this is mentioned somewhere and I missed it...). Anyway, thanks again.
Mike,
No problem.
If you think that BOL is short somewhere then tell them. The option to provide feedback is taken very seriously (I know from experience). In this case its more SQL engine or T-SQL stuff rather than SSIS - but provide it anyway.
-Jamie
|||
SSMS is merely a GUI for SQL Server. Having said that, all of SQL Server's functionality can be accessed via T-SQL. For example, when a query executes without errors (in other words, when it completes successfully), SQL Server sets the @.@.ERROR variable to 0. This value can be returned to the client via a stored procedure output parameter.
M.Glenn wrote:
The answers and additional questions from all of you--Jamie, Phil and Duane along with some additional time experimenting with SSIS logging has been very helpful. I was indeed looking for SSMS or Query Analyzer message functionality to be duplicated in SSIS. Now I understand there are a few techniques that, combined, should provide the same messages.
It would be nice if such basic information were included in otherwise excellent references such as BOL or Kirk Haselden's Integration Services book. (Maybe this is mentioned somewhere and I missed it...). Anyway, thanks again.
Logging Query Messages From ExecuteSQL Task
The ExecuteSQL task makes it easy to capture the resultset of a query or stored proc in a variable. But what about the messages that would normally appear in the Messages tab when running queries in Management Studio or Query Analyzer? There must be a way to display and log rowcounts and other messages that a query normally produces.
Various package/task logging configurations haven't helped. I've read discussions about logging rowcounts in data flow tasks, but what about capturing messages in ExecuteSQL and other control flow tasks that run T-SQL?
I only have enough time to dabble in this technology, so I may be missing something obvious or approaching this incorrectly. Any suggestion, reference, constructive criticism or response would be appreciated.
It is possible to capture the values of output parameters when using stored procedures with the Execute SQL Task. These values can be stored in SSIS variables to be further examined.
I hope this helps.
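For instance, here is a minimal sketch of that approach (the procedure, table, and column names are hypothetical, purely for illustration): a stored procedure that returns its rowcount through an OUTPUT parameter, which the Execute SQL Task can map to an SSIS variable.

```sql
-- Hypothetical example: dbo.UpdateEmployees and dbo.Employees are illustrative names.
CREATE PROCEDURE dbo.UpdateEmployees
    @RowsAffected int OUTPUT
AS
BEGIN
    UPDATE dbo.Employees
    SET IsActive = 1
    WHERE HireDate < '2000-01-01';

    -- Capture the rowcount immediately; any later statement resets @@ROWCOUNT.
    SET @RowsAffected = @@ROWCOUNT;
END
```

In the Execute SQL Task you would call it as EXEC dbo.UpdateEmployees ? OUTPUT and map the parameter to a package variable on the Parameter Mapping page.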
|||Duane Douglas wrote:
It is possible to capture the values of output parameters when using stored procedures with the Execute SQL Task. These values can be stored in SSIS variables to be further examined.
I hope this helps.
Except the OP is looking for diagnostic messages... For instance, in an update statement that only updates one row, you get "1 rows affected" or whatever the message says... The OP wants to be able to capture that output.|||
Phil Brammer wrote:
Duane Douglas wrote: It is possible to capture the values of output parameters when using stored procedures with the Execute SQL Task. These values can be stored in SSIS variables to be further examined.
I hope this helps.
Except the OP is looking for diagnostic messages... For instance, in an update statement that only updates one row, you get "1 rows affected" or whatever the message says... The OP wants to be able to capture that output.
Perhaps I don't completely understand what you're trying to accomplish. "Rows affected" is captured by the @@ROWCOUNT T-SQL variable. As far as I know, the only other messages displayed in the query results window are errors, which can also be captured by T-SQL.|||
Duane Douglas wrote:
Phil Brammer wrote: Duane Douglas wrote: It is possible to capture the values of output parameters when using stored procedures with the Execute SQL Task. These values can be stored in SSIS variables to be further examined.
I hope this helps.
Except the OP is looking for diagnostic messages... For instance, in an update statement that only updates one row, you get "1 rows affected" or whatever the message says... The OP wants to be able to capture that output.
Perhaps I don't completely understand what you're trying to accomplish. "Rows affected" is captured by the @@ROWCOUNT T-SQL variable. As far as I know, the only other messages displayed in the query results window are errors, which can also be captured by T-SQL.
Duane is correct. As a matter of course I generally finish off my Execute SQL Tasks with:
SELECT @@ROWCOUNT AS RowCnt
and store it in a package-scoped variable.
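As a sketch of that pattern (the table and column names here are hypothetical), the full statement in the Execute SQL Task would look something like:

```sql
-- Illustrative only: dbo.Emp, Salary and Dept are assumed names.
UPDATE dbo.Emp
SET Salary = Salary * 1.05
WHERE Dept = 'Sales';

-- Returned as a single-row result set; with the task's ResultSet
-- property set to "Single row", RowCnt can be mapped to a
-- package-scoped SSIS variable on the Result Set page.
SELECT @@ROWCOUNT AS RowCnt;
```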
-Jamie
|||Duane Douglas wrote:
Perhaps I don't completely understand what you're trying to accomplish. "Rows affected" is captured by the @@ROWCOUNT T-SQL variable. As far as I know, the only other messages displayed in the query results window are errors, which can also be captured by T-SQL.
Phil Brammer wrote: Duane Douglas wrote: It is possible to capture the values of output parameters when using stored procedures with the Execute SQL Task. These values can be stored in SSIS variables to be further examined.
I hope this helps.
Except the OP is looking for diagnostic messages... For instance, in an update statement that only updates one row, you get "1 rows affected" or whatever the message says... The OP wants to be able to capture that output.
What about other messages thrown by SQL? I don't know what they are, but "Command(s) completed successfully." is a valid message. Can it be captured somehow? What about other messages that might be generated? (I don't know what they are, as I'm just speaking for the original poster.)|||
Phil Brammer wrote:
What about other messages thrown by SQL? I don't know what they are, but "Command(s) completed successfully." is a valid message. Can it be captured somehow? What about other messages that might be generated? (I don't know what they are, as I'm just speaking for the original poster.)
I may be completely wrong about this but messages such as "Command(s) completed successfully." are messages generated by SSMS, not the database engine. Hence, they wouldn't be accessible from SSIS.
-Jamie
|||
The answers and additional questions from all of you--Jamie, Phil and Duane--along with some additional time experimenting with SSIS logging have been very helpful. I was indeed looking for SSMS or Query Analyzer message functionality to be duplicated in SSIS. Now I understand there are a few techniques that, combined, should provide the same messages.
It would be nice if such basic information were included in otherwise excellent references such as BOL or Kirk Haselden's Integration Services book. (Maybe this is mentioned somewhere and I missed it...) Anyway, thanks again.
|||
M.Glenn wrote:
The answers and additional questions from all of you--Jamie, Phil and Duane along with some additional time experimenting with SSIS logging has been very helpful. I was indeed looking for SSMS or Query Analyzer message functionality to be duplicated in SSIS. Now I understand there are a few techniques that, combined, should provide the same messages.
It would be nice if such basic information were included in otherwise excellent references such as BOL or Kirk Haselden's Integration Services book. (Maybe this is mentioned somewhere and I missed it...). Anyway, thanks again.
Mike,
No problem.
If you think that BOL is short somewhere then tell them. The option to provide feedback is taken very seriously (I know from experience). In this case it's more SQL engine or T-SQL stuff than SSIS - but provide it anyway.
-Jamie
|||
M.Glenn wrote:
The answers and additional questions from all of you--Jamie, Phil and Duane along with some additional time experimenting with SSIS logging has been very helpful. I was indeed looking for SSMS or Query Analyzer message functionality to be duplicated in SSIS. Now I understand there are a few techniques that, combined, should provide the same messages.
It would be nice if such basic information were included in otherwise excellent references such as BOL or Kirk Haselden's Integration Services book. (Maybe this is mentioned somewhere and I missed it...). Anyway, thanks again.
SSMS is merely a GUI for SQL Server. Having said that, all of SQL Server's functionality can be accessed via T-SQL. For example, when a query executes without errors (in other words, when it completes successfully), SQL Server sets the @@ERROR variable to "0". This value can be returned to the client via a stored procedure output parameter.
One should bear in mind that SSIS is a very advanced feature of SQL Server. As such, a mastery of T-SQL is extremely helpful when using SSIS.
|||Duane Douglas wrote:
SSMS is merely a GUI for SQL Server. Having said that, all of SQL Server's functionality can be accessed via T-SQL. For example, when a query executes without errors (in other words, when it completes successfully), SQL Server sets the @@ERROR variable to "0". This value can be returned to the client via a stored procedure output parameter. One should bear in mind that SSIS is a very advanced feature of SQL Server. As such, a mastery of T-SQL is extremely helpful when using SSIS.
Agreed, but mastery of T-SQL is not the central issue in this case. I've used @@ERROR, @@ROWCOUNT, etc. in scripts, stored procs and triggers plenty of times. DTS/SSIS is where I'm on less familiar ground--only recently finding time to delve deeper. My frustration, which Phil Brammer's questions helped to tease out, was explained when Jamie Thompson stated "...messages such as "Command(s) completed successfully." are messages generated by SSMS, not the database engine. Hence, they wouldn't be accessible from SSIS." This together with your posts made it clear that the message stream provided by SSMS or QA is not available in SSIS and rolling your own, so to speak, is the only solution. Okay, I can take it from there.
However, this is disappointing. If you're building an SSIS solution from scratch it's not a problem. But in this case I'm using SSIS to automate more than a dozen vendor-developed DDL and DML scripts that otherwise have to be run manually in SSMS. One script would have to be manually rerun more than 40 times, entering a different "batch number" each time. I let them know I wasn't too happy about that one. Anyway, instead of someone sitting there babysitting this convoluted upgrade/cleanup process for hours (or in this case days), SSIS makes it easy to automate the entire process--run the scripts in proper order, manage precedence constraints, enumerate the batch numbers in a Foreach loop, etc.
Everything is working beautifully except for logging. A few of the scripts contain 20 or more individual queries. Running them in SSMS or QA would automatically give me a nice message log of the number of records affected, output from print statements embedded in the scripts, error messages, etc. If I understand you guys correctly, there's no built-in functionality to generate this kind of message stream in SSIS. I'd have to break the big scripts down, create ExecuteSQL tasks for each individual query (around 50 to 60 altogether) add and map output parameters for each and configure logging to capture all this. Not a happy prospect.
Is there any reason Microsoft couldn't duplicate SSMS/QA message stream functionality in a future release of SSIS? It would save a lot of work in cases like this. I'll suggest it unless someone has a good reason why it couldn't be done or (preferably) knows of a workaround for this situation.
Happy holiday!
|||OK, so then how do you write the contents of the package-scoped variable to the log?
I think I've got my rowCnt into a variable, but I can't see what value it is.
I've tried having the package log 'OnVariableValueChanged' events, but it just tells me the name of the variable, not the value.
(I'm using SQL2005 SP2)
Thanks
|||
Andy Abel wrote:
OK, so then how do you write the contents of the package-scoped variable to the log?
I think I've got my rowCnt into a variable, but I can't see what value it is.
I've tried having the package log 'OnVariableValueChanged' events, but it just tells me the name of the variable, not the value.
(I'm using SQL2005 SP2)
Thanks
If you are just trying to see the value of the variable, add a breakpoint and use the Watch window to view it. If you need to record the value, you could add an event handler for the OnVariableValueChanged event, and use an Execute SQL task to write it to the database.
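A minimal sketch of the second suggestion (the log table and variable name here are hypothetical): the Execute SQL Task inside the OnVariableValueChanged event handler runs a parameterized INSERT, with the parameter mapped to the package variable.

```sql
-- Hypothetical log table; create once in the target database.
CREATE TABLE dbo.PackageLog (
    LoggedAt datetime NOT NULL DEFAULT GETDATE(),
    RowCnt   int      NOT NULL
);

-- Statement for the Execute SQL Task (OLE DB connection syntax);
-- map the ? placeholder to User::rowCnt on the Parameter Mapping page.
INSERT INTO dbo.PackageLog (RowCnt) VALUES (?);
```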
Logging query errors
Is there a way to log errors raised while running a query? For instance, a
table contains a duplicate entry and a scheduled query notices this. Is it
then possible to get an alert in, for instance, the SQL log?
Greets,
Fred
Fred
If you are on SQL Server 2005 take a look at Notification Services in the
BOL
One method is:
IF EXISTS (SELECT *
           FROM TableA
           WHERE col = @key
           GROUP BY col
           HAVING COUNT(*) > 1)
BEGIN
    RAISERROR ('There are duplicates in the table', 16, 1)
END
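Since the original question asks about the SQL log specifically, it may help that RAISERROR also accepts a WITH LOG option, which writes the message to the SQL Server error log (and the Windows application log). A sketch of the same check with logging (@key is a parameter assumed from the snippet above):

```sql
IF EXISTS (SELECT *
           FROM TableA
           WHERE col = @key
           GROUP BY col
           HAVING COUNT(*) > 1)
BEGIN
    -- WITH LOG records the error in the SQL Server error log;
    -- note it requires elevated permissions (sysadmin on SQL 2000).
    RAISERROR ('There are duplicates in the table', 16, 1) WITH LOG
END
```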
"Fred Wouters" <f.wouters@tigra-tuning.nl> wrote in message
news:788AA79C-20C2-438B-9556-8AFF574B129B@microsoft.com...
> Hi,
> Is there a way to log errors raised while running a query. For instance a
> table contains a duplicate entry and a scheduled query notices this. Is it
> then possible to get an alert in for instance the SQL log?
> Greets,
> Fred
|||I'm using SQL 2000. Would the query below also work on 2000?
"Uri Dimant" wrote:
> Fred
> If you are on SQL Server 2005 take a look at Notification Services in the
> BOL
>
> One method is
> IF EXISTS(SELECT *
> FROM TableA
> WHERE col = @key
> GROUP BY col
> HAVING COUNT(*)>1)
> BEGIN
> RAISERROR ('There are duplicates in the table',16, 1)
> END
> ELSE
>
>
> "Fred Wouters" <f.wouters@tigra-tuning.nl> wrote in message
> news:788AA79C-20C2-438B-9556-8AFF574B129B@microsoft.com...
>
>
|||Fred
Yes
"Fred Wouters" <f.wouters@tigra-tuning.nl> wrote in message
news:AB5685F6-EF5D-4346-A4C0-A8D02D54DDFB@microsoft.com...
> I'm using SQL 2000. Would the query below also work on 2000?
> "Uri Dimant" wrote: