Wednesday, March 7, 2012
Logical reads to tune SQL
I was wondering if somebody could critique my query tuning technique.
I am primarily a developer and only a part-time DBA. I was trained by
a great DBA, but that was back in the SQL Server 6.5 and 7.0 days. I
concentrate pretty heavily on logical reads, and I am wondering whether my
technique is still valid or whether I should be concentrating elsewhere.
We use SQL Profiler to find queries that take longer than 15 seconds
to run. I take the SQL, run it in Query Analyzer with STATISTICS IO
turned on, and look at the logical reads. I keep tuning the SQL to get
the logical reads down; my goal is to get them below 10,000. But if I
get a query from 4,000,000 reads down to 200,000, I believe I have made
a good improvement.
Some of our code calls multiple SQL statements in sequence, so I
analyze the code and have a pretty good idea which tables will probably
be cached. If I see that the tables will be cached because of previous
statements, then I can live with a query that has higher logical
reads.
I pretty much don't understand the graphical execution plan. Am I
crazy to ignore it? Or is my technique above pretty good?
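The measurement loop described above can be sketched in T-SQL; the table and column names below are hypothetical, stand-ins for whatever query Profiler flagged:

```sql
-- Hypothetical example: measure logical reads for a query in Query Analyzer.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT o.OrderID, o.OrderDate, c.CustomerName      -- placeholder tables/columns
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
WHERE o.OrderDate >= '20040101';

-- The Messages tab then reports, per table, something like:
--   Table 'Orders'. Scan count 1, logical reads 4312, physical reads 12, ...
-- Tune (indexes, rewrites), rerun, and compare the logical-read counts.

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

Logical reads count buffer-cache page touches, so they are repeatable across runs, which is what makes them a useful tuning yardstick regardless of what happens to be cached.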
The goal is always to get the logical reads to a minimum. But there is no
way to tell whether a particular number is as low as it can be without
looking at the query plan to some degree. I don't know what kind of
queries you are doing, but if you're only looking for queries over 15
seconds and think 10K reads is great, you guys must have a lot of time on
your hands to wait for results <g>. Usually anything over 150 ms is too
long for me. Hopefully these are some type of reporting queries and the
amount of reads is justified.
Andrew J. Kelly SQL MVP
"Bill Ray" <rinfo@.mail.com> wrote in message
news:63b44ea.0407020726.156a0c50@.posting.google.com...
Logical read
Hi
I am trying to tune a SQL query. The query returns about 300 rows of
uniqueidentifier values, but STATISTICS IO shows 7,228 logical reads,
which does not seem right. I tried rebuilding the index, but that did
not bring the logical reads down. However, when I set up the database on
another computer, the logical reads there came down to 10. Is there
something wrong with the computer showing 7,228 logical reads?
thanks
Pauli
There may not be enough memory available to SQL Server, and the data cache is being flushed.
What are the memory/load differences between the two computers?
Are the indexes the same in both databases? It really sounds like you are missing an index in the first database. Even on a machine with memory pressure, that much of a difference in logical I/Os points to an index or statistics problem. Are you seeing physical reads when you run the query in the first database?
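A quick way to act on that suggestion is to compare indexing and statistics on the two machines; the table and index names below are hypothetical placeholders:

```sql
-- Compare the indexes between the two databases (names are hypothetical).
EXEC sp_helpindex 'dbo.MyTable';

-- Check whether the statistics on the suspect machine are stale.
DBCC SHOW_STATISTICS ('dbo.MyTable', 'IX_MyTable_KeyCol');

-- Refresh statistics if they are out of date.
UPDATE STATISTICS dbo.MyTable;

-- If an index really is missing, adding it should collapse the reads:
-- CREATE NONCLUSTERED INDEX IX_MyTable_KeyCol ON dbo.MyTable (KeyCol);
```

Note the distinction the replies are drawing on: logical reads come from the buffer cache and physical reads from disk, so a missing index inflates logical reads on both machines, while memory pressure shows up mainly as physical reads.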