I have an application comprising a VB6 client/server front end with a
SQL Server back end. All the queries the application uses are in
stored procedures, which are initially created, and occasionally
recreated, by the application itself. The application does this by
building a CREATE PROCEDURE... SQL statement and then executing it
via the Execute method of an ADO Connection.
Any stored procs that involve 'Activities' are quite database intensive
as they use a view which is a multi-union view from lots of tables.
However, until now, once these stored procedures are compiled they run
pretty fast (typically 1 or 2 seconds).
Here's the problem (which has just started happening): when the
application creates (or recreates) a stored procedure involving
Activities, the query now takes about 10 seconds (instead of the
previous 1 or 2 seconds). However (and this is the really strange
part), when I take the procedure definition SQL from the stored proc
and execute it in Query Analyzer to recreate the SP, the resulting
stored proc executes quickly (i.e. back to the previous 1-2 seconds).
This is consistent (to a point) - each time I rebuild a stored proc
through the application the SP executes slowly and each time I create
it via Query Analyzer it executes quickly. In addition to this (just
to complicate things further!) after I've been testing it repeatedly
for a while it sometimes starts behaving ok - i.e. the stored proc runs
fast all the time regardless of whether it is created via the
application or via Query Analyzer.
I have tried this on our development SQL server and on a local MSDE
instance and I have tried it with different copies of the database -
the problem occurs in all tests so it doesn't seem to be db or server
related. I've noticed that the execution plan differs depending on how
the stored proc is created so I guess this is what's causing the big
time difference but the question is why should it matter how the stored
proc is created? The data is unchanged between tests and the stored
procedure text is the same - the only thing that changes is how the SP
is created (i.e. my application or Query Analyzer).
I have been tearing my hair out on this - can anyone please offer a
suggestion that might assist?
Ian|||Read up on the SQL Server "procedure cache" and see if this would play a
role in what you are observing. Using SQL Profiler, you can trace
SP:CacheMiss, SP:CacheHit and other cache related events to determine what
is going on behind the scenes when your SP is being created or executed.
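On SQL Server 2000 / MSDE you can also look at the procedure cache
directly. A rough sketch (the procedure name here is hypothetical -
substitute one of your real Activities procs):

```sql
-- Inspect cached plans for a given procedure (SQL Server 2000 / MSDE).
-- 'usp_GetActivities' is a placeholder name for illustration only.
SELECT cacheobjtype, objtype, usecounts, setopts, sql
FROM master.dbo.syscacheobjects
WHERE sql LIKE '%usp_GetActivities%'
```

If two otherwise identical plans show different values in the setopts
bitmask, the sessions that compiled them were running under different
SET options.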
<ian__@.hotmail.com> wrote in message
news:1116345172.130396.298070@.g44g2000cwa.googlegroups.com...
> (snip)
>|||Hi
Why do you need to re-create the stored procedures all the time? Usually
stored procedures are static code and you just change the values of the
parameters passed to them!
John
"ian__@.hotmail.com" wrote:
> (snip)
>|||Are they being created with the same schema or owner in both cases (app and
QA)?
AMB
"ian__@.hotmail.com" wrote:
> (snip)
>|||>> All the queries that the application uses are in stored procedures
which are initially created and occasionally recreated by the
application itself. The application does this by building a CREATE
PROCEDURE... SQL statement and then executing this via the execute
method of an ADO connection. <<
So you are such a bad SQL programmer that a random front end user
should be able to re-arrange the database. How did you expect to have
any data integrity?
If you had followed basic software engineering principles, the stored
procedures would be written, controlled and executed in the database
and not by the front end. This has nothing to do with SQL. This is
the foundations of all programming.
You need to start over, get a book on basic software engineering and
re-write what you have. As a rule of thumb, when you have "a
multi-union view from lots of tables", you usually have serious schema
design flaws.
Read about procedure caches, too. That is why dynamic things vary in
speed.|||On 17 May 2005 08:52:52 -0700, ian__@.hotmail.com wrote:
(snip)
>I have been tearing my hair out on this - can anyone please offer a
>suggestion that might assist?
Hi Ian,
First, let me state that I fully agree with the doubts expressed by John
Bell and Joe Celko regarding your design. I also agree with the possible
causes brought forward by JT and Alejandro Mesa.
But another possible explanation is this: check out the settings for the
options SET QUOTED_IDENTIFIER and SET ANSI_NULLS when creating the
procedure from QA or when creating it from ADO. These settings are saved
with the procedure when it's created (or rather: they are encoded into
the execution plan). A different value for one or both of these options
can result in a different execution plan.
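One way to confirm this is to compare the settings that were saved with
each version of the procedure. A sketch (the procedure name is a
placeholder):

```sql
-- Check which SET options a procedure was created with (SQL Server 2000).
-- 'usp_GetActivities' is a placeholder for the real procedure name.
SELECT OBJECTPROPERTY(OBJECT_ID('usp_GetActivities'), 'ExecIsAnsiNullsOn')   AS AnsiNullsOn,
       OBJECTPROPERTY(OBJECT_ID('usp_GetActivities'), 'ExecIsQuotedIdentOn') AS QuotedIdentOn
```

Run it after creating the proc from the application and again after
creating it from QA - if the values differ, that's your answer.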
Best, Hugo
--
(Remove _NO_ and _SPAM_ to get my e-mail address)|||Firstly, thanks to those who responded in a constructive and courteous
manner to my question - there's always a danger when posting that
small-minded individuals are going to respond with a load of unhelpful
and ill-informed comments. I really don't know how people can post
spiteful criticism based on assumption!
For the record, the application is of an extremely sophisticated nature
and is designed to enable the end-users to create their own, very
complex and powerful queries using a comparatively simple user
interface. These queries are created as stored procedures as they do
not change often and are executed frequently. The application ensures
data integrity but perhaps the notion of such an advanced design is
beyond people like CELKO?
Ian|||Thanks and congrats to Hugo! ANSI_NULLS was ON in QA and OFF in the
ADO connection. I have amended the application code to set it ON
before rebuilding the stored procedure, and the problem is fixed - I'm
very happy!
The reason this only started happening was a recent patch in which
ANSI_NULLS was set OFF - it was not explicitly set before that patch.
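For anyone hitting the same thing, the shape of the fix is roughly
this (a sketch - the procedure and view names are made up; the point
is the SET batches issued on the same ADO connection beforehand, since
SET options are per-connection and persist across Execute calls):

```sql
-- Issue these on the application's ADO connection (e.g. as separate
-- Connection.Execute calls) before the CREATE PROCEDURE batch, so the
-- proc compiles with the same options Query Analyzer uses by default.
SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER ON
GO
-- Hypothetical example procedure; the application substitutes its own
-- generated definition here.
CREATE PROCEDURE usp_GetActivities
AS
SELECT ActivityID, ActivityDate
FROM vwActivities
GO
```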
CELKO, see what can happen when you try to be helpful?
Ian|||On 18 May 2005 01:08:36 -0700, ian__@.hotmail.com wrote:
>For the record, the application is of an extremely sophisticated nature
>and is designed to enable the end-users to create their own, very
>complex and powerful queries using a comparatively simple user
>interface. These queries are created as stored procedures as they do
>not change often and are executed frequently. The application ensures
>data integrity but perhaps the notion of such an advanced design is
>beyond people like CELKO?
Hi Ian,
Actually, I think that Joe Celko has seen enough designs like this, AND
the results from it to make him very wary of this design.
Of course, Joe only sees the cases that have gone wrong (you don't pay
his rates to review a database that appears to be working fine), and
your situation might well be an exception, but still...
If you're allowing end users to write queries, then how do you guard
against the risk of injection of bad code? What do you do to prevent
someone including "DELETE FROM Customers WHERE 1 = 1" or "SHUTDOWN WITH
NOWAIT" or "EXEC sp_addrolemember 'System Administrators', 'Jeff'"?
Also, how do you guard against queries that run for hours, bringing the
database to its knees or holding locks for so long that all concurrency
is lost?
If your end users are all developers and can be trusted not to do silly,
stupid, or even malevolent things, then why don't you simply grant them
the rights to add stored procedures and views, or to execute ad-hoc
queries against the database?
If your end users don't fall into this category, then you should not
give them a way to do development work they're not qualified for.
As I said - your situation might well be the exception. Not all
situations where designs like this have been implemented have
experienced the unwanted side effects. But many do. I do hope that
you'll take Joe Celko's warning to heart - and that you seriously
consider other options.
(Since the queries don't change often, I'd set up a change request
system where the end users write stored procedures and send them to a
skilled DBA or developer for review; the latter then executes the
CREATE (or ALTER) PROCEDURE script if the query is okay, or proposes
improvements and discusses them with the submitter of the query.)
Best, Hugo
--
(Remove _NO_ and _SPAM_ to get my e-mail address)|||>> The application ensures data integrity but perhaps the notion of
such an advanced design is beyond people like CELKO? <<
LOL!! Of course you have always and will for the entire life of the
database, hire only *perfect* programmers. In the thousands and
thousands of lines of code they will write over time, nobody will
forget any business rules. Not one single rule! Amazing.
All the application code will use *exactly* the same algorithms. Never
mind that different programming languages use different truncation,
rounding, MOD() functions, string comparisons and so forth. The
perfect programmers will change the compilers or write their own
functions exactly the same way.
All third party packages will follow all of our business rules. How
they are going to do this when those rules are spread over thousands
and thousands of lines of application code, I don't know. Perhaps you
can tell me.
And when -- not if -- one of these integrity rules changes, the perfect
programmers will instantly propagate the changes in thousands and
thousands of lines of application code. And they will verify these
changes instantly.
And finally only perfect programmers will get to use QA or other tools
that go directly to the database without application code.
Advanced design? This is a return to a very primitive file systems
architecture. Talk to an old COBOL Programmer. The redundancy and
total lack of data integrity in those file systems are some of the
reasons we moved to DBMS and finally to RDBMS. You have re-discovered
1950's style ADP!