This document provides an overview of optimizing stored procedure performance in SQL Server 2000. It discusses the initial processing of stored procedures, including resolution, compilation, optimization and execution. It covers issues that can cause recompilation of stored procedures and different options for handling recompilation. The document also provides best practices for naming conventions, writing solid code to avoid excessive recompilations, and detecting recompilations. It recommends testing recompilation behavior and using modular code and statement recompilation where appropriate. The overview aims to help optimize stored procedure performance.
This document outlines Kumar Rajeev Rastogi's presentation on using native compilation, also known as schema binding, to improve PostgreSQL performance. It discusses trends toward in-memory databases and the need to reduce CPU instructions. Three methods for generating specialized access functions for tuples based on table schemas are described: 1) changing the tuple format, 2) using existing macros, and 3) reordering columns. Performance tests on TPC-H and hash joins show improvements of up to 36% and 23% respectively through dramatic reductions in CPU instructions via schema binding.
This document provides an introduction and overview of SystemVerilog. It discusses what SystemVerilog is, why it was developed, and its uses for hardware description and verification. Key features of SystemVerilog are then outlined, such as its data types, arrays, queues, events, structures, unions, and classes. Examples are provided for many of these features.
The document provides an overview of using SystemVerilog coverage, including:
- The two types of functional coverage in SystemVerilog: cover properties and covergroups
- Examples of defining cover properties and covergroups
- Tips for using shorthand notation, adding covergroup arguments, and utilizing coverage options to make covergroups more flexible and reusable
Watch the companion webinar at: http://embt.co/16cXD4h
Join Oracle ACE Director, Dan Hotka and Solutions Consultant Director, Scott Walz in part two of the series, where they will continue to build on that knowledge and share even more expertise on PL/SQL procedures, functions and packages.
Watch the webinar to learn about:
+ Procedures, functions and packages
+ Tips on PL/SQL compiling options
+ Performance tuning
Learn more about DB PowerStudio at: http://embt.co/DBPower
PL/SQL is the procedural extension language for SQL. It allows developers to write reusable code by creating blocks, procedures, functions and packages. PL/SQL code can contain variable declarations, executable statements like SQL, and control structures. The PL/SQL compiler separates SQL statements from procedural logic, with SQL statements sent to the database and procedural code executed by the PL/SQL engine.
Introduction to PL/SQL exceptions
Oracle error codes
Pragmas
User Defined Exception and Pragma EXCEPTION_INIT
DBMS_UTILITY package
Guidelines for exception handling
Guidelines for exception handling - FORALL
Foolproof your PL/SQL programs - Standalone procs and functions
Foolproof your PL/SQL programs - packages
Foolproof your PL/SQL programs - Assumptions
Foolproof your PL/SQL programs - Tracing
This document discusses dynamic SQL and metadata in Oracle. It describes how to build and execute SQL statements dynamically using native dynamic SQL with EXECUTE IMMEDIATE statements or the DBMS_SQL package. It also explains how to use the DBMS_METADATA package to obtain metadata from the data dictionary as XML or DDL that can be used to re-create database objects. Examples are provided for dynamic SQL DDL, DML, queries and PL/SQL blocks.
This document discusses randomization using SystemVerilog. It begins by introducing constraint-driven test generation and random testing. It explains that SystemVerilog allows specifying constraints in a compact way to generate random values that meet the constraints. The document then discusses using objects to model complex data types for randomization. It provides examples of using SystemVerilog functions like $random, $urandom, and $urandom_range to generate random numbers. It also discusses constraining randomization using inline constraints and randomizing objects with the randomize method.
This is Class 4 on a 6 week course I taught on Software Design Patterns.
This course goes over Command and Adapter pattern.
Class based on "Head First Design Patterns."
The document discusses benchmarking in Java using JMH (Java Microbenchmark Harness). It covers background on benchmarking, types of benchmarks (macro and micro), factors to consider in benchmarking, issues with hand-written benchmarks, and how to get started with JMH. Key points include that JMH helps minimize JVM optimizations to get accurate measurements, and that benchmarks should include a warmup phase to initialize the environment before recording results.
The document discusses the UVM register model, which provides an object-oriented shadow model for registers and memories in a DUT. It includes components like fields, registers, register files, memory, and blocks. The register model allows verification of register access and provides a standardized way to build reusable verification components.
The document discusses the architecture and APIs of the Java Virtual Machine (JVM). It begins with an overview of the JVM and its components, including data types, storage, instruction set, exceptions and errors, and binary classes. It then discusses how the Java platform is completed through APIs, providing examples of Java platforms and serialization APIs. It concludes by discussing the Java Native Interface and how it allows Java code to interoperate with native compiled code.
Lecture 6 from the IAG0040 Java course in TT.
See the accompanying source code written during the lectures: https://github.com/angryziber/java-course
Gives an overview how a software developer should organize their daily work, apart from technical skills.
Introduces Agile software development practices from XP and Scrum.
This document discusses using JDBC to access databases from Java applications like JSP pages. It covers loading the appropriate JDBC driver, establishing a connection with the database using a connection URL, executing SQL statements using Statement objects to retrieve and process result sets, and closing the connection when done. The core steps are to load the driver, get a connection, create statements, execute queries/updates, process results, and close the connection.
The Command Pattern encapsulates requests as objects, allowing clients to parameterize requests and supporting undoable operations. It decouples the invoker of a request from the implementation of the request, making it possible to queue, log, and execute requests at different times. Some benefits include flexibility in specifying, queueing, and executing requests, as well as supporting undo functionality. Some potential liabilities include efficiency losses and an excessive number of command classes.
A brief introduction to Process synchronization in Operating Systems with classical examples and solutions using semaphores. A good starting tutorial for beginners.
This is Class 2 on a 6 week course I taught on Software Design Patterns.
This course discusses Strategy and Template pattern.
Class based on "Head First Design Patterns."
This document discusses process synchronization and hardware solutions to critical section problems. It describes process synchronization as coordinating processes that share resources to maintain data consistency. Race conditions can occur if operations are not properly sequenced. Hardware solutions like locks, test-and-set instructions, and compare-and-swap can enforce mutual exclusion, progress, and bounded waiting to allow only one process in the critical section at a time.
Concurrency: Mutual Exclusion and Synchronization, by Anas Ebrahim
This document discusses concurrency and synchronization in operating systems. It covers mutual exclusion and how it must be enforced to prevent interference between concurrent processes accessing shared resources. Various synchronization mechanisms are described, including semaphores, mutexes, monitors, and event flags. The producer-consumer problem is presented, with solutions shown using semaphores to ensure processes can access shared resources like buffers safely. The implementation of semaphores is also discussed, noting the need for atomic operations.
CppUnit is a unit testing framework for C++ that was originally ported from JUnit. The document discusses using CppUnit framework classes like TestCase, TestRunner, and TestFixture for unit testing. It also covers integrating CppUnit into the build process and using helper macros to minimize coding errors when creating tests.
This document discusses SageFrame, an open source content management framework (CMF) built on ASP.NET. It provides an overview of SageFrame's introduction and workflow, available modules, templates, and architecture. Upcoming features in version 2.1 are also outlined, along with information on community support and a request for any additional questions.
This document discusses A/B testing for e-commerce websites. It explains that A/B testing involves comparing two versions of a webpage to see which performs better in terms of conversion rates. It then provides examples of elements that can be tested, such as headlines, images, calls-to-action buttons. Specifically for e-commerce, it recommends testing buttons, pricing strategies, product displays, and checkout pages. It concludes that regular A/B testing can significantly improve design and usability, leading to enhanced customer perception and increased profits.
A/B testing compares two versions of a web page by randomly showing visitors version A or B and measuring which performs better based on conversion rates. JavaScript and web services are used to determine which version a visitor sees and to save visit and conversion data which can then be viewed in a dashboard report showing metrics like visits, conversions, and conversion rates for each version over time periods like day, week, or month.
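The per-version metrics described above reduce to a simple aggregation. The following is only a sketch; the dbo.ABVisits table and its version, visit_date, and converted columns are a hypothetical schema, not anything from the presentation:

```sql
-- Conversion metrics per version per day, assuming one row per visit
-- with a converted flag (0/1). Table and column names are illustrative.
SELECT version,
       CONVERT(varchar(10), visit_date, 120) AS [day],
       COUNT(*)                              AS visits,
       SUM(converted)                        AS conversions,
       100.0 * SUM(converted) / COUNT(*)     AS conversion_rate_pct
FROM dbo.ABVisits
GROUP BY version, CONVERT(varchar(10), visit_date, 120)
ORDER BY [day], version;
```

Grouping by week or month is the same query with a different date bucket in the GROUP BY.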
Artificial intelligence (AI) is everywhere, promising self-driving cars, medical breakthroughs, and new ways of working. But how do you separate hype from reality? How can your company apply AI to solve real business problems?
Here's what your business should keep in mind about AI for 2017.
Study: The Future of VR, AR and Self-Driving Cars, by LinkedIn
We asked LinkedIn members worldwide about their levels of interest in the latest wave of technology: whether they're using wearables, and whether they intend to buy self-driving cars and VR headsets as they become available. We also asked them about their attitudes to technology and to the growing role of Artificial Intelligence (AI) in the devices that they use. The answers were fascinating and, in many cases, surprising.
This SlideShare explores the full results of the study, including detailed market-by-market breakdowns of intention levels for each technology and how attitudes change with age, location, and seniority level. If you're marketing a tech brand, or planning to use VR and wearables to reach a professional audience, these are insights you won't want to miss.
Upgrading MySQL databases is not without risk. There is no guarantee that no problems will happen if you move to a new major MySQL version.
Should we just upgrade and roll back immediately if problems occur? But what if these problems only happen a few days after migrating to this new version?
You might have a database environment that is risk-averse, where you really have to be sure that this new MySQL version will handle the workload properly.
Examples:
- Both MySQL 5.6 and 5.7 include many changes in the MySQL optimizer. These are expected to improve the performance of my queries, but is that really the case? What if there is a performance regression? How will this affect my database performance?
- There are also many incompatible changes documented in the release notes; how do I know whether my workload is affected? It's a lot to read.
- Can I go immediately from MySQL 5.5 to 5.7 and skip MySQL 5.6 even though the MySQL documentation states that this is not supported?
- Many companies have staging environments, but is there a QA team and do they really test all functionality, under a similar workload?
This presentation will show you a process, using open source tools, for these types of migrations, with a focus on assessing risk and fixing any problems you might run into prior to the migration.
This process can then be used for various changes:
- MySQL upgrades for major version upgrades
- Switching storage engines
- Changing hardware architecture
Additionally, we will describe ways to do the actual migration and rollback with the least amount of downtime.
This presentation covers native compilation technology: what it is, why it is needed, and how it can be applied to compile tables and procedures to achieve considerable performance gains with very minimal changes.
These are the slides used by Kumar Rajeev Rastogi of Huawei for his presentation at pgDay Asia 2016, where he presented a great idea for using native compilation to improve CPU efficiency.
The document discusses techniques and tools for optimizing Rails applications. It covers topics like benchmarking tools, caching, session storage options, and common performance issues in Rails like slow helper methods and associations. The document provides recommendations on optimizing actions, views, and controllers in Rails.
The document provides an introduction to stored procedures in SQL. Key points include:
- Stored procedures allow code to be executed as a batch after being compiled once, improving performance over executing individual SQL statements.
- Stored procedures can accept input parameters, return output parameters, and be used to enforce consistent implementation of business logic and error handling.
- Best practices for stored procedures include adding documentation, error handling, and using input/output parameters to make procedures more flexible and reusable.
The document provides an introduction to stored procedures in SQL. Key points include:
- Stored procedures allow code to be executed faster than batches by pre-compiling the code.
- They centralize business logic and error handling routines for consistent implementation across users.
- Parameters can be passed into stored procedures to make them more flexible. Output parameters allow returning values.
- Best practices include adding comments, error handling, and using transactions for consistency across nested stored procedures.
Watch Re-runs on your SQL Server with RML Utilities, by dpcobb
RML Utilities provide command line tools and interactive reports enabling you to:
- Take SQL trace files (captured with SQL Profiler, sp_trace, or extended events in SQL 2012+)
- Process them into replayable RML files (using readtrace.exe)
- Play them back in a different SQL environment (using ostress.exe)
- Compare the performance at a granular level (using reporter.exe or custom queries)
SQL Server 2008 Development for Programmers, by Adam Hutson
The document outlines a presentation by Adam Hutson on SQL Server 2008 development for programmers, including an overview of CRUD and JOIN basics, dynamic versus compiled statements, indexes and execution plans, performance issues, scaling databases, and Adam's personal toolbox of SQL scripts and templates. Adam has 11 years of database development experience and maintains a blog with resources for SQL topics.
- Legacy Perl code is code that uses outdated practices, has no tests or documentation, and is difficult to understand and modify. It often results from organic growth over many years and developers.
- Unit testing legacy code provides safety during refactoring, speeds up development by replacing debugging, and creates regression tests. However, the code's dependencies make it difficult to isolate and test.
- Techniques like dependency injection, sprouting, monkey patching, and temporary object reblessing can help break dependencies and make legacy code more testable. Instrumentation with profilers also aids understanding the code.
2011-02-03 LA RubyConf Rails3 TDD Workshop, by Wolfram Arnold
This document provides an overview of test-driven development (TDD) using Rails 3. It discusses why TDD is important, how to structure tests in different layers (model, controller, etc.), and what to test for models, controllers and views. It also covers RSpec 2 and useful tools like RVM. The presentation includes live coding demos and in-class exercises on TDD.
20201010 - Collabdays 2020 - Sandro Pereira - Power Automates: best practice...
In this session, we will reflect on your existing Power Automate flows and walk through a list of must-have best practices, tips, and tricks that will allow you to build more reliable and effective flows. At the same time, these will help you be more productive and document your flows from the beginning.
The document discusses stored procedures and functions in Oracle databases. It describes how procedures are compiled code stored in the database that can be called from client environments. Procedures allow encapsulating common operations like inserting records or updating salaries. The document provides examples of creating procedures and functions, specifying arguments, debugging errors, and managing dependencies.
Obevo is an open-source database deployment tool that handles complex database schemas and deployments at enterprise scale. It addresses challenges such as maintaining migration files, determining dependency order, and onboarding existing production databases. Obevo represents database objects as files in a similar structure to code, enables stateful objects through multiple change sections, and uses dependency analysis and topological sorting to determine deployment order. It also supports ORM and in-memory database integration through translation layers.
This document discusses using RSpec for testing Rails applications. It covers installing RSpec and RSpec-rails, generating Rails projects with different frameworks and databases, generating scaffolds for models, running tests, and using RSpec matchers, stubs, mocks and other features for testing controllers, models, views and test coverage. It also introduces the autotest tool for automatically running tests on file changes.
Parallel run Selenium tests in a good way, by COMAQA.BY
1. The document discusses running automated tests in parallel to improve efficiency.
2. It proposes an algorithm that involves defining test attributes, shared entities, architecture, and using standard test runners or custom instruments to run tests in parallel across several processes or a Selenium Grid.
3. The benefits of parallel testing are given as decreasing testing windows and increasing regression frequency to improve test automation ROI.
This presentation is about managing database scripts: why we need to do it, from both theoretical and practical perspectives, and how it improves the continuous integration and delivery process on real projects.
This document provides tips for improving the performance of stored procedures and SQL queries. Some key points include:
1) Use column lists instead of "*" in SELECT statements to minimize network traffic and only return needed columns.
2) Use ANSI 92 syntax like INNER JOIN instead of older syntax to future proof queries.
3) Consider using table variables instead of temp tables to avoid recompiles in some cases.
4) Add proper indexes to avoid full table scans and improve performance of queries.
5) Avoid dynamic SQL when possible to prevent recompilation of execution plans.
This summarizes best practices for writing efficient stored procedures and queries discussed in the document.
This document discusses stored procedures, including:
1. What a stored procedure is and how it improves performance over dynamic SQL.
2. The key differences between stored procedures and functions.
3. How to create, call, and execute stored procedures, including examples of procedures to insert, update, and delete records.
4. The processing that occurs when a stored procedure is created, including resolution, compilation, and execution.
5. How query plans are built and stored for procedures.
Before migrating from 10g to 11g or 12c, take the following considerations into account. It is not as simple as just changing the database engine; considerations must be made at the application level.
2. Introduction
Kimberly L. Tripp, SQL Server MVP
Principal Mentor, Solid Quality Learning
* In-depth, high quality training around the world! www.SolidQualityLearning.com
Content Manager for www.SQLSkills.com
Writer/Editor for TSQL Solutions/SQL Mag: www.tsqlsolutions.com and www.sqlmag.com
Consultant/Trainer/Speaker
Coauthor for MSPress title: SQL Server 2000 High Availability
Presenter/Technical Manager for SQL Server 2000 High Availability Overview DVD
Very approachable. Please ask me questions!
3. Overview
Initial Processing - Review
Resolution
Compilation/Optimization
Execution/Recompilation
Recompilation Issues
When do you want to Recompile?
Options for Recompilation?
What to Recompile?
Stored Procedure Best Practices
Naming Conventions
Writing Solid Code
Excessive Recompilations: How? Detecting?
4. Processing of Stored Procedures
(Diagram.) At creation, the procedure is parsed and then resolved, and the system tables record the results: sysobjects (name, type, etc.), syscomments (text of the object), syscolumns (parameter list), and sysdepends (object dependencies). At execution (the first time, or on recompile), the procedure is compiled and optimized, and the compiled plan is placed in the unified cache.
5. Resolution
When a stored procedure is created, all objects referenced are resolved (checked to see whether or not they exist).
The create will succeed even if the objects do not exist.
Procedures called that do not exist generate an error:
Cannot add rows to sysdepends for the current stored procedure because it depends on the missing object 'missingobjectname'. The stored procedure will still be created.
Benefit: recursion is allowed!
Tables, views, and functions called that do not exist do NOT generate an error (unless in 6.5 compatibility mode).
Verify dependencies with sp_depends before dropping an object.
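The deferred-resolution behavior above can be seen with a small sketch (the procedure and table names here are hypothetical):

```sql
-- Creating a procedure that references a missing table succeeds;
-- resolution of tables/views/functions is deferred until execution.
CREATE PROCEDURE dbo.GetOrders
AS
    SELECT * FROM dbo.OrdersNotYetCreated
GO
-- Running it now fails with "Invalid object name"; once the table is
-- created, the procedure executes without being re-created.
EXEC dbo.GetOrders
```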
6. Compilation/Optimization
Based on parameters supplied.
Future executions will reuse the plan.
Complete optimization of all code passed (more on this coming up: modular code!).
Poor coding practices can cause excessive locking/blocking.
Excessive recompilations can cause poor performance.
7. Execution/Recompilation
Upon execution, if a plan is not already in cache, then a new plan is compiled and placed into cache.
What can cause a plan to become invalidated and/or fall out of cache:
- Server restart
- Plan is aged out due to low use
- DBCC FREEPROCCACHE (sometimes desired, to force it)
- Base data within the tables changes: same algorithm as AutoStats; see Q195565 "INF: How SQL Server 7.0 and SQL Server 2000 Autostats Work"
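On SQL Server 2000, cached plans can be observed and flushed as below. This is a sketch, assuming a procedure named GetMemberInfo exists on the server:

```sql
-- Inspect cached plans and their reuse counts (SQL Server 2000).
SELECT cacheobjtype, refcounts, usecounts, sql
FROM master.dbo.syscacheobjects
WHERE sql LIKE '%GetMemberInfo%'

-- Flush the entire procedure cache (server-wide; test servers only!).
DBCC FREEPROCCACHE
```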
8. Recompilation Issues
RECOMPILATION = OPTIMIZATION
OPTIMIZATION = RECOMPILATION
When do you want to recompile?
What options do you have for recompilation?
How do you know you need to recompile?
Do you want to recompile the entire procedure or only part of it?
Can you test it?
9. When to Recompile?
When the plan for a given statement within a procedure is not consistent in execution plan, due to parameter and/or data changes.
The cost of recompilation might be significantly less than the execution cost of a bad plan!
Why? Faster execution with a better plan.
Saving plans for reuse is NOT always beneficial; some plans should NEVER be saved.
10. Options for Recompilation
CREATE WITH RECOMPILE
When procedure returns widely varying results
When the plan is not consistent
EXECUTE WITH RECOMPILE
For testing and to determine if CREATE WITH
RECOMPILE is necessary
sp_recompile 'objname'
Forces all plans that reference that object to be
invalidated (note: this does not force recompilation
for views even though a view name is accepted)
Statement Recompilation
Dynamic String Execution or Modularized Code
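The options above can be sketched in T-SQL. A minimal illustration using the hypothetical dbo.GetMemberInfo procedure from later in this deck:

```sql
-- Option 1: recompile on EVERY execution
CREATE PROC dbo.GetMemberInfo
( @LastName varchar(30) )
WITH RECOMPILE
AS
SELECT * FROM dbo.Member WHERE LastName LIKE @LastName
GO

-- Option 2: recompile for this execution only (good for testing)
EXEC dbo.GetMemberInfo '%T%' WITH RECOMPILE

-- Option 3: invalidate all plans that reference the table
EXEC sp_recompile 'Member'
```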
11. How do you know?
You Test!
Test execution plan consistency using
EXECUTE WITH RECOMPILE
Choose what needs to be recompiled
Whole Procedure
Portions of the procedure
Test final performance using the chosen strategy
Procedure Recompilation (CREATE with
RECOMPILE)
Statement Recompilation (Dynamic String Execution)
Modularized Code (Sub procedures created with or
without WITH RECOMPILE)
12. EXECUTE WITH RECOMPILE
Excellent for Testing
Verify plans for a variety of test cases
EXEC dbo.GetMemberInfo 'Tripp' WITH RECOMPILE
EXEC dbo.GetMemberInfo 'T%' WITH RECOMPILE
EXEC dbo.GetMemberInfo '%T%' WITH RECOMPILE
Do the execution plans match?
Are they consistent?
Yes? Then create the procedure normally
No? Determine what should be
recompiled
13. What Should be Recompiled?
Whole Procedure
CREATE with RECOMPILE
Procedure is recompiled for each execution
EXECUTE with RECOMPILE
Procedure is recompiled for that execution
NOTE: Consider forcing recompilation through another technique;
you should not expect users to know when/why to use EXECUTE
WITH RECOMPILE once in production!
Statement(s) Recompilation
Limited number of statements cause excessive
recompilation
Dynamic String Execution
Modular Code
14. CREATE WITH RECOMPILE
Use when the procedure returns drastically different results
based on input parameters.
May not be the only or even the best option
How do you know?
CREATE PROCEDURE GetMemberInfo
( @LastName varchar(30) )
AS
SELECT * FROM Member WHERE LastName LIKE @LastName
go
EXEC GetMemberInfo 'Tripp' -- index seek + bookmark lookup
EXEC GetMemberInfo 'T%' -- plan already exists (should be a table scan)
EXEC GetMemberInfo '%T%' -- definitely should use a table scan
15. Statement Recompilation
What if only a small number of statements need to
be recompiled?
The SQL Statement is not likely safe (i.e. it will not
be saved and parameterized)
Dynamic String Execution!
Amazingly Flexible
Permission Requirements
Potentially Dangerous
Advanced Examples
Complex large strings
Changing database context
Output parameters
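A minimal sketch of statement recompilation via dynamic string execution (table and procedure names are illustrative; note the injection risk the slide warns about, mitigated here with QUOTENAME):

```sql
CREATE PROC dbo.GetMemberInfoDSE
( @LastName varchar(30) )
AS
DECLARE @sql nvarchar(500)
-- QUOTENAME wraps the value in single quotes and doubles embedded quotes
SET @sql = N'SELECT * FROM dbo.Member WHERE LastName LIKE '
         + QUOTENAME(@LastName, '''')
-- The dynamic statement gets its own plan, optimized for THIS value,
-- while the rest of the procedure keeps its saved plan
EXEC (@sql)
GO
```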
16. Modular Code: The Better Solution!
IF (expression operator expression)
SQL Statement Block1
ELSE
SQL Statement Block2
Solution? Do not use a lot of
conditional SQL statement blocks.
Call separate stored
procedures instead!
Scenario 1, upon first execution:
Parameters are passed such that the ELSE condition
executes. BOTH Block1 and Block2 are optimized with the
input parameters.
Scenario 2, upon first execution:
Parameters are passed such that the IF condition executes.
ONLY Block1 is optimized. Block2 will be optimized when a
parameter that forces the ELSE condition is passed.
See ModularProcedures.sql
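The modular pattern above can be sketched as follows (procedure and table names are hypothetical, not from ModularProcedures.sql):

```sql
-- Each branch becomes its own procedure, so each is optimized
-- only when, and with the parameters for which, it actually runs.
CREATE PROC dbo.GetMemberByName_Equality ( @LastName varchar(30) )
AS
SELECT * FROM dbo.Member WHERE LastName = @LastName
GO

CREATE PROC dbo.GetMemberByName_Wildcard ( @LastName varchar(30) )
WITH RECOMPILE   -- wildcard plans vary too much to be worth saving
AS
SELECT * FROM dbo.Member WHERE LastName LIKE @LastName
GO

-- The wrapper holds only the branching logic, no optimizable DML
CREATE PROC dbo.GetMemberByName ( @LastName varchar(30) )
AS
IF @LastName NOT LIKE '%[%_]%'   -- no wildcard characters supplied
    EXEC dbo.GetMemberByName_Equality @LastName
ELSE
    EXEC dbo.GetMemberByName_Wildcard @LastName
GO
```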
17. sp_recompile
Can be used to periodically and directly force
recompilation of a procedure (or trigger)
Can be used on tables and views to indirectly
force the recompilation of all procedures and
triggers that reference the specified table or view
Does not actually recompile the procedures
Instead it invalidates plans for next execution
SQL Server invalidates plans as data changes
Rarely a negative, especially if you run it at
night as part of batch processing after index
rebuilds or statistics updates with FULLSCAN
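A minimal sketch of the nightly batch idea above (table name is illustrative):

```sql
-- Refresh statistics with a full scan, then invalidate dependent plans
-- so the NEXT execution of each procedure optimizes against fresh statistics.
UPDATE STATISTICS dbo.Member WITH FULLSCAN

-- Marks plans for all procedures/triggers referencing Member as invalid;
-- it does not recompile anything immediately.
EXEC sp_recompile 'Member'
```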
18. Stored Procedure Best Practices
Naming Conventions
Owner Qualify
Do not use sp_
Modifying Procedures
Write Solid Code
Writing Better Queries/Better Search Arguments
Changing Session Settings
Interleaving DML/DDL
Temp Table Usage
Modular Code
Detecting Excessive Recompilations
19. Naming Conventions
Owner Qualify to Eliminate Ambiguity
On execution
EXEC dbo.procname
On creation
CREATE PROC dbo.procname
AS
SELECT columnlist FROM dbo.tablename
EXEC dbo.procname
Minimize blocking: without owner qualification, the initial
cache lookup by owner will fail. This will not cause a recompile,
but excessive lookups can cause significant blocking and cache misses.
Do not use the sp_ prefix in stored procedure names; it causes
cache misses on lookup as well because SQL Server
looks in master first!
See KB Article Q263889
20. Modifying Procedures
DROP and RECREATE
Loses the dependency chain stored in sysdepends
Loses the permissions already granted
Invalidates all plans
ALTER PROC
Loses the dependency chain stored in sysdepends
Retains the permissions
Invalidates all plans
To retain the dependency chain you must also
ALTER all procedures that depend on the
procedure being altered.
21. Changing SESSION Settings
Certain Session Settings can be set within a stored
procedure some can be desired:
SET NOCOUNT ON
SET QUOTED_IDENTIFIER OFF (not recommended except for
backward compatibility and upgrades)
Some Session Settings will cause EVERY execution to force
a recompile:
ANSI_DEFAULTS
ANSI_NULLS (tip: do not use WHERE col = null, use col IS NULL)
ANSI_PADDING
ANSI_WARNINGS
CONCAT_NULL_YIELDS_NULL (tip: use the ISNULL function to
concatenate strings)
Recommendation: DO NOT Change these session settings
in the client or the server!
See "SET Options that Affect Results" in the BOL
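A minimal sketch contrasting the safe setting with the recompile-forcing ones (procedure and table names are hypothetical):

```sql
CREATE PROC dbo.UpdateMemberStatus
( @MemberNo int, @Status tinyint )
AS
-- Safe and desirable: suppresses DONE_IN_PROC row-count messages
-- and does NOT force recompiles
SET NOCOUNT ON
-- Avoid SET ANSI_WARNINGS / ANSI_PADDING / CONCAT_NULL_YIELDS_NULL etc.
-- here: changing them forces a recompile on EVERY execution
UPDATE dbo.Member SET Status = @Status WHERE MemberNo = @MemberNo
GO
```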
22. Interleaving DML/DDL Statements
Objects that don't exist at the procedure's first execution
cannot be optimized until statement execution
Upon execution of a DDL statement, the procedure
gets recompiled to recompile the plans for the DML
But wait, not all of the objects are created yet, so
later executions of DDL force recompilation
AGAIN
Don't interleave DDL and DML; separate them:
All DDL at the beginning of the proc, all DML later!
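A minimal sketch contrasting the two patterns (procedure names are illustrative):

```sql
-- BAD: interleaved; each DML statement hitting a just-created table
-- triggers another recompile of the procedure
CREATE PROC dbo.Interleaved
AS
CREATE TABLE #a (id int)
INSERT #a VALUES (1)   -- recompile: #a did not exist at compile time
CREATE TABLE #b (id int)
INSERT #b VALUES (2)   -- recompile AGAIN for #b
GO

-- BETTER: all DDL up front, all DML after
CREATE PROC dbo.Grouped
AS
CREATE TABLE #a (id int)
CREATE TABLE #b (id int)
INSERT #a VALUES (1)   -- one recompile covers both tables
INSERT #b VALUES (2)
GO
```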
23. Data Manipulation
Derived Tables
Nested Subquery in FROM clause
May optimize better than temp tables/variables
Views
Another option: rewrite existing temp table code to
use views instead (a simple rewrite)
May optimize better than temp tables/variables
Temp Tables
Should be considered
Table Variables
Limitations might not affect you
Might be the most optimal
24. Temp Table Usage
Temp tables can cause excessive recompilations for
procedures. Consider creating permanent tables
(with indexes) and manipulating data there.
Consider dropping and re-creating or rebuilding
indexes as part of the procedure instead!
Try not to create tables conditionally (IF create
ELSE create)
Use Profiler to see if there are significant recompiles
Use KEEP PLAN on SELECT statements if data
changes more than 6 times but the plan should not
change
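A minimal sketch of the KEEP PLAN hint on a SELECT against a temp table (table and column names are hypothetical):

```sql
-- KEEP PLAN relaxes the recompile threshold for temp tables
-- (normally only 6 modifications), so this statement keeps its
-- plan even as #results is modified repeatedly.
SELECT r.MemberNo, r.Score
FROM #results AS r
WHERE r.Score > 90
OPTION (KEEP PLAN)
```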
25. Table Variable Usage
Scope is limited to the local procedure/transaction
Does not cause excessive recompiles due to local only
access
No re-resolution on CREATE/ALTER
Temp Tables need re-resolution for nested procedures
Only Key Indexes can be created
Definition of Table allows PRIMARY KEY/UNIQUE constraint indexes
Use TEMP TABLES if large volumes of data will be manipulated;
create the right indexes for access
Population
Does not support INSERT EXEC
Does not support SELECT INTO
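The constraint-only indexing rule above can be sketched as follows (column names are illustrative):

```sql
-- The only indexes a table variable can have are those created
-- implicitly by PRIMARY KEY / UNIQUE constraints in its declaration.
DECLARE @t TABLE
(
    MemberNo int PRIMARY KEY,     -- index via constraint
    Email    varchar(60) UNIQUE   -- index via constraint
)
INSERT @t (MemberNo, Email) VALUES (1, 'kimberly@example.com')
-- NOT supported: CREATE INDEX on @t, INSERT @t EXEC proc, SELECT ... INTO @t
```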
26. Temp Table vs. Table Variables
Temp Table
PROs
Can create useful nonclustered non-unique indexes to improve
join performance
Can access from other nested procedures
Can populate with INSERT EXEC or SELECT INTO
CONs
Potential for excessive recompiles due to resolution
Table Variable
PROs
Local only: no excessive recompiles
CONs
Cannot create additional nonclustered indexes
Not flexible on population
27. Detecting SP Recompilation
Event = SP:Recompile & Column = EventSubClass
1 Schema, bindings or permissions changed between compile and execution(s)
Shouldn't happen often. If it does, isolate where/how changes occur and batch/schedule for off
hours
2 Statistics changed
Thresholds for statistics vary by table type:
Empty tables (Permanent >= 500, Temp >= 6, Table Variables = no threshold)
Tables with data (Perm/Temp >= 500 + 20% of cardinality, Table Variables = no threshold)
If the plan is consistent, eliminate recompiles from changes in statistics by using the KEEPFIXED
PLAN optimizer hint on the SELECT
3 Object not found at compile time, deferred check at run time
If the objects on which the procedure is based are permanent objects, consider recreating
4 SET option changed in batch
Best coding practice: consistency in client session settings and in the development
environment. Only use SET options when the connection is started and when the procedure is created.
5 Temp table schema, binding or permission changed
Change coding practice for #temptable
6 Remote rowset schema, binding or permission changed
Gets stats from the remote server; may recompile. If you're going to another server often for a
relatively small amount of static data, consider periodically bringing over a local copy
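A minimal sketch of the KEEPFIXED PLAN hint mentioned for EventSubClass 2 (table and column names are hypothetical):

```sql
-- KEEPFIXED PLAN prevents recompiles triggered by statistics changes
-- when the plan is known to stay consistent regardless of data volume.
SELECT m.MemberNo, m.LastName
FROM dbo.Member AS m
WHERE m.LastName = 'Tripp'
OPTION (KEEPFIXED PLAN)
```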
28. Profiling SP Performance
Create New Trace (SQLProfilerTSQL_sps)
Replace SP:StmtStarting with SP:StmtCompleted
Better if you want to see a duration (starting
events don't have a duration)
Add Duration as a Column Value
If short term profiling for performance:
Add columns: Reads, Writes, Execution Plan
Always use Filters
Database Name (only the db you want)
Exclude system IDs (checkbox on filter dialog)
29. Review
Initial Processing - Review
Resolution
Compilation/Optimization
Execution/Recompilation
Recompilation Issues
When do you want to Recompile?
Options for Recompilation?
What to Recompile?
Stored Procedure Best Practices
Naming Conventions
Writing Solid Code
Excessive Recompilations: How to Detect Them?
30. Other Sessions
DAT 335 SQL Server Tips and Tricks for DBAs and
Developers
Tuesday, 1 July 2003, 15:15-16:30
DBA 324 Designing for Performance: Structures,
Partitioning, Views and Constraints
Wednesday, 2 July 2003, 08:30-09:45
DBA 328 Designing for Performance: Optimization with
Indexes
Wednesday, 2 July 2003, 16:45-18:00
DBA 322 Optimizing Stored Procedure Performance in
SQL Server 2000
Thursday, 3 July 2003, 08:30-09:45
31. Articles
Articles in TSQLSolutions at www.tsqlsolutions.com (FREE,
just register)
All About Raiserror, InstantDoc ID#22980
Saving Production Data from Production DBAs, InstantDoc ID#22073
Articles in SQL Server Magazine, Sept 2002:
Before Disaster Strikes, InstantDoc ID#25915
Log Backups Paused for Good Reason, InstantDoc ID#26032
Restoring After Isolated Disk Failure, InstantDoc #26067
Filegroup Usage for VLDBs, InstantDoc ID#26031
Search www.sqlmag.com and www.tsqlsolutions.com for
additional articles
33. Community Resources
http://www.microsoft.com/communities/default.mspx
Most Valuable Professional (MVP)
http://www.mvp.support.microsoft.com/
Newsgroups
Converse online with Microsoft Newsgroups, including Worldwide
http://www.microsoft.com/communities/newsgroups/default.mspx
User Groups
Meet and learn with your peers
http://www.microsoft.com/communities/usergroups/default.mspx
34. Ask The Experts
Get Your Questions Answered
I will be available in the ATE
area after most of my sessions!
35. Thank You!
Kimberly L. Tripp
Principal Mentor, Solid Quality Learning
Website: www.SolidQualityLearning.com
Email: Kimberly@SolidQualityLearning.com
President, SYSolutions, Inc.
Website: www.SQLSkills.com
Email: Kimberly@SQLSkills.com
36. Suggested Reading And Resources
The tools you need to put technology to work!
Microsoft® SQL Server 2000
High Availability: ISBN 0-7356-1920-4, available 7/9/03
Microsoft® SQL Server 2000
Administrator's Companion: ISBN 0-7356-1051-7, available today
Microsoft Press books are 20% off at the TechEd Bookstore
Also buy any TWO Microsoft Press books and get a FREE T-Shirt
38. © 2003 Microsoft Corporation. All rights reserved.
This presentation is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS OR IMPLIED, IN THIS SUMMARY.