
Friday, February 10, 2012

The Cost of Complexity

I find it stupendous that a single application can bring a decently modern i7 quad-core with 4 GB of RAM to its knees. The app in question is our custom product that I'm helping work on. It stores data locally on the client in SQL Server via Entity Framework, syncs with a server database via Sync Framework, and renders itself with WPF. It's simply a beast. Solution Explorer in Visual Studio shows about 25-30 projects in the solution, each with anywhere from 20 to 200 classes, and quite a few of those classes weigh in at over 1,000 lines of code. The whole suite easily gobbles up 2 GB of memory on its own, while the remaining 2 GB are consumed entirely by Windows 7, Visual Studio 2010, and this browser.

Now, I mostly blame Entity Framework and only slightly WPF. Entity Framework is a discordant combination: pleasantly easy to use, yet extremely difficult to master. On the one hand, it's easy to create the schema and start using the entities in code the very same day (and by the same token, easy to fall prey to lazy practices that will do a number on your performance); on the other hand, it's very difficult to rein in the effects of that ease of use, and the results can get out of hand. We have had several teams analyze performance at different levels, from the overall architectural choices made early on down to the details of how views are written, how entities are materialized, and so forth. We've certainly made quite a few improvements, but the biggest of these came not from improving the efficiency of the data layer, but simply from wrapping the data operations in asynchronous calls.

WPF shares a portion of the blame here as well. The queries, while slow, typically account for no more than half to three quarters of the time a user spends wishing he was at the dentist instead; the rest of the time is spent rendering huge data grids. Why am I letting WPF off the hook so easily, you ask? Well, the problem here is largely attributable to design choices, such as caving in when the customer asked for data grids that show hundreds of rows of data with an equal number of columns. Another portion of the blame can be leveled at Telerik, whose controls, while beautiful and functional, are another dirty pig in this mud bath the user has to slog through. Nevertheless, I still can't help but blame WPF for some of the performance problems when I can watch scroll bars chug along and occasionally see the entire UI lock up for whole seconds. But I shouldn't be so hard on this application, since it really has sped up a lot over the last year; it used to be entirely normal to wait upwards of 30 seconds for a single tab to open and display nothing more complex than a simple message inbox. I am aware that care is required in tweaking the display of grids, but that's exactly the problem, in my opinion: the whole technology stack we're using is extremely simple to use and yet infuriatingly, devilishly difficult to tune and tweak.

When I say this app dwarfs iTunes and makes Visual Studio feel like Notepad, I'm not kidding, folks. I admit that the app deals with huge amounts of data due to its "enterprise" nature. Verbose logging is done in triplicate, to the hard drive, the database, and a logging service, which in a manner of speaking could be considered necessary, since each log records different variations of the same events for different audiences. No data is ever deleted, and even our sparse test data set has bloated the client database to over 9 GB, though most of this is transactional logging (exporting the same data sans transactional data yields a file that weighs only 300 MB). Myriad small things have crept into the design and architecture of the application to meet very specific and sometimes narrow-minded business requirements assembled by different business units within the same client organization, requirements that are often impossible, costly, or just painful to implement because they conflict with, or even duplicate, one another. Lastly, the rigid and literal interpretation of design documents by the testing teams, combined with unrealistically short turnaround times for bug fixes, has resulted in many suboptimal solutions to problems that never existed. I'll skip over the half dozen or so integrations with third-party systems that are brittle at the best of times and non-functional the rest, but I hope I've drawn a clear picture of the problem.

Tuesday, September 13, 2011

A (Huge) Needle In A Bigger Haystack

Microsoft's Sync Framework is a fragile creature, and as such doesn't like being abused with data. Today I ran into a problem with our app trying to sync a record containing a field of 8,266 characters, which is over the (apparent) 8,192-character limit. To simplify the problem of finding the offending record among millions scattered across hundreds of tables with dozens of columns each, I first hit Google like any lazy coder.

This thread got me started by giving me a base query for finding the longest value in a single table's columns:
http://www.dbforums.com/microsoft-sql-server/1069468-possible-get-max-length-text-field.html

...but I needed more.

So I wrapped that thing in another cursor and came up with this:

DECLARE @table SYSNAME, @field SYSNAME

-- Collects the longest value found in each matching column of each table
CREATE TABLE #Temp (TableName SYSNAME, ColumnName SYSNAME, MaxLength INT)

DECLARE b_cursor CURSOR STATIC
FOR
SELECT name FROM sys.tables WHERE name NOT LIKE '%tracking'  -- customize table selection here
OPEN b_cursor
FETCH NEXT FROM b_cursor INTO @table
WHILE @@FETCH_STATUS = 0
BEGIN
    DECLARE a_cursor CURSOR STATIC
    FOR
    SELECT name FROM sys.columns WHERE object_id = OBJECT_ID(@table) AND name LIKE '%history%'  -- customize column selection here
    OPEN a_cursor

    FETCH NEXT FROM a_cursor INTO @field
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Dynamic SQL: record the longest value in this column of this table
        INSERT INTO #Temp
        EXEC ('SELECT ''' + @table + ''', ''' + @field + ''', MAX(LEN(' + @field + ')) FROM ' + @table)
        FETCH NEXT FROM a_cursor INTO @field
    END

    CLOSE a_cursor
    DEALLOCATE a_cursor

    FETCH NEXT FROM b_cursor INTO @table
END
CLOSE b_cursor
DEALLOCATE b_cursor

SELECT * FROM #Temp ORDER BY MaxLength DESC
DROP TABLE #Temp

-- select from a table to see the record with the largest field
SELECT * FROM (SELECT Id, LEN(History) AS len FROM WorkOrders) AS a ORDER BY len DESC

The guts of it is the EXEC statement, which queries each column of each table for its longest value. Note the comments marking where you can customize the queries to filter tables and columns by name. At the bottom is a query that shows how to pull the records from a suspect table and sort them by field size. Enjoy!

** Note: If a previous run died halfway through and it complains about the table #Temp already existing, just run the DROP TABLE statement near the end by itself.
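
For the curious, the same hunt can be expressed without nested cursors by generating all of the per-column queries from the catalog views in a single pass and executing them as one batch. Here's a minimal sketch, assuming the same '%tracking' and '%history%' filters as above:

-- Build one MAX(LEN(...)) query per matching column, then run the whole batch
DECLARE @sql NVARCHAR(MAX)
SET @sql = N''

SELECT @sql = @sql
    + 'SELECT ''' + t.name + ''' AS TableName, ''' + c.name + ''' AS ColumnName, '
    + 'MAX(LEN(' + QUOTENAME(c.name) + ')) AS MaxLength '
    + 'FROM ' + QUOTENAME(t.name) + ';' + CHAR(10)
FROM sys.tables AS t
JOIN sys.columns AS c ON c.object_id = t.object_id
WHERE t.name NOT LIKE '%tracking'  -- customize table selection here
  AND c.name LIKE '%history%'      -- customize column selection here

EXEC sp_executesql @sql

Each SELECT comes back as its own result set, so this trades the temp table for a pile of result grids; pick your poison.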

Friday, February 12, 2010

Query Dynamics GP Configuration

Often in the course of a customer implementation I've found myself thinking "Why don't they keep a list of their important GP setup?" I know the answer to that, but that's another post. I'd really like something like this:


System  SQL Server:                                    9.00.1399.06, RTM, Developer Edition (64-bit)
TWO     eConnect Version:                              10.0.0.0
TWO     Company Name:                                  Fabrikam, Inc.
TWO     Sitename:                                      Two , Inc
TWO     Multi-currency:                                True
TWO     Functional Currency:                           Z-US$
TWO     Functional Currency Format:                    #,##0.00
TWO     Functional Currency Decimals:                  2
TWO     Default Site:                                  NORTH
TWO     Multiple Bins:                                 True
TWO     Qty Shortage Option:                           Back Order Balance
TWO     Extended Pricing:                              N\A
TWO     SOP Setup Options: Auto-Assign Lot Nums:       False
TWO     SOP Setup Options: Auto-Assign Serial Nums:    True
TWO     Prices Not Required In Price List:             True
TWO     Pickup Ship Methods:                           PICKUP,
TWO     Deliver Ship Methods:                          COURIER, DROP SHIP, EXPRESS MAIL, OVERNIGHT, LOCAL DELIVERY, MAIL, TWO DAY, GROUND, INTERNATIONAL, NEXT DAY,
TWO     "Use separate Fulfillment Process" Checked:    FUL, SPECORD, ORDMAN,
TWO     "Use separate Fulfillment Process" Un-Checked: PHNORD, REPORD, STDORD, RMAORD, DISTRIBUTION,

Well, I sat down and wrote a SQL script to show me exactly that! Here it is; hope you find it helpful too.



use TWO

DECLARE @dbname char(30)
set @dbname = DB_NAME()

-- Results table: one row per configuration item
DECLARE @spGPSettings TABLE (
    DBName char(30),
    Config char(50),
    Value char(500)
);

declare @tempstr varchar(1024)

-- SQL Server Version
insert into @spGPSettings
SELECT 'System', Config='SQL Server:', Value=rtrim(convert(char, SERVERPROPERTY('productversion'))) + ', ' + rtrim(convert(char, SERVERPROPERTY ('productlevel'))) + ', ' + rtrim(convert(char, SERVERPROPERTY ('edition')))

-- eConnect
declare @spTempTable table (DatabaseName char(20), Version char(20), CompanyName char(50));
insert into @spTempTable
exec DYNAMICS..taeConnectVersionInfoDYNAMICS
insert into @spGPSettings
select @dbname, Config='eConnect Version:', Value=convert(char, Version) from @spTempTable where DatabaseName = @dbname
insert into @spGPSettings
select @dbname, Config='Company Name:', Value=convert(char, CompanyName) from @spTempTable where DatabaseName = @dbname

-- company
insert into @spGPSettings
select @dbname, Config='Sitename:', Value=isnull(rtrim(SITENAME), '') from DYNAMICS..SY03500

-- Multi Currency
insert into @spGPSettings
select @dbname, Config='Multi-currency:',
    Value=case when exists (select 1 from CM00100 where CURNCYID <> '') then 'True' else 'False' end

-- Currency Setup
insert into @spGPSettings
select
@dbname, Config='Functional Currency:',
Value=rtrim(isnull(funcCurr.CURNCYID, ''))
from MC40000 as currSetup (nolock)
left join DYNAMICS..MC40200 as funcCurr (nolock) on funcCurr.CURNCYID=currSetup.FUNLCURR

insert into @spGPSettings
select
@dbname, Config='Functional Currency Format:',
-- Builds a display mask like #,##0.00 (DECPLCUR is stored as decimal places + 1)
Value=cast(replace(rtrim('#,##0 ' + substring('00000',1, isnull((select top 1 DECPLCUR-1 from DYNAMICS..MC40200 as a join MC40000 as b on a.CURRNIDX=b.FUNCRIDX), 2))), ' ', '.') as varchar(15))
from MC40000 as currSetup (nolock)
left join DYNAMICS..MC40200 as funcCurr (nolock) on funcCurr.CURNCYID=currSetup.FUNLCURR

insert into @spGPSettings
select
@dbname, Config='Functional Currency Decimals:',
Value=cast(isnull(funcCurr.DECPLCUR ,3)-1 as char)
from MC40000 as currSetup (nolock)
left join DYNAMICS..MC40200 as funcCurr (nolock) on funcCurr.CURNCYID=currSetup.FUNLCURR

-- default warehouse
insert into @spGPSettings
select @dbname, Config='Default Site:', Value=LOCNCODE from SOP40100


-- multiple bins
insert into @spGPSettings
select @dbname, Config='Multiple Bins:', Value=case ENABLEMULTIBIN when 1 then 'True' else 'False' end from IV40100

-- Qty Shortage Option
insert into @spGPSettings
select @dbname, Config='Qty Shortage Option:', Value=case DFQTYSOP when 1 then 'Override Shortage' when 2 then 'Sell Balance' when 3 then 'Back Order Balance' when 4 then 'Back Order All' when 5 then 'Cancel Balance' when 6 then 'Cancel All' when 7 then 'Distribute' end from SOP40100

-- Extended Pricing
insert into @spGPSettings
select @dbname, Config='Extended Pricing:',
    Value=case when exists (select 1 from DYNAMICS..SY05501 where EXPRINST is not null
                            and CMPANYID=(select CMPANYID from DYNAMICS..SY01500 (nolock) where INTERID=@dbname))
               then 'Installed' else 'N\A' end

-- sop setup: auto-assign l/n, s/n
insert into @spGPSettings
select @dbname, Config='SOP Setup Options: Auto-Assign Lot Nums:', Value=case SOPALLOW_3 when 1 then 'True' else 'False' end from SOP40100
insert into @spGPSettings
select @dbname, Config='SOP Setup Options: Auto-Assign Serial Nums:', Value=case SOPALLOW_2 when 1 then 'True' else 'False' end from SOP40100

-- Prices_Not_Required_In_Pr
insert into @spGPSettings
select @dbname, Config='Prices Not Required In Price List:', Value=case Prices_Not_Required_In_Pr when 1 then 'True' else 'False' end from SOP40100 (nolock)

-- Shipping methods (the trailing comma is left in the list)
set @tempstr = ''
select @tempstr = isnull(@tempstr, '') + rtrim(SHIPMTHD) + ', '
from SY03000 where SHIPTYPE = 0
insert into @spGPSettings
select @dbname, Config='Pickup Ship Methods:', Value=@tempstr
--
set @tempstr = ''
select @tempstr = isnull(@tempstr, '') + rtrim(SHIPMTHD) + ', '
from SY03000 where SHIPTYPE = 1
insert into @spGPSettings
select @dbname, Config='Deliver Ship Methods:', Value=@tempstr

-- Docs with separate fulfillment process
set @tempstr = ''
SELECT @tempstr = isnull(@tempstr, '') + rtrim(DOCID) + ', '
FROM SOP40200 WITH (NOLOCK) WHERE SOPTYPE = 2 and USPFULPR = 1
insert into @spGPSettings
select @dbname, Config='"Use separate Fulfillment Process" Checked:', Value=left(@tempstr, 500)
--
set @tempstr = ''
SELECT @tempstr = isnull(@tempstr, '') + rtrim(DOCID) + ', '
FROM SOP40200 WITH (NOLOCK) WHERE SOPTYPE = 2 and USPFULPR = 0
insert into @spGPSettings
select @dbname, Config='"Use separate Fulfillment Process" Un-Checked:', Value=left(@tempstr, 500)

select * from @spGPSettings

Saturday, October 24, 2009

Upgrade Woes

First blog post! My goal is to make this an informative and occasionally amusing account of the perils and pitfalls of web and database programming. I work at a small software company where I do a bit of everything, so what better source of inspiration is there?

Just recently, a client of ours requested a software update to address a certain issue. The update went smoothly enough, the biggest step being the update to their database. We have a utility that automates a Red Gate compare and applies updates based on the differences. Usually this works well and requires only a bit of coaxing once in a while. The problem is that while the update seemed normal, the utility isn't smart enough to notice customer-specific changes to the old version, so it overwrites any existing database object with the new version.

Soon, the "bug reports" started rolling in...

To make a long story short, I can't help but wonder if there is a better way to extend a data layer with customer-specific changes. We already have a mechanism by which we extend a few key stored procedures, but it's only an 'if exists, run XYZ' call to an extender proc (see the sketch below). These extender procedures are ignored during an update and are thus safe from overwriting, but they are of somewhat limited value for coupling in additional functionality. I have begun looking at borrowing a few pages from the OO playbook for this problem. Specifically, I think some good ol' polymorphism would allow customers to extend our product in a safe way.
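
For illustration, here's a minimal sketch of that extender-proc pattern; the proc names are hypothetical, not from our actual product:

-- Hypothetical core proc, shipped with the product and overwritten on every update
CREATE PROCEDURE dbo.spCreateOrder
    @OrderId int
AS
BEGIN
    -- ... core product logic here ...

    -- Hook: if the customer has created an extender proc, call it.
    -- The extender is excluded from the automated compare, so it survives updates.
    IF OBJECT_ID('dbo.spCreateOrder_Ext', 'P') IS NOT NULL
        EXEC dbo.spCreateOrder_Ext @OrderId = @OrderId;
END

The weakness, as mentioned, is that the extender can only bolt extra work onto the core proc; it can't replace or reshape the core logic itself, which is where something resembling polymorphism would come in.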