Configuring Wireless


Friday, August 16, 2013

Designing and Building your CMDB, Part 1: From System Description To Configuration Items

Posted on 3:33 AM by Unknown
This series of posts is going to be a slight departure from normal in that it won't be showing you any code. We are going through the process of designing a CMDB (that's a Configuration Management Database) to hold details for all the systems (500+) that we administer. The point of this post is to show you, by means of an example, the sort of questions you should be asking yourself when you put together a CMDB.

So let's start with a description of a system;

"System X is a fairly simple VB.NET solution deployed using IIS and installed on the server WIN005. It consists of two applications: a User Interface (open to all users) and an Admin Interface (available only to a few).

The Admin Interface works on a vanilla IIS install, but the User Interface requires the installation of the Visual Studio 2010 Tools for Office Runtime.

The installation files for the software are located on WIN080\Software (a file share) as are the bi-monthly patch files that are applied.

At the back end the database, SYSXDB, is SQL Server 2008R2 and is held on a SQL Server cluster called SQLC001.

The application uses Active Directory for authentication, and the User Interface renders some information  from Google Maps to which it requires access.

The users of the solution are spread across two countries: France and the United Kingdom. We have internally configured the system so that in the UK users know the solution as 'InfoMaps' and in France it's known as 'LocalMaps'."

I'm sure there are parts of that description you'll recognise from systems you've worked on. As you can see, despite it being only a fairly simple VB.NET website with a couple of plugins, there is already quite a lot of information here to capture in our CMDB. If we take this structure and put it into Visio then, as a system overview, we get something like this;

System X: An Overview
Now for most small organisations this is probably 95% of the information they're ever going to need. If you're a small company, aren't expecting to significantly increase in size, and aren't planning on managing hundreds of systems across the globe, then you can make do - let's be honest, we all have our "go to guy" for a particular system, and so long as they're not on holiday (or haven't left!) they can keep the system ticking over quite happily from both the users' and management's perspective.

The problem comes when you don't just have one system, or a few, but tens of systems like this, each of which takes time to administer. Suddenly your team of three or four software engineers doesn't have any time to build anything new because they're too busy keeping the systems the business already relies on working.

Once you approach this level you need to significantly increase the quality of information you are holding on each system; you stop needing "Bob" to fix your problem but instead you need "someone who knows IIS" or "someone who can fix SQL Server". If all the knowledge is in Bob's head then Bob quickly becomes a bottleneck through which all issues have to pass - this isn't good for Bob (although he might think in the short term that it is!) and it's certainly not good for the company or the users.

So let's go back to the description for System X and look for all the items in the configuration that we might want to store information on in our CMDB. Each of these items will become a Configuration Item (CI) in the CMDB. Looking at the system description, it's fairly easy to just pick things out;
  • IIS
  • WIN005
  • User Interface
  • Admin Interface
  • Visual Studio 2010 Tools for Office Runtime
  • WIN080\Software
  • SYSXDB
  • SQLC001
  • Active Directory
  • maps.google.com
This is a fairly long list, but it is only part of the story. We (as IT professionals) then need to take this list and add the non-obvious things that will help us troubleshoot the system when there's a problem six months after it's gone live and we've all moved on to other projects. Again, there is no easy way to do this and you're heavily reliant on vendors to provide "full and complete" information.

The sort of questions that need to be picked out from the system description are: have both applications been installed into the same Application Pool in IIS? Is the Application Pool running as a local user or is it using network credentials? How are we connecting to the database? Are users typing in http://win005 to access the site or have we set up DNS entries (http://infomaps for example)? How are we deciding if a user has access to the Admin Interface? Etc.

So let's assume someone technical has gone through the system, had the discussions with the vendor, and found out not just how everything is connected but how it's configured. Here's the list of things we might like to consider turning into CIs, in addition to the ones we've already identified;
  • Application Pool: SystemXUserInterface (Installed on WIN005)
  • Application Pool: SystemXAdminInterface (Installed on WIN005)
  • SYSTEMXSERVER (Active Directory account used by both Application Pools and by SQLC001 to grant access to SYSXDB)
  • "UK InfoMaps Standard Users" (Active Directory group, used by "System X User Interface")
  • "FR LocalMaps Standard Users" (Active Directory group, used by "System X User Interface")
  • "UK InfoMaps Administrators" (Active Directory group, used by "System X Admin Interface")
  • "FR LocalMaps Administrators" (Active Directory group, used by "System X Admin Interface")
  • DNS Entry: LocalMaps.ourcompany.org (Maps to WIN005)
  • DNS Entry: InfoMaps.ourcompany.org (Maps to WIN005)
  • SMTP.ourcompany.org (Used by System X Admin Interface to send email notifications)
  • Firewall Ports: 80,443 (Required for access to WIN005)
  • VT001 (Hyper-V server hosting WIN005 - a virtual server)
Now this list is looking a little more comprehensive!
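
How you actually store these CIs is a design decision in its own right, but to make the idea concrete: if the CMDB ends up in a relational database, a minimal sketch might be two tables - one for the CIs themselves and one for the relationships between them. All the names here are illustrative assumptions, not something this post prescribes;

CREATE TABLE CONFIGURATION_ITEM (
  ci_id        NUMBER PRIMARY KEY,
  ci_name      VARCHAR2(100) NOT NULL,   -- e.g. 'WIN005', 'SYSXDB', 'SQLC001'
  ci_type      VARCHAR2(50),             -- e.g. 'Server', 'Database', 'AD Group'
  description  VARCHAR2(500)
);

CREATE TABLE CI_RELATIONSHIP (
  parent_ci_id  NUMBER REFERENCES CONFIGURATION_ITEM,
  child_ci_id   NUMBER REFERENCES CONFIGURATION_ITEM,
  relationship  VARCHAR2(50)             -- e.g. 'hosts', 'depends on', 'grants access to'
);

With that sort of shape, "VT001 hosts WIN005" and "WIN005 hosts Application Pool: SystemXUserInterface" become queryable rows rather than tribal knowledge.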

But how do we know we've captured everything? Or even captured enough detail for us to be able to properly support the system after we've put it in?

In Part 2 we'll look at "testing" our configuration to try and identify the gaps.
Posted in cmdb, configuration items, configuration management database

Tuesday, August 13, 2013

PL/SQL: Dynamically Building Your Data Archive

Posted on 7:53 AM by Unknown
The purpose of this blog post is just to outline a design I put together as part of an internal project for dynamically building a data archive using rules based on the source data being fed into the system. It's far from complete, but I think it highlights an interesting way of building an archive for your data when, at design time, you don't know exactly what data you will be putting into it.

THE PROBLEM
At the moment, in order to put data from various sources into the data archive, a multitude of different loading programs are used (SSIS, command-line applications, scripts, etc.), each of which uses its own rules to determine where the source data ends up (largely dependent on what rules the developer used when putting it together), and the inter-dependencies are largely invisible.

New feeds are added at a rate of one every other month, and the system should cope with this while keeping track of the dependencies in the database.

DESIGNING THE SOLUTION
In essence, the problem this solution is trying to solve is to provide a single point of entry into the data archive: you put your source data in, and a pre-specified set of rules determines where the data ends up and what format it's in.

A simple diagram for the system is;
System Diagram
The specific bit that is "in scope" for this work is the "LOAD Process". How data gets into the DATASOURCE tables is really dependent on where the data is coming from, what format it's in, etc., and it's practically impossible to write something so universally generic that it covers every possible option from a CSV text file to a database link.

The aim of the solution will be to process the data as it arrives but it's possible that it could be adapted to work with data in batches.

THE PROPOSAL
I've created a fairly simple structure using the tables;
  • SOURCEDATATYPE - This holds a unique reference and description for each different data source
  • STAGINGOUTPUT - This table holds the raw data as loaded into the database from the external feed (I went with this name in case it's necessary to use staging tables for the IMPORT process to manipulate the data prior to it being loaded via the LOAD process)
  • ENTITY - This is the name for a table that is being created as part of the LOAD process in the Data Archive.
  • ENTITYDETAIL - This table contains information on how the data from the source table should be manipulated before being moved into the ENTITY table.
Here's a simple data structure;
Database Structure
As you can see, it's pretty self-explanatory.
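
The post doesn't include the DDL, but a minimal sketch consistent with the description might look like the following. The generic holding columns (DATE01, NUMBER01, NUMBER04, TEXT01) come from the expressions used later in this post; the keys and the PROCESSED flag are illustrative assumptions;

CREATE TABLE SOURCEDATATYPE (
  source_type_id  NUMBER PRIMARY KEY,      -- unique reference for the feed
  short_code      VARCHAR2(10) NOT NULL,   -- prefixes the generated table names
  description     VARCHAR2(200)
);

CREATE TABLE STAGINGOUTPUT (
  source_type_id  NUMBER REFERENCES SOURCEDATATYPE,
  date01          DATE,                    -- generic holding columns referenced
  number01        NUMBER,                  -- by the expressions (SO.DATE01, etc.)
  number04        NUMBER,
  text01          VARCHAR2(200),
  processed       CHAR(1) DEFAULT 'N'      -- assumed flag driving the LOAD process
);

CREATE TABLE ENTITY (
  entity_id       NUMBER PRIMARY KEY,
  source_type_id  NUMBER REFERENCES SOURCEDATATYPE,
  entity_name     VARCHAR2(30) NOT NULL    -- prefixed with SHORT_CODE when built
);

CREATE TABLE ENTITYDETAIL (
  entity_id               NUMBER REFERENCES ENTITY,
  source_type_id          NUMBER,
  sequence_no             NUMBER,
  column_name_expression  VARCHAR2(4000),
  row_unique_expression   VARCHAR2(4000),
  value_expression        VARCHAR2(4000),
  on_create_type          VARCHAR2(30),
  on_create_default       VARCHAR2(100)
);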

Once you've configured the data source type and entity details, you're ready to start loading data.

In order to load the database I've created a package called DW_LOADDATA. This has two routines;
  • ProcessAll, and
  • ProcessRow (p_rowID ROWID)
Basically "ProcessAll" loops through the unprocessed rows and passes them one at a time to the "ProcessRow" routine.

The "ProcessRow" routine performs the following steps;
  • Get the new record from STAGINGOUTPUT
  • Identify the ENTITY/ENTITYDETAIL for the feed specified in the STAGINGOUTPUT record
  • Check to see if the ENTITY exists - if not create it.
  • Work out the column name, and if that doesn't exist as part of the ENTITY create it
  • Does a value already exist? If so update it (using MERGE), otherwise INSERT the new value
  • Mark the STAGINGOUTPUT record as processed
Sounds simple? Well, it's less than 150 lines of code including comments and formatting ...
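
The package body isn't reproduced here, but a skeleton consistent with the steps above might look like this (the PROCESSED flag and the cursor query are assumptions, and the dynamic DDL and MERGE are only sketched in comments);

CREATE OR REPLACE PACKAGE DW_LOADDATA AS
  PROCEDURE ProcessAll;
  PROCEDURE ProcessRow(p_rowID ROWID);
END DW_LOADDATA;
/

CREATE OR REPLACE PACKAGE BODY DW_LOADDATA AS
  PROCEDURE ProcessAll IS
  BEGIN
    -- Hand each unprocessed staging row to ProcessRow, one at a time
    FOR r IN (SELECT ROWID AS row_id
                FROM STAGINGOUTPUT
               WHERE processed = 'N') LOOP
      ProcessRow(r.row_id);
    END LOOP;
  END ProcessAll;

  PROCEDURE ProcessRow(p_rowID ROWID) IS
  BEGIN
    -- 1. Get the new record from STAGINGOUTPUT
    -- 2. Identify the ENTITY/ENTITYDETAIL rows for the feed it belongs to
    -- 3. If the entity table doesn't exist, CREATE TABLE via EXECUTE IMMEDIATE
    -- 4. Evaluate column_name_expression; if the column doesn't exist,
    --    ALTER TABLE ... ADD using on_create_type/on_create_default
    -- 5. MERGE the value in (update if the row exists, insert if not)
    -- 6. Mark the STAGINGOUTPUT record as processed
    NULL;  -- real logic omitted
  END ProcessRow;
END DW_LOADDATA;
/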

The key is the information in the ENTITY/ENTITYDETAIL tables. For example, let's suppose I'm loading sales data and I want to create an ENTITY called SUPPLIER_SALES_BY_MONTH with separate columns for each month of data.

In the ENTITY table I'd create a simple record with the name of the new ENTITY (bearing in mind the actual name of the table will be prefixed with the Short_Code from the SOURCEDATATYPE table) and then in the ENTITYDETAIL table create the following rows;

INSERT INTO ENTITYDETAIL
SELECT 1, 1, 2,
  '''PERIOD_'' || TO_CHAR(SO.DATE01, ''YYYYMM'')', -- column_name_expression
  'SO.NUMBER01', -- row_unique_expression
  'OLD.VALUE = NVL(OLD.VALUE, 0) + SO.NUMBER04', -- value_expression
  'NUMBER', -- on_create_type
  '0' -- on_create_default
FROM DUAL
UNION SELECT 1, 1, 1,
  '''SUPPLIER_NAME''', -- column_name_expression
  'SO.NUMBER01', -- row_unique_expression
  'OLD.VALUE = SO.TEXT01', -- value_expression
  'VARCHAR2(80)', -- on_create_type
  '0' -- on_create_default
FROM DUAL


I know "INSERT INTO ..." probably isn't the best way to do this but this is only an example!

As you can see, the column_name_expression is looking at the SO (STAGINGOUTPUT) table and formatting the first date as YYYYMM - so a value of 13-JAN-2013 will create/update the column PERIOD_201301.

The value (for the supplier) is being updated to add on the sales for that month.

The second column that's created is the SUPPLIER_NAME - this is simply the name of the supplier. If I run this using some random test data I end up with a table that looks like;
Generated Table
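
Based on the two ENTITYDETAIL rows above, the generated table (SAL_SUPPLIER_SALES_BY_MONTH once the Short_Code prefix is applied) has a SUPPLIER_NAME column plus one PERIOD_YYYYMM column per month of data - with purely illustrative values it would look something like;

SUPPLIER_NAME    PERIOD_201301    PERIOD_201302
Supplier A               1250              980
Supplier B                310               75
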
I've created a script which creates the objects and loads some simple test data. It's available here (via Google Drive - DO NOT RUN IT IN AN EXISTING DATABASE SCHEMA UNLESS YOU WANT OBJECTS STARTING WITH SAL_ TO BE DROPPED!). You'll need to have set up a user with default tablespace permissions in order to get the script to work.

Let me know in the comments if you find this useful.
Posted in pl/sql

Friday, June 21, 2013

Testing SMTP Connections Using Telnet

Posted on 7:32 AM by Unknown


So here’s a quick list of commands that will test an SMTP connection. The first thing to do is to make sure that you have used “Run as” to start the command window. Then type;

telnet [server name] 25

helo [server name]

mail from:

rcpt to:

data

Some sort of random text you want to see in the email body …

.

Here’s the test output;
Telnet/SMTP Sample Output
The most common error you'll get is a relay failure - to fix this you just need to make sure the "from" email address is internal to the organisation hosting the SMTP server (for example, use Gmail accounts if using Google's SMTP server, etc.).
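
For reference, a successful exchange looks something like this. The addresses are placeholders and the response wording varies by server, but the numeric codes are standard SMTP;

telnet mail.example.com 25
220 mail.example.com ESMTP Service ready
helo mail.example.com
250 mail.example.com Hello
mail from:<sender@example.com>
250 2.1.0 Sender OK
rcpt to:<recipient@example.com>
250 2.1.5 Recipient OK
data
354 Start mail input; end with <CRLF>.<CRLF>
Some sort of random text you want to see in the email body ...
.
250 2.6.0 Message accepted for delivery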
Posted in smtp, telnet

Thursday, May 16, 2013

SSRS: Searching The Reporting Database - Which Reports Include Subreport XXX?

Posted on 3:36 AM by Unknown
I've been tasked with splitting several existing reports into two (one for one set of users, one for a different set). While I looked at using Linked Reports, unfortunately the software that actually pushes the reports out to the end-users doesn't support them.

There also doesn't seem to be a "Dependencies" link which would allow me to see which reports are dependent on the sub-report I've been asked to change.

Digging through the various SQL examples that are out there, there didn't seem to be anything that did exactly what I was after *without* making it unnecessarily complicated.

Here's the SQL I ended up with;

SELECT *
  FROM (SELECT *,
               -- strip the UTF-8 byte order mark (0xEFBBBF) from the RDL
               -- content before searching it
               CASE
                 WHEN LEFT(CONVERT(varbinary(max), Content), 3) = 0xEFBBBF
                   THEN CONVERT(varbinary(max),
                                SUBSTRING(CONVERT(varbinary(max), Content),
                                          4,
                                          LEN(CONVERT(varbinary(max), Content))))
                 ELSE CONVERT(varbinary(max), Content)
               END AS ContentXML
          FROM Catalog C) AS C
 WHERE C.ContentXML LIKE '%Subreport%'
   AND C.ContentXML LIKE '%SUB_REPORT_NAME%'
   AND C.Path LIKE '/SUB_REPORT_FOLDER/%'


The point of including the sub-report folder is to only pick up items in a single folder (or sub-folder) as we have PROD, DEV, and TEST all on the same server (in different folders).

Hope this saves you the time it took me sorting it out!
Posted in dbo.Catalog, ssrs

Wednesday, May 1, 2013

Excel 2013: Getting Data From Parameterized SQL Query (vs SQL Server)

Posted on 3:50 AM by Unknown
I would have thought that dragging data from SQL Server into Excel (both Microsoft products) would be easy - and it is, if you're looking to drag in entire tables, views, etc. But if you want to do something a little more complicated with parameters, it becomes a lot harder and less intuitive to get it to work.

The example below shows how to get the ExecutionLogs from a SQL Server instance between two dates.
 
I'm going to use Excel 2013 as it's the latest version and the one I have to hand.

Create a blank workbook by selecting "Blank workbook" (which is usually the first option in the middle of the screen);
Excel 2013: New "Blank workbook" Tile
Select the "Data" page in the ribbon and then click on "From Other Sources" in the "Get External Data" part of the ribbon (on the left). Select "From Microsoft Query" (which should be the very bottom option);

Excel 2013: Data Page
NOTE: you may think selecting "SQL Server" is a slightly more obvious choice. However, this will not allow you to use parameterized SQL - it's just for direct export from tables or views (why that's the case is beyond me!).

This will then open the "Choose Data Source" dialog;

Excel 2013: Choose Data Source Dialog
This dialog clearly dates from an earlier version of Windows, and it's difficult to see why Microsoft didn't "update" it with the rest of the 2013 look-and-feel. I'm running Windows 7, but I have a sneaking suspicion that everyone from Windows XP onwards will be familiar with this dialog (although possibly not with the addition of "OLAP Cubes").

This dialog also isn't part of Excel; it's a separate application. Sure, Microsoft will score some marks for re-use of a standard Windows component, but the change in interface is jarring to say the least ... and it gets worse.

Leave "New Data Source" highlighted and click "OK";

Excel 2013: Create New Data Source Dialog
We seem to have slipped back to a pre-wizard era: we now have fields labelled 1 to 4. When we complete field 1, field 2 becomes available; on completing field 2, field 3 becomes available. This is jarringly different from the other dialogs within Excel 2013.

Anyway, populate fields 1 and 2 in the dialog, selecting "SQL Server" from the drop down (in mine it was at the very bottom). Then click "Connect ...";

Excel 2013: SQL Server Login
Enter the login information - "Use Trusted Connection" means use your already-authenticated (Active Directory) credentials. Once you've entered a Server, the "Options" box at the bottom right will become available; click on it;

Excel 2013: SQL Server Login Dialog - Extended
Use the "Database" drop down to select the database you wish to connect to. If you leave it as default it will pick the default database for your database user.

Click "OK".

Click "OK" again (on the "Create Data Source" dialog) - do not pick a table in the bottom drop down, we're going to use SQL with parameters.

The data source you just created should be selected (in the "Choose Data Source" dialog) so just click "OK".

You will then be presented with the "Query Wizard - Choose Columns" dialog;

Excel 2013: Query Wizard - Choose Columns
Now you'll notice that you can't do anything from this stage *except* select a table.

Click "Cancel" (at the bottom right);

Excel 2013: Microsoft Query Confirmation Dialog
 Click "Yes";

Excel 2013: Add Tables Dialog
We're not working with tables so click "Close";

Excel 2013: Microsoft Query
Click on the "SQL" button on the menu bar;

Excel 2013: Microsoft Query - SQL Dialog
Here is the SQL we are going to use;

SELECT
  EL.InstanceName,
  EL.ItemPath,
  EL.UserName,
  EL.ExecutionId,
  EL.RequestType,
  EL.Format,
  EL.Parameters,
  EL.ItemAction,
  EL.TimeStart,
  EL.TimeEnd,
  EL.TimeDataRetrieval,
  EL.TimeProcessing,
  EL.TimeRendering,
  EL.Source,
  EL.Status,
  EL.ByteCount,
  EL.AdditionalInfo
FROM ExecutionLog3 EL
WHERE EL.TimeStart >= ["Min Start Date"]
AND EL.TimeStart < ["Max Start Date"]
ORDER BY EL.TimeStart DESC


Enter the SQL and click "OK".

NOTE: There are a couple of "gotchas" here. The SQL is processed prior to being run and the processing isn't particularly flexible. If you use ANSI joins (JOIN ... ON ...) then at best you won't get the graphical interface, and at worst it just won't work. Equally, including square brackets [] seems to break the SQL, as does having "dbo." in front of the table name.

"Broken" SQL is usually identified by you being told that the SQL doesn't support the graphical interface. This is usually a prelude to a more obscure error.

Providing everything is working OK you'll see;

Microsoft Query: Sample Data
Click on the "Exit" button (fourth from the left at the top left).

This closes Microsoft Query and returns control to Excel. The "Import Data" dialog will now appear;

Excel 2013: Import Data Dialog
Change "=$A$1" to "=$A$4" (so we have a few lines for the parameter entry boxes) and click "OK";

Enter "Start Date" in A1 and "End Date" in A2 (just labels), and then two dates in B1 and B2 (these will be the from/to dates the report is run with);

Excel 2013: Parameter Values in Excel
Now we need to link up the cells we've used with the parameters in use in our query. Click on the "Data" tab in the ribbon and then "Connections";

Excel 2013: Connections
Select the connection and then click "Properties";

Excel 2013: Connection Properties
Click on the "Definition" tab;

Excel 2013: Connection Properties - Definition Tab
Click on the "Parameters" button at the bottom (if you have used the SQL Server option in Excel this is where you'd have the problem - "Parameters" would be permanently greyed out);

Excel 2013: Parameters
As you can see in the list there are two parameters, the two we created earlier in the SQL. Both are currently set to prompt us for values. Click on the "Get the value from the following cell" radio group and select the cell we have entered the Start Date in;

Excel 2013: Default Parameter Value
You can also check the "refresh automatically when cell value changes" box if you want to work that way.

Repeat the process with the Max Start Date Parameter.

Click "OK" (closes the Parameters dialog).

Click "OK" (closes the Connection Properties dialog).

Click "Close" (closes the Workbook Connections dialog).

Click "Refresh All" (in the ribbon).

And we're done! If this was useful for you, don't forget to leave a comment ...

Posted in excel, sql server

Tuesday, April 30, 2013

EasyPush: Creating Member Groups To Control iPad Experience

Posted on 6:34 AM by Unknown
This blog post covers how to set up and configure Member Groups to control the distribution of content within the EasyPush iPad Application.

The first step is to login to the administrative console (back-end);

EasyPush: Umbraco Admin Interface
Click on the "Members" section in the bottom left;

EasyPush: Members Section
The bit we're interested in is the node titled "Member Groups". If you click the triangle you'll see the groups already configured on the server. To create a new group right-click the node and select "Create";

EasyPush: Creating a New Members Group
Enter the name you're after and then click "Create".

The new group you've created will then be listed at the bottom of the nodes in the navigator and automatically selected (so you'll see the properties);

EasyPush: Member Group Properties
You'll notice that the only property is the name.

Now you need to configure Members to be part of your new group.

Click the triangle to expand the "Members" node in the Navigation View and then search for a user to add to your new group;

EasyPush: Members

Click on the "Properties" tab (1);

EasyPush: Member Properties (With Group Membership)
Select the group you want to add to the member in the list on the left (2) and then click the double-arrow to move it across (3).

Click "Save" to save the changes.

You now need to look at the other documents I've produced as part of my work with EasyPush (see the EasyPush Documentation Summary) and create some content that can only be deployed to the new group you've created, so you can test it.
Posted in easypush, iOS

Thursday, April 18, 2013

MobileIron: Removing a Deployed Application From iPads

Posted on 3:22 AM by Unknown
This is a quick guide to removing an application you are already deploying via MobileIron. It's useful if you need to remove an application urgently but don't want to delete it, or if, like us, you are using Active Directory groups to control deployment of the application but can't wait for AD replication to catch up before removing it.

Log into MobileIron, click on the APPS & CONFIGS tab, then "App Distribution", select "iOS" in the Platform drop down, and then click on the "App Name" column so that the applications are sorted in name order;
MobileIron: Deployed Applications
This gives you the list of all the applications (both in-house and recommended applications from the iTunes store). It's probably easier to do a search for the application you're looking to remove; I'm going to remove the "Accellion" application you can see above. It's better to search for the exact application you want to remove, as the last thing you want to do is remove just the *current* version of an application and roll all your users back to the previous version.

When you've done the search you will be presented with all the versions of the application you have in the system;

MobileIron: App Distribution - Multiple Versions
Select all the applications (click on the tick on the left), and then in the "Actions" menu select "Remove from Label";

MobileIron: Remove From Label dialog
The top item is labelled as "Partial" because the label is applied to one of the applications I've selected but not the other.

Select all the labels (check box at the top left) and then click "Remove".

You should now start to see the "Devices Installed" count decreasing as each device checks in and MobileIron does its stuff.
Posted in iOS, MobileIron

Tuesday, April 16, 2013

Lot Genealogy, Part 3: Debugging Lots and Batches

Posted on 8:20 AM by Unknown
As you might have noticed, I've just updated the LOT_GENEALOGY source code for this project to a new version following the discovery of a bug that appears where the product of a batch is two different item numbers. This is a rare occurrence in our system, which is why it went undetected for so long, but it might not be in yours.

Rather than dwell on that, I thought I'd talk about exactly what the process was that led to us discovering and fixing the error.

The first report came from the users: a lot which they knew had two item numbers was only showing one. Other lots didn't seem to be affected. This was then traced back to a single batch, 73591; running this SQL gave us the transactions that affected the cache for that batch. To publish this data I'm obfuscating it (see here);

SELECT LGG.TRANSACTION_ID,
       HIDEDATA.Hide('LGG.LN', LGG.LOT_NUMBER) Lot_Number,
       LGG.TRANSACTION_QUANTITY,
       LGG.TRANSACTION_TYPE_NAME,
       LGG.Item_Number,
       HIDEDATA.Hide('LGG.ID', LGG.Item_Description) Item_Description,
       LGG.ITEM_TYPE
  FROM LOT_GENEALOGY_GETTRANSACTIONS LGG
 WHERE 1 = 1
   AND LGG.Batch_No = '73591';

The result of this is the following transactions;

LG Transactions For A Single Batch
As you can see, the first thing we have is three "WIP Completion" transactions. Each of these represents the completion of the batch and the generation of an output - in this case, Item Numbers 073081 and 07440. However, without knowing what information was put *into* the batch in the first place, it's not possible to make use of this information in the genealogy - that has to wait until the "WIP Issue" transactions are processed (each of which represents an ingredient).

The next stage for debugging the cache was to (on a test system!) change the LOT_GENEALOGY_GETTRANSACTIONS view so that rather than looking at *everything* it only looks at the records for a single batch - this is simply done by adding the following where clause to the view SQL;

-- Restrict transactions to a single batch (for debugging)
AND GBH.Batch_No = '73591'

Now we've restricted the input records to just those affecting that batch, we can re-build the whole cache - it will take longer to do the delete than the insert. The script to do it is;

begin
  -- Call the procedure
  lot_genealogy.refreshwholecache;
end;


Once that has been completed the entire content of the cache is the single batch we're tracking. The SQL to show it is;

SELECT HIDEDATA.Hide('LGG.LN', LGW.Master_Lot_Number) Master_Lot_Number,
       HIDEDATA.Hide('LGG.LN', LGW.Ingred_Lot_Number) Ingred_Lot_Number,
       LGW.Ingred_Item_Number,
       HIDEDATA.Hide('LGG.ID', LGW.Ingred_Item_Description) Ingred_Item_Description,
       LGW.Ingred_Item_Type_Code,
       LGW.Batch_Number,
       HIDEDATA.Hide('LGG.LN', LGW.PRODUCT_LOT_NUMBER) PRODUCT_LOT_NUMBER,
       LGW.PRODUCT_ITEM_NUMBER,
       HIDEDATA.Hide('LGG.ID', LGW.Product_Item_Description) Product_Item_Description,
       LGW.Product_Item_Type_Code
  FROM LOT_GENEALOGY_WHEREUSED LGW
 WHERE 1 = 1


The result of this is;

Lot Genealogy Cache: Result for a Single Batch
As you can see, the only product for Ingredient Lot LGG.LN 37 is Item Number 073002. If we look at the transactions earlier we can see that it should be reporting Item Number 07440 as well - it's not, which means something is going wrong in the "WIP Issue" part of the cache processing.

If we look at the source code (available here - via Google Drive) you'll see that the final part of the WIP Issue processing creates the records in LOT_GENEALOGY_BATCH_PRODUCT, so the next stage is to check whether these records are being created correctly. Here is the SQL;

SELECT LGBP.batch_number,
       HIDEDATA.Hide('LGG.LN', LGBP.PRODUCT_LOT_NUMBER) PRODUCT_LOT_NUMBER,
       LGBP.product_item_number,
       HIDEDATA.Hide('LGG.ID', LGBP.Product_Item_Description) Product_Item_Description,
       LGBP.product_item_type_code
  FROM LOT_GENEALOGY_BATCH_PRODUCT LGBP
 WHERE 1 = 1
   AND LGBP.Batch_Number = '73591'


This gives us the result;

Lot Genealogy Batch Products
This shows us that the correct batch products are being recorded - this is important, as we now have both ends of the transaction: the correct transactions are going in and the correct products are coming out. However, we also know that the cache isn't being updated correctly, so the culprits must be the two pieces of SQL that are actually doing the inserts.

If you look at the first one;

INSERT INTO NOETIX_SYS.Lot_Genealogy_WhereUsed
    SELECT v_Transaction.Lot_Number, -- Master_Lot_Number
           v_Transaction.Lot_Number, -- Ingred_Lot_Number
           v_Transaction.Item$item, -- INGRED_ITEM$ITEM
           v_Transaction.Item_Description, -- INGRED_ITEM_DESCRIPTION
           v_Transaction.Item_Type_Code, -- INGRED_ITEM_TYPE_CODE
           v_Transaction.Batch_Number, -- Batch_Number
           v_Product.Product_Lot_Number, -- Product_Lot_Number
           v_Product.PRODUCT_ITEM$ITEM, -- PRODUCT_ITEM$ITEM
           v_Product.PRODUCT_ITEM_DESCRIPTION, -- PRODUCT_ITEM_DESCRIPTION
           v_Product.PRODUCT_ITEM_TYPE_CODE -- PRODUCT_ITEM_TYPE_CODE
      FROM DUAL
     WHERE NOT EXISTS
     (SELECT 1
              FROM NOETIX_SYS.Lot_Genealogy_WhereUsed LGI
             WHERE LGI.MASTER_LOT_NUMBER = v_Transaction.Lot_Number
               AND LGI.Ingred_LOT_NUMBER = v_Transaction.Lot_Number
               AND LGI.Batch_Number = v_Transaction.Batch_Number
               AND (v_Product.Product_Lot_Number IS NULL AND
                   LGI.PRODUCT_LOT_NUMBER IS NULL OR
                   v_Product.Product_Lot_Number = LGI.PRODUCT_LOT_NUMBER));


You can see that this is actually just a select from DUAL so the fact that it's only working on the first pass (i.e. for the first product) means that the offending part of the code must be the NOT EXISTS.

Looking at the WHERE clause in the sub-select reveals that it's not using the Product_Item_Number field. Not using this field means that after it's inserted the first product it mistakes the second one for a duplicate and skips over it.

Altering the final AND statement to;

AND (   (v_Product.Product_Lot_Number IS NULL AND
         LGI.PRODUCT_LOT_NUMBER IS NULL)
     OR (v_Product.Product_Lot_Number = LGI.PRODUCT_LOT_NUMBER AND
         v_Product.Product_Item_Number = LGI.PRODUCT_ITEM_NUMBER))

Making this change to BOTH pieces of SQL (the one updating all the existing records as well as this one, which creates a new master record) addresses the issue. If you rebuild the cache from scratch (using the script above) and re-run the SQL to get the content of the cache, you will now see;

Lot Genealogy Cache - Complete
The first thing you notice is that there are a lot more records. The new records are for Item Number 07440, which is the one we were missing earlier.

If you do a source comparison between the new and old versions of the package you'll notice that this wasn't the only change I made - I've added a great deal more logging to the WIP Issue transaction so it's possible to see what is going on and what "SQL%ROWCOUNT = 0" actually means! I also added a warning to WIP Completion if no records are updated.

I hope this is of some interest ... Debugging an application, even one you've written yourself, is a skill that's very difficult to transfer. Hopefully there is enough information here for people to at least make a start for future issues (which will, by their very nature, be completely different I'm sure!).
Posted in e-Business Suite, EBS, lot genealogy, Oracle, Oracle Process Manufacturing