2014-12-14

SAP SNC integration, Enter Powershell



For me, shell programming is mostly about automating simple IT admin tasks like reorganising database tables, compacting file systems etc. I'm not very good at doing the same thing twice, so I tend to write 'admin procedures' for many small tasks. Over the years I have used a lot of shell scripting languages. The first ones I used were IBM mainframe EXEC/CLIST (Command List) and JCL (Job Control Language). JCL is not normally regarded as a shell scripting language; anyway, that was many years ago and both these languages were pretty primitive/crappy. In all fairness, IBM has also constructed one great (shell) scripting language: REXX.

When you write a shell script you often need a good understanding and control of the environment and context where your script runs. Good interfaces to the underlying opsys and commands are essential, as are predefined constants for script name, opsys version, current working directory etc. My 'shell processes' often consist of one part maintaining configuration parameters and another part acting upon these parameters.
Now I'm creating an interface between SAP SNC and external ERP systems, a task where a shell scripting procedure may seem an odd choice, but I got some odd requirements: the interface must work on a standard Windows desktop and be very easy to distribute and handle, since the users of this interface are not expected to be IT-savvy. Anyway, I thought why not give Powershell a go and contain the entire interface in one script, for easy export/import. If you read the link above you know SAP SNC spits out a rather ugly CSV file and the receivers expect a decently structured XML file. I decided to do the file metamorphosis from CSV to XML in two steps: first transform the CSV file into a SAP SNC XML format, and then transform that XML file with an XSL script into the final XML format. It was surprisingly easy to create a draft. It only took some hours thanks to some powerful functions in Powershell, e.g. a one-liner to parse the complex SNC CSV file:
$olArray = Get-Content 'CSV-FILE-NAME' | Select-Object -Skip 11 | ConvertFrom-Csv
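To give an idea of the rest of the two-step pipeline, here is a rough sketch building on the $olArray one-liner above; the file names, the generic XML layout produced by ConvertTo-Xml and the receiver.xsl stylesheet are just illustrations, not my production script:

($olArray | ConvertTo-Xml -NoTypeInformation).Save((Join-Path $PWD 'snc.xml'))   # step 1: parsed CSV rows -> intermediate SNC XML file
$xslt = New-Object System.Xml.Xsl.XslCompiledTransform                           # step 2: intermediate XML -> final XML via XSLT
$xslt.Load((Join-Path $PWD 'receiver.xsl'))
$xslt.Transform((Join-Path $PWD 'snc.xml'), (Join-Path $PWD 'receiver.xml'))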

Another very pleasant surprise was the abundance of useful information I found on the web; I could pinch a lot of code directly from blog posts. I didn't expect that, since in the past I found free (as in free beer) Microsoft information rare.
It is simple to create popups for text input and file selection, which is of great value for configuration. 
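To give an idea, popups of this kind take only a few lines; the prompt text, variable names and the CSV filter below are made up for illustration:

Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName Microsoft.VisualBasic
$supplierId = [Microsoft.VisualBasic.Interaction]::InputBox('Enter your supplier id', 'Configuration', '')   # simple text input popup
$dialog = New-Object System.Windows.Forms.OpenFileDialog                                                     # standard file selection popup
$dialog.Filter = 'SNC CSV files (*.csv)|*.csv'
if ($dialog.ShowDialog() -eq [System.Windows.Forms.DialogResult]::OK) { $csvFile = $dialog.FileName }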

But there were some nasties too, and the worst is actually really bad: functions reply with all output produced, e.g.:
return 'OK'
does not necessarily return only 'OK'; all other output created in the function is also pushed onto the 'return stack'. This forces you to do some extraordinary cleansing after the call:
$returnValues = MyFunction "$Parameter"
$OKstring = $returnValues[-1]  # pop the last value off the 'return stack'
This is exacerbated by the fact that any object in the code that is not assigned or piped anywhere is automatically sent to standard output, making it appear on the 'return stack', which makes it hard to find everything pushed onto the return stack. I find this behavior very strange, and it took me some time to figure it out. I hope I'm wrong about this, or that there is a clear 'return stack' option like return -only 'OK'.
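Here is a small demonstration of the behaviour as I understand it, plus the workaround I ended up with (the function and variable names are made up):

function Get-Status {
    'some log message'               # uncaptured output ends up in the function's output too
    return 'OK'
}
$returnValues = Get-Status           # $returnValues is now an array: 'some log message', 'OK'
$OKstring = $returnValues[-1]        # pop the last value off the 'return stack'

function Get-StatusClean {
    $null = 'some log message'       # capturing (or piping to Out-Null) keeps it off the 'return stack'
    return 'OK'
}
$OKstring = Get-StatusClean          # now only 'OK' comes back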

The Powershell ISE hangs every now and then when I use popups, which also seem to be placed haphazardly on the screen depending on the Windows version.
All in all I find Powershell scripting a pleasant experience, a very powerful language although a bit quirky. I have only been coding for about 15 hours with no training upfront, so I might have got things wrong.

My script now takes the ugly SNC CSV file and turns it into the structured XML file the receivers expect.

I'm happy with my script. It is an easily configurable script able to run on most Windows systems, or so I hope. I have written the script with Powershell versions 3 and 4 in mind, but now I have realised version 2 is still very much in use, so I may have to back port my script to version 2.
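If a back port turns out to be needed, a simple guard at the top of the script could at least tell the user which engine it is running on; this is just a sketch of the idea:

if ($PSVersionTable -eq $null -or $PSVersionTable.PSVersion.Major -lt 3) {
    Write-Host 'This script is written for Powershell 3 or later; please upgrade or ask for the version 2 variant.'
    exit 1
}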

2014-12-09

Stop writing BOMs in UTF-8 encoded files


I’m writing my first Powershell script, which converts CSV files to XML files.
So far it has been an easy ride, but this kept me occupied for some hours.
I looked at my newly created XML file in Google Chrome:

Looks quite pretty, doesn't it?
Now look at the same file in Microsoft Internet Explorer:

Yes that's right, just blanks. Why is that?
This is a snippet from my Powershell script, I think the comments say it all:
# Out-File writes a %&@£€ BOM header
# transXSL "$L" "$xsl" | Out-File $out # DOES NOT WORK! since Out-File writes a %&@£€ BOM header
# So I have to write the file like this so MS Internet Explorer can read the file.

$myXML = transXSL "$L" "$xsl"
[System.IO.File]::WriteAllLines($out, $myXML)

# Google Chrome has no problems mit or mitout the %&@£€ BOM header
# Why in God's name does anyone force a BOM header onto UTF-8 files

The BOM character is valid in UTF-8 encoding, but just because you are allowed to litter does not mean you have to. Stop putting crap into UTF-8 files. And when you read a BOM in UTF-8 files, just throw it away and proceed with the next character.
I would like to have an Out-File option for writing UTF-8 files without a BOM.
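My workaround above relies on WriteAllLines writing UTF-8 without a BOM by default; if you want to spell that intent out explicitly (still .NET, not Out-File), it can be done like this:

$utf8NoBom = New-Object System.Text.UTF8Encoding($false)    # $false = do not emit a BOM
[System.IO.File]::WriteAllLines($out, $myXML, $utf8NoBom)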

Here you can see the bugger: the first characters are the BOM, confusing a lot of software.


2014-11-30

SAP SNC integration, part 1

Recently I have had the pleasure of digging into the SAP SNC (Supply Network Collaboration) download/upload interface. SNC is a portal for suppliers. I have no experience of SNC myself, but it is a portal where suppliers can log in and work in our SAP ERP; one key area is probably vendor managed inventory.
SAP SNC also supports up- and download of files for suppliers that do not want to enter transactions in real time but upload them in a file or download status reports etc. For this SAP SNC uses a rather complicated Comma Separated Value file format.


The company has many small suppliers that want to interface SAP SNC automatically with their own ERP systems. In order to do so they need to transform the CSV files of SNC into a format their ERP systems understand. And here is where we and the suppliers have a problem: they do not have an IT department that can transform SAP SNC's complicated CSV files into a format their ERP system understands. The SAP support guys at the company turned to me for help. We could go one of two ways: either do the transformation inside SAP SNC so the suppliers communicate with files in 'their' formats, or give the suppliers transformation routines. Creating the transformation inside SAP SNC would take a long time and be costly. I decided to try to build a transformation routine in the suppliers' MS Windows environment. I did not want to rely on software that is not standard in MS Windows; I wanted to use standard concepts and software as much as possible. These days the XML file format is a well-established standard and it seems most of our suppliers prefer XML formats. It is actually a mystery to me why SAP decided to go for CSV formats for these integration files; SAP is normally a strong proponent of XML integration, even when XML is not a good choice. XML can be very verbose, and if you send information about very many objects these streams may constipate your network. XML and SAP's own IDOC format look ideal to me for SNC integration. Maybe I have misunderstood something, but as it is now I have to deal with the CSV format; IDOCs would have been much easier to transform.
There is no standard for parsing and transforming CSV files, and certainly not the CSV files of SAP SNC, but there is an established and good standard for XML file transformation: XSLT. You can use XSLT to transform CSV files, but it is complicated and XSLT is not designed for that. I decided to build my own SAP SNC CSV file parser with tools available on a standard MS Windows PC. I did not want to invest too much time and effort in this solution, since I'm convinced SAP will redesign this interface in the not too distant future. I did not want to change my procedure for every 'supplier' format. I also wanted to give the suppliers something they could modify themselves. And lastly, I wanted to learn something new. I'm not very comfortable in Windows, so almost anything except bat files would be a learning experience. After studying the alternatives I decided to use Powershell scripting, something completely new to me. After browsing an introductory tutorial and some Googling I started to build my CSV file parser. The supplier has to download the SNC CSV file, then run my parser script and then upload the transformed file into their ERP system. This parser is not beta tested yet, but when beta testing is done I will write a post about the parser.
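To give an idea of what the parser has to cope with, here is a rough sketch of one way to read a file like the POCONF sample shown in the 2014-11-23 post further down; the skip counts and the file name are assumptions tied to that particular sample, not my finished parser:

$raw  = Get-Content (Join-Path $PWD 'POCONF.csv')
$rows = $raw |
    Select-Object -Skip 11 |              # drop the metadata header and the empty lines
    ConvertFrom-Csv |                     # the technical column names become property names
    Select-Object -Skip 1                 # drop the human-readable label line
$rows | Where-Object { $_.ORDER_ID } |    # ignore trailing empty records
    ForEach-Object { '{0}/{1}: {2} {3}' -f $_.ORDER_ID, $_.ITEM_ID, $_.QUANTITY, $_.QUANTITY_UNIT }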

I think this SNC up- and download integration is badly designed by SAP. If I had designed this interface, it would at least provide you with three options: CSV, XML/IDOC or JSON file formats. What I would really like to see is an easy-to-use, high-level toolbox in SAP for application integration. I think SAP customers deserve something better than the present CSV files, and I believe SAP can afford to develop such an interface. I might have missed something; is there something already available I'm not aware of?

2014-11-23

Bad Design


Part 1.

This summer my dishwasher broke down. It was a Miele from the nineties, a very good dishwasher I was happy with. A bit noisy, but it washed my dishes all right. But this summer the water pump broke. Instead of replacing the water pump I decided to buy a modern dishwasher. I went for a Bosch machine and installed it some weeks ago. It does a good job, the dishes are clean, but still I'm a bit disappointed: the new machine is a tiny bit less noisy but it takes longer to do the dishes, and I do not like that. And it doesn't dry the dishes as well as the old one, I do not like that either. But what really bugs me is the cutlery basket of the Bosch dishwasher:
In front the old lighter Miele cutlery basket and behind the darker Bosch basket.

As you can see, the handle of the Miele basket is strengthened by two beams for better stability. The Bosch handle is reinforced by beams on the inside of the handle, creating perfect pockets that trap condensing steam, making sure you always get wet and water drips down on the cutlery when you grab the handle.
The Bosch handle with inside water trapping beams.

This is not only an example of bad design but also of inadequate testing. If the QA guys in the Bosch laboratory had bothered to test the cutlery basket they would have spotted the problem and sent it back to the developers.

Part 2.

Last week I had a call from a colleague ‘can you help with an interface, we need to transfer purchase orders from our SAP ERP system to the ERP systems of vendors’.
The vendors didn't want to use the web portal as such, but preferred to exchange orders and acknowledgements via files. The problem was that our ERP system could only communicate via CSV files while the vendors' systems expected messages in their XML formats. If you have XML files on both sides a small XSL script is all you need, but our ERP system's vendor portal can only communicate via CSV files. The year is 2014! I would not have created such an interface.

"Interface Type:","POCONF",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Owner Partner:","XY00000025",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Partner:","X000000001",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Selection Profile Number:","00000000000001000045",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Selection Profile Name:","PD_CONFIRM_PO",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Created By:","XYZ123",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Created On:","17.11.2014 10:47:59 CET",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"ORDER_ID","ITEM_ID","REQUESTED","CONFIRMED","TO_BE_CONF","REJECT","PRODUCT_ID","REV_LEVEL_ITEM","QUANTITY","QUANTITY_UNIT","DELIVERY_DATE","DELIVERY_TIME","DELIVERY_TZONE","SHIPPING_DATE","SHIPPING_TIME","SHIPPING_TZONE","REQ_PRICE","CONF_PRICE","CURRENCY","PRICE_BASE_QUANTITY","PRICE_BASE_QUANTITY_UNIT","SALES_ORDER_REFERENCE","REQ_MPN","CONF_MPN","REQ_MFR","CONF_MFR","CUST_LOC_ID","CMP_ID","CMP_PRODUCT_ID","CMP_REV_LEVEL","CMP_REQ_DATE","CMP_QUANTITY","CMP_QUANTITY_UNIT","CMP_BATCH_ID"
"PO No.","PO Item No.","Requested","Confirmed","To Be Confirmed","To Be Rejected","Product","RevLvl","Quantity","UoM","Deliv. Date","Deliv.Time","DlvTZ","Ship. Date","Ship. Time","ShipTZ","Requested Price","Confirmed Price","Crcy","PrU.","PrUoM","Reference Document Number of Sales Order","Requested MPN","Confirmed MPN","Requested Mfr","Confirmed Mfr","Customer Loc.","Component ID","Product","RevLvl","Requirement Date","Qty","UoM","Cust. Batch"
"4700000386","20","X","","","","ABC0221300","","2.200","PC","05.01.2015","00:00:00","CET","24.12.2014","23:59:55","CET","4,85","0.000000 ","SEK","1","PC","","","","","","6100","","","","","","",""
"4700000386","20","","","X","","ABC0221300","","2.200","PC","05.01.2015","00:00:00","CET","","","CET","","","","","","","","","","","","","","","","","",""
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
A CSV file, SAP interface format.

As you see, this is some CSV file! First we have a file header, then two(!) lines of labels and then the purchase order lines. Anyone with experience of parsing CSV files understands this is not the easiest file to parse.
This is not only an example of bad design but also of inadequate testing. If the QA guys in the SAP laboratory had eaten their own dog food, they would have sent this back to the developers.

2014-11-08

PHP 5.6 Mojibake


The three little piggies have shown their ugly faces and bitten me again. The three little piggies are the Swedish national characters ÅÄÖ. The three letters with the ring and the dots are more than cute IKEA dots, as I once had them described to me; they are vowels in their own right. Å is pronounced as the vowel sound in the English word 'your', Ä as the vowel sound in 'bear' and Ö as in 'sir'. These three letters have haunted me my entire professional career, from EBCDIC on the IBM mainframes and now lastly in PHP 5.6. PHP now has a default encoding setting, and it is UTF-8 by default. This is in itself a good thing; prior to 5.6, character encoding was a mishmash of ISO-8859-1 and UTF-8. I knew of the change, but I didn't pay attention to it and didn't test it. I did not even ask someone else to test it. I should have known better. All the Ås, the Äs and the Ös were distorted when we made the switch to 5.6. In our Nantes plant they had managed to put Äs into some part numbers (material numbers in SAP English), which caused some 'disturbance' in BOM structures and some other places. How the French guys had managed to use the Swedish Ä in part/material numbers is a mystery to me; it might just have been some double encoding error, I do not know.
The remedy for this mistranslation is simple: just set default_charset='ISO-8859-1'. But since I had not tested that, I set the old ICONV(ISO-8859-1, ISO-8859-1, ISO-8859-1) settings instead, although they are deprecated in PHP 5.6. I had tested ICONV before and was pretty sure it would work. The right solution is of course to figure out what the problems with UTF-8 are and fix them. Since encoding is now addressed in a clear way in PHP, it should be simpler to attack the problem; simple it will never be, but simpler than before version 5.6.

2014-11-05

The Data Warehouse running on PHP 5.6.2



Stockholm, November 2014 - 59.330340, 18.129661

Finally the Data Warehouse is running on PHP 5.6.2. We have been testing PHP 5.6 since alpha 1, and the intention was to go to production on PHP 5.6.0, but gathering all the necessary people for testing and the transition from 5.5.9 just took much longer than anticipated.
Next we will upgrade the Data Warehouse's Integration Workflow Engine. After that we will upgrade MySQL from version 5.5 to something newer. But for now we are pleased if we do not hit any PHP 5.6.2 nasties.

2014-11-02

Real time reporting


About five years ago I did a PoC on how to do real time reporting/analysis of SAP COPA data. The principles and methods behind the PoC are described in this blog post. The business did not pursue my PoC but followed another path to COPA real time reporting.
COPA, or more specifically the cost-based actual line items in table CE1, is the basis for the financial reporting in the company, and the bean counters need fresh figures around period book closing. If you have ever paid attention to the financial department in your company, you cannot have missed that all activities are targeted at praising the holy period closing. A financial period can be divided into the preparation for the period close and the holy period close per se. I suspect financial managers compete over who can deliver the most accurate period close the fastest. Respectable and trustworthy financial people have told me I'm wrong; they claim the market demands swift closings, but they do not compete. Anyway, the midnight lamp is burning in the finance department during period close, and the last thing these guys want is to sit and wait for the figures after the last correction.
I invented user triggered data loading to speed up the finance reporting. The rest of the business was pretty uninterested in real time reporting. I have tried to promote real time with no success; fresh figures in the morning, stable throughout the day, is what the business wants. (You can see a night's activities in the Data Warehouse in this exciting movie.)

Giving the users the option to load the Data Warehouse themselves is a brilliant idea; this option is not something you find in every Business Intelligence system. When I introduced user triggered loading I transferred the commands from MS Excel to the Data Warehouse via an SSH link. This solution is not only complex, it also needs installation of extra software on the client. Some time ago I developed a Node.js web server which executes Data Warehouse jobs as a web service. Apart from the web server itself, this is much simpler than the SSH connector. When I developed this web service there was no interest in it. But now there is new interest in real time reporting in the Data Warehouse. A BI developer has created a much better and simpler MS Excel interface using the web service instead of my old complex SSH link. I am excited by this newly awakened interest in real time reporting, since I have a feeling the Data Warehouse can excel at this. I still have not seen this new MS Excel interface, but I hope it will prove useful and that I will have reason to write more about real time reporting.