2012-10-21

Automatic replication of MySQL databases with Rsync

In some posts I have written about replicating Business Intelligence information to Local Area Network satellites.
Instead of using normal database backup procedures that guarantee the integrity of the database, I use rsync and file copy the database from the source database server over to the target database server. I can do this since I know no updates are done to the database while replicating, and I use the MySQL MyISAM storage engine. My rsync procedure is very simple, fast and self-healing, but
Do not try this at home
The real reason why I replicate this way is that I like to experiment and try new things, and I have not seen anyone replicate databases like this before.

The Setup

This is how I have set it up. From the controlling ETL server I issue commands via ssh to the source and target systems:
     Source system          Target system
1    Flush tables
2                           Stop MySQL
3    Run rsync_repl.sh
4                           Start MySQL
I use ssh from the ETL server (where my Job scheduler runs) and issue the commands from a job.
First I need to set up SSH (control server):
ssh-keygen
ssh-copy-id -i ~/.ssh/id_rsa.pub userid@targetDBserver
and then sudo (Target server/BI Satellite):
visudo  (add)
User_Alias MYSQLADM = userid
MYSQLADM ALL = NOPASSWD: /usr/sbin/service
and then test it:
ssh -t userid@targetDBserver sudo service mysql status
from the control server. You should receive the mysql status from the target database server without any prompt for a password.
On the source database server I did almost the same thing. First SSH (control server):
ssh-copy-id -i ~/.ssh/id_rsa.pub userid@sourceDBserver
and then sudo (Source server/BI Master):
User_Alias MYSQL_REPL = userid
MYSQL_REPL ALL=(ALL) NOPASSWD:/path2/rsync_repl.sh *
rsync_repl.sh is a bash script (appended below) that rsyncs MySQL databases to the target database server. Now I have everything in place and I can system test from my control server:
ssh -t userid@targetDBserver sudo service mysql stop
ssh -t userid@sourceDBserver sudo /path2/rsync_repl.sh
ssh -t userid@targetDBserver sudo service mysql start

The automation

With everything in place and tested, I only have to create a job and schedule it.

<?xml version='1.0' encoding='UTF-8' standalone='yes'?>
<job name='replicateDB' type='sql'>
<!-- This job replicates databases from Source Host to Target Host -->
<!-- Note of Warning! This is not according to any safe procedure. Do not try this at home! -->
 
  <tag><name>TargetHost</name><value>TIPaddr</value></tag>
  <tag><name>TargetUser</name><value>Tuserid</value></tag>
  <tag><name>SourceHost</name><value>SIPaddr</value></tag>
  <tag><name>SourceUser</name><value>Suserid</value></tag>
 
  <sql>FLUSH TABLES</sql>
  <exit>
    <!--Action 1:  stop mysql in target server -->
    <!--Action 2:  replicate from Source to Target -->
    <!--Action 3:  start mysql in target server -->
    <action wait='yes' cmd='ssh' parm='-t @TargetUser@@TargetHost sudo service mysql stop'/>
    <action wait='yes' cmd='ssh' parm='-t @SourceUser@@SourceHost sudo /path2/rsync_repl.sh'/>
    <action wait='yes' cmd='ssh' parm='-t @TargetUser@@TargetHost sudo service mysql start'/>
  </exit>
</job>
As a safety measure this job first flushes the MySQL tables to disk and then runs the exit actions. And that’s it.
If, God forbid, the replicated database is trashed, I just have to run the job again. You can guarantee the integrity of the database by running the job repeatedly until no data is replicated; basically, if this replication is faster than the ‘update rate’, your database will be fine.
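How do you know that a pass replicated nothing? One way, sketched below with placeholder paths and a placeholder target address (not my actual setup), is to count the lines of rsync’s itemized output; a pass that copies nothing prints nothing:

# Sketch: repeat the copy until rsync itemizes no changes at all.
while true; do
  changes=$(rsync -ai --delete /var/lib/mysql/mydb/ userid@targetDBserver:/var/lib/mysql/mydb/ | wc -l)
  [ "$changes" -eq 0 ] && break
done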
I conclude this series of posts with the rsync_repl.sh script. The comments say it all: ‘hopefully the replicated database is OK’, no guarantees!
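In outline, rsync_repl.sh is little more than a loop around rsync; a minimal sketch (the data directory, database list and target address below are placeholders, not my real setup) looks like this:

#!/bin/bash
# rsync_repl.sh - file copy the MyISAM databases over to the target server.
# Hopefully the replicated database is OK - no guarantees!
# Placeholders: adjust DATADIR, DATABASES and TARGET to your own setup.
DATADIR=/var/lib/mysql
TARGET=userid@targetDBserver
DATABASES="db1 db2"

for db in $DATABASES; do
  # -a preserves permissions and timestamps, -v is verbose, --delete removes
  # files on the target that no longer exist on the source
  rsync -av --delete "$DATADIR/$db/" "$TARGET:$DATADIR/$db/"
done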
p.s.
You can make this procedure safe by taking a table lock before a second, final replication pass, but in my case it is not necessary.
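For completeness, a rough, untested sketch of that idea: take a global read lock in a mysql session and run one extra replication pass from within that same session, so the lock is held while the final copy runs (this assumes the mysql client can reach the local server with your usual credentials and that its system command may run the script):

# Sketch: take a global read lock, run a final rsync pass while the lock is
# held by this mysql session, then release the lock.
mysql <<'EOF'
FLUSH TABLES WITH READ LOCK;
system sudo /path2/rsync_repl.sh
UNLOCK TABLES;
EOF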
 

2012-10-12

Having PHP FUN(ctions) with SAP shop calendar

The other day I needed to calculate the number of workdays in a 60-day period for our Tierp Factory.
Since I had to do this in our BI system I needed the shop calendar for Tierp from SAP. I started to look around for some procedure to extract the shop/factory calendar from SAP. To my surprise I didn’t find anything. The closest I found was the BAPI DATE_CONVERT_TO_FACTORYDATE. This piece of code takes a date and gives the shop calendar equivalent, i.e. it tells you if the date is a work day or not, and the closest work day if it is a holiday. By exposing a range of dates to DATE_CONVERT_TO_FACTORYDATE you can create a shop calendar. I also found some interesting tables in SAP where I could extract the information I needed, but I decided to have some fun with my PHP job scheduler and call DATE_CONVERT_TO_FACTORYDATE repeatedly for a long date span, and assemble the shop calendar from the result. I needed to feed DATE_CONVERT_TO_FACTORYDATE with the id of the Tierp factory calendar, a range of dates, and a flag telling DATE_CONVERT_TO_FACTORYDATE to look forward or backward for the closest work day.
The id is stored in the SAP TFACD table, the date span could easily be created by a function, and I wanted a backward search. Then I just had to call SAP once for each date in the span. For this I could use a job iterator, which would do the job but with a nasty side effect: since the job calling SAP would repeat itself, it would log on and log off from SAP once for each date. Instead I pushed the iterator down into the SAP communicator sap2.php, which logs on once, iterates through the dates and appends the results. After this it’s just a simple matter of loading the result into our BI system’s MySQL database. OK, here we go:
For those of you who have followed my posts about my PHP job scheduler and the SAP communication examples, the interesting pieces here are the PHP functions that generate the start of the calendar in <tag name='CALSTART'...> and the use of the iterator <mydriver>, which creates a date span that the script sap2.php iterates through, calling DATE_CONVERT_TO_FACTORYDATE once for each date. The second job in this schedule, createCalendar, just loads the result into MySQL.
Here is an example from the created shop calendar: the date 2012-07-08 is a holiday and the closest work day is 2012-07-06.
If you take the time to study the first job, getCalendar, you will notice there is actually quite a lot of FUNctionality packed in there.
My first use of this shop calendar is an advanced formatted Excel sheet, which I will create and email in subsequent jobs. That will be fun too.

2012-10-07

Language confusion

Yesterday I was setting up a job using the SAP BAPI function BAPISDORDER_GETDETAILEDLIST. As always I googled the subject before I started, and while googling I found a conversation like this:

Question - How should I specify the parameter I_MEMORY_READ?

Reply       - It doesn’t harm to read the documentation in the program.

The documentation in the program reads:

SPACE = zuerst Pufferzugriff, dann DB-Zugriff, 'A' = nur Datenbankzugriff, 'B' = nur Pufferzugriff

 

If your only language is English, you may have problems understanding this. If your only knowledge of an Indo-European language is bad English, the above explanation may be incomprehensible. But there is help; feeding the phrases into Google translate gives:

  1. ‘zuerst Pufferzugriff, dann DB-Zugriff’  -> ‘first buffer access, then DB access’
  2. ‘nur Datenbankzugriff’                   -> ‘only database access’
  3. ‘nur Pufferzugriff’                      -> ‘only buffer access’

Now this should be understandable for most people with a basic understanding of English. But it is a hell of an effort to find the relevant phrases in a text you do not understand and then feed them into Google translate. On the positive side you are not limited to English: if I feed the first phrase ‘zuerst Pufferzugriff, dann DB-Zugriff’ into Google translate I can choose my native Swedish, which gives ‘första bufferten åtkomst, då DB åtkomst’. Well, that was maybe bad luck; it is gibberish, it may look like Swedish but it doesn’t mean anything. Even if you have access to Google translate you may need more helpful advice than RTFM. If you do not have better advice you should maybe not answer at all, or at least not give that type of comment; it does no one any good.

These days, when English is the lingua franca of the world, you see a lot of strange English; this post is a good example of that. And I have said many things in ‘English’ that have amused others. E.g. my tongue easily slips on the vowel in ‘cook’, making it mean something else.

Even if you know English well as a second language, you will translate into English differently depending on your native tongue, cultural background and nationality. From working with MRP systems I’m well acquainted with the phrase ‘Part number was not found’. A Swede would probably say ‘Article number missing’ while a German might put it as ‘Material number was not possible’. I was once talking with an American and an Englishman; the English guy was telling us about a car accident he had with a lorry. I could tell from the expression on the American guy that he didn’t quite understand, so I jumped in and said ‘he means a truck’. ‘Uhha’ said the American.

I do not know if this is a true story or not: when the Swedish company Electrolux started to market their vacuum cleaners in the US they used the slogan ‘Nothing sucks like Electrolux.’ I also know of a Swedish company that tried to sell a range of environmentally friendly beauty products and perfumes in the US using the brand name ‘Nature calling’.

2012-10-03

Crawl before you walk, my first lines of node.js code.

Last Saturday I finally had some time for my node.js project. After some clumsy attempts to write a web app with the flatiron framework, I realized I knew too little about web development to successfully create something usable in a reasonable timeframe without help. I found the book ‘The Node Beginner Book’ by Manuel Kiessling and it helped me through the process of creating a simple web service. Input is a URL query and the output is a JSON object.
The input query requests products with a SQL ‘like’ predicate and the output JSON object is a list of products. A very primitive web app and very primitive code. But it gave me the opportunity to study JavaScript and get to grips with how and why functions are passed around. This was a bit confusing at first, but I think I got the hang of it. Now I think it would have been a serious mistake to start with a (Flatiron) framework. If you know Node.js and JavaScript you can probably appreciate a framework better. Crawl before you walk.