With Talend, almost anything is possible.
You could define your input parameters as context variables and calculate the values you need for processing in a tJavaRow, for example. Store the results in context variables again. To get the number of rows in the table you could use a t[DB]Input with "SELECT COUNT(*) FROM xxx".
Then connect the subjob that processes your data with OnSubjobOk.
To handle the blocks there are several ways. One would be to calculate the "commit points" dynamically (for example "100,200,300"), store them in a context variable (as a string), use a tFixedFlowInput to create a flow, and split it with tNormalize (you now have three rows). Next step a tFlowToIterate, and it is done...
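The commit-point step above can be sketched in plain Java. This is only an illustration, not Talend-generated code: the method name `buildCommitPoints` is my own, and inside a real job the splitting and iterating would be done by tNormalize and tFlowToIterate instead of the loop shown here.

```java
public class CommitPoints {
    // Build the "100,200,300" string dynamically from the total row count
    // (which the "SELECT COUNT(*)" subjob would provide) and a block size.
    static String buildCommitPoints(int totalRows, int blockSize) {
        StringBuilder sb = new StringBuilder();
        for (int p = blockSize; p - blockSize < totalRows; p += blockSize) {
            if (sb.length() > 0) sb.append(",");
            // the last point is capped at totalRows (e.g. 350 for a 350-row table)
            sb.append(Math.min(p, totalRows));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String points = buildCommitPoints(300, 100);
        System.out.println(points); // the "100,200,300" string from the post
        // tNormalize would split this into one row per value,
        // and tFlowToIterate would then iterate over each row:
        for (String p : points.split(",")) {
            System.out.println("process rows up to " + p);
        }
    }
}
```

In the actual job the string would live in a context variable and be fed through a tFixedFlowInput, as described above.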
Can I make it more complex than that, and at the same time more dynamic?
I mean, define some variables like no_rows = 100,000 and crt_range = 1,2,...,n, and have a query that counts the number of rows, then calculates the number of iterations based on no_rows and sets crt_range at each step. Would this be possible with Talend?
I'll be very happy if anyone could show me how to do that 8-)
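The arithmetic behind this question is just a ceiling division. A minimal sketch, assuming the count comes from a "SELECT COUNT(*)" query and no_rows is a context variable (the method and variable names here are illustrative, not Talend-generated):

```java
public class IterationPlan {
    // number of blocks needed to cover totalRows in chunks of blockSize
    // (ceiling division, so a partial last block still counts)
    static int iterations(int totalRows, int blockSize) {
        return (totalRows + blockSize - 1) / blockSize;
    }

    public static void main(String[] args) {
        int totalRows = 400_000; // would come from "SELECT COUNT(*) FROM xxx"
        int noRows = 100_000;    // the no_rows context variable
        int n = iterations(totalRows, noRows);
        System.out.println(n + " iterations");
        for (int crtRange = 1; crtRange <= n; crtRange++) {
            // each step reads one block; the exact paging syntax
            // (LIMIT/OFFSET, ROWNUM, key ranges) depends on the database
            int offset = (crtRange - 1) * noRows;
            System.out.println("step " + crtRange
                    + ": rows " + offset + " to " + Math.min(offset + noRows, totalRows));
        }
    }
}
```

With 400,000 rows and no_rows = 100,000 this gives the 4 iterations from the example below.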
Yes, it is. I'm pretty sure there is a thread in the forum about this issue with an example, but I didn't find it :-( Does someone else remember the thread and know how to search for it?
The idea was to define a table with information about which data should be processed (for example key ranges). Then read that data and create an iteration from the flow to execute your t[DB]Input with a dynamic SQL statement.
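A hedged sketch of what that dynamic SQL statement could look like. In a real job, tFlowToIterate puts the current row's values into globalMap and the t[DB]Input's query field concatenates them; the table name `my_table` and the keys `range.key_from`/`range.key_to` below are assumptions for illustration:

```java
import java.util.HashMap;
import java.util.Map;

public class DynamicQuery {
    // Build the SQL string the t[DB]Input would use for one iteration.
    static String buildQuery(Map<String, Object> globalMap) {
        return "SELECT * FROM my_table WHERE id > "
                + globalMap.get("range.key_from")
                + " AND id <= "
                + globalMap.get("range.key_to");
    }

    public static void main(String[] args) {
        // stand-in for Talend's globalMap, which tFlowToIterate fills per row
        Map<String, Object> globalMap = new HashMap<>();
        globalMap.put("range.key_from", 0);
        globalMap.put("range.key_to", 100000);
        System.out.println(buildQuery(globalMap));
    }
}
```

Iterating over the key-range table then runs this query once per block until the whole table is copied.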
First of all thank you all for the reply.
Is it possible to copy, for example, packages of 100,000 rows from a table until they are all copied (if the table has around 400,000 rows)? I mean 4 iterations, for example.
To answer your question we need some more information.
But I think it will end up in creating one process for each table:
t[DB]Input --(row)--> tMap [or any more complex transformation] --(row)--> t[DB]Output
I want to transform it as I wrote; the backup approach isn't a solution because the copy would be identical, and I don't want that.
If your need is as simple as you wrote, then back up and restore your database on your new host.
I have a huge database with approx. 50 tables. I want to be able to extract all the data from this database, transform it, and put it on another host.
What is the simplest way to do that?