Using TAC (Big Data Edition), I have a job that needs to be deployed to many Job Servers in a cluster. The first few generate, deploy and run just fine, but then a job gets stuck in the Generating step and never completes. Even after stopping and restarting TAC (on Linux) and the Job Servers, it's still stuck.
In the CommandLine view of TAC, the step that is in the running phase is ExportJobCommand.
Even if I kill this job and create a duplicate, the duplicate hangs as well.
I had a similar problem about a year ago. The jobs had many context variables, and their values exceeded the length of the corresponding columns in the H2 database backing TAC.
I ran these alter table statements:
alter table "executiontaskjobprm" alter column "label" VARCHAR(510);
alter table "executiontaskjobprm" alter column "originalvalue" CLOB(4147483647);
alter table "executiontaskjobprm" alter column "defaultvalue" CLOB(4147483647);
alter table "taskexecutionhistory" alter column "contextvalues" CLOB(4147483647);
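In case it helps, one way to run statements like these is with the H2 Shell tool that ships inside the h2 jar. The jar path, JDBC URL, user and password below are placeholders for my install, not values you can copy verbatim — adjust them to yours, and stop TAC first so the embedded database file isn't locked:

```shell
# Stop TAC before connecting; the embedded H2 file allows only one writer.
# All paths and credentials here are placeholders for my environment.
java -cp /opt/Talend/tac/lib/h2-*.jar org.h2.tools.Shell \
  -url "jdbc:h2:/opt/Talend/tac/database/talend_administrator" \
  -user sa -password "" \
  -sql 'alter table "executiontaskjobprm" alter column "originalvalue" CLOB(4147483647);'
```

You can also drop the -sql option to get an interactive prompt and paste all four statements in one session.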
You might not have the same problem I had, but it certainly sounds like similar symptoms.
Thanks, Pedro. That cleared out the Command Line records from TAC and reset the job to Ready to Generate, but it still gets stuck when I try to regenerate. It hangs at the same ExportJobCommand step.