
On Time Out Action & REST call

Hello community!

Maybe this is a very basic question, but I am asking it anyway ;-)

I tried out the OnTimeout action, but after some experiments it is unclear to me how it is expected to work.

I added one timeout action with Type: Minute, Interval (minutes): 1, Repetition count: Infinite, and one action below it which basically changes the value of a single field.

My expectation was that this field change would happen on every workflow element every minute, but it didn't.

What did I do wrong?

I know this is a dangerous scenario, but it is for testing purposes only.

What we want to achieve is an 'Invoke REST service call' which updates some fields in the current element and saves the current instance only if any of the fields have changed.

Is that even possible?

Many thanks in advance & best regards, Nik


Hi Nik,

The OnTimeout is created when a workflow enters a step, with the configuration that is current at that moment. If you change the interval, this will only apply to workflows which enter the step afterwards.
If you want to repeat something for all workflows, you need to create a cycling action. I think it's on the tab where you can set global menu buttons.

While you can check the OnTimeout actions in the administrative view of a workflow instance, you need to use Designer Studio for the cycling actions.

Back to your original goal.
As you mentioned, this can be dangerous because numerous versions may be created.
I would choose a different approach. I'm assuming that you started/created some external content from the workflow and are now waiting for a state change of that external content. Maybe you placed a purchase order in an ERP system and are waiting for it to be closed. Because you don't have access to the ERP system, it can't tell your workflow that it's finished, so you need to poll for the state.
In this case I would create a separate monitoring workflow. A new instance will be created via a cycling action.
1. Step: Gather all waiting workflows in an item list
2. Step: Query the ERP system for the state of those workflows
3. Step: Compare the returned values to the workflow values and set an action: Nothing to do / Move forward (see the sketch below)
4. Step: Move all workflows with the appropriate action.
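
Roughly, steps 2 and 3 could look like this in C#. This is only a sketch: WaitingRow, queryErpState and the state/action labels are placeholders, and in BPS the rows would live in the monitoring workflow's item list.

```csharp
using System;
using System.Collections.Generic;

public class WaitingRow
{
    public int WorkflowId { get; set; }
    public string OrderNo { get; set; }
    public string KnownState { get; set; }
    public string Action { get; set; } = "Undefined";
}

public static class Monitoring
{
    // Tags every waiting row with the action the final step should execute.
    public static void TagRows(IEnumerable<WaitingRow> rows, Func<string, string> queryErpState)
    {
        foreach (var row in rows)
        {
            var erpState = queryErpState(row.OrderNo);   // step 2: ask the ERP
            row.Action = string.Equals(erpState, row.KnownState, StringComparison.OrdinalIgnoreCase)
                ? "Nothing to do"                        // unchanged, leave the workflow alone
                : "Move forward";                        // step 4 moves these workflows on
        }
    }
}
```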

This has a few benefits:
- You have a detailed overview/logging of why something didn't happen.
- You can test it without waiting for the external system.
- You don't create multiple versions of the workflow instances.
- You can delete these monitoring workflow instances via a cycling action after some days.

Best regards,
Daniel

In reply to: Daniel Krüger (Cosmo Consult)

Hi Daniel!

Thanks for your answer.

I wasn't aware that the timeout is only created when the workflow instance enters the step where it is defined, but thanks for the clarification on that.

I built a custom action which compares some fields (text fields, user fields) with the external data, and I only save the changes when the data has actually changed. The property args.HasErrors is very useful for that ;-)

So we are able to do a periodic pull of external data without creating any new versions when nothing has changed.
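
In essence, the check boils down to something like the sketch below. It is simplified: GetChangedFields is a made-up helper name, and the real field access goes through the SDK's document context.

```csharp
using System;
using System.Collections.Generic;

public static class ChangeDetector
{
    // Returns only the fields whose external value differs from the current
    // one, so the action can skip the save (and the new version) entirely
    // when the result is empty.
    public static IDictionary<string, string> GetChangedFields(
        IReadOnlyDictionary<string, string> currentValues,
        IReadOnlyDictionary<string, string> externalValues)
    {
        var changed = new Dictionary<string, string>();
        foreach (var pair in externalValues)
        {
            currentValues.TryGetValue(pair.Key, out var current);
            if (!string.Equals(current, pair.Value, StringComparison.Ordinal))
                changed[pair.Key] = pair.Value;   // remember the new value to write
        }
        return changed;
    }
}
```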

As for the external data: at the moment we have a well-defined JSON structure and some mock data for testing. In the future it is planned to call some external REST services to get the data, but these are not available yet.

At the moment I have no idea how to get the JSON file. Maybe it should be added periodically as an attachment to another workflow (via a REST call from the external system)?

The next thing is how to read this attachment in my custom action. How can this be done?

I also don't want to pull the whole content each time (this happens on every workflow element), parse the whole structure, and filter out the relevant data just to update the current instance (we are talking about several thousand instances).

Maybe you have some thoughts on that which could push me in the right direction?

Thanks a lot in advance & best regards, Nik

In reply to: Nikolaus Schusser

Hi Nik,

After you "confirmed" my assumption, I would still recommend the approach of creating a "monitoring" workflow:

1. Step: Get the JSON as an attachment
2. Step: Parse the relevant data into an item list (see the sketch below)
3. Step: Update the item list with the corresponding workflow IDs and set an Action column to "Undefined"
4. Step: Set the Action column to a state: Do nothing / Fields changed
5. Step: Update all workflows whose rows have the value "Fields changed"
6. Step: Monitoring completed

A new instance would be created every day or so via a cycling action.
If there are thousands of rows, I wouldn't render the item list in the UI though. :)
Of course you can combine multiple steps, but this will reduce the "debugging" options. The workflow moves on automatically using timeouts with a path, so you can also trigger the transitions on demand.
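
For step 2, here's a minimal parsing sketch, assuming the attachment contains a JSON array like [{"workflowId": 4711, "status": "Closed"}, ...]. ErpRow and the property names are placeholders for your actual structure.

```csharp
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;

public class ErpRow
{
    [JsonPropertyName("workflowId")] public int WorkflowId { get; set; }
    [JsonPropertyName("status")] public string Status { get; set; }
}

public static class AttachmentParser
{
    // content is the raw byte[] of the attachment file.
    public static List<ErpRow> Parse(byte[] content) =>
        JsonSerializer.Deserialize<List<ErpRow>>(content) ?? new List<ErpRow>();
}
```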

Some other ideas:
- Put the mock file on the dev website / create a new site, so you can already download it via an anonymous GET (see the sketch below)
- Attachment to Item List example:
https://github.com/cosmoconsult/webconbps/blob/main/SDK_Actions/CC_XmlToItemList/CC_XmlToItemList/XMLtoItemList.cs
- You can also retrieve the JSON and put it in a field using a custom Business Rule. Here's a similar example:
https://github.com/Daniel-Krueger/webcon_reportSubscriptions/blob/main/ReportSubscriptions/ReportSubscriptions/GetReportAsHtmlTable.cs
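
For the anonymous GET, a minimal sketch (the URL is whatever you publish the mock file under; plain HttpClient is assumed to be available in the action or rule):

```csharp
using System.Net.Http;
using System.Threading.Tasks;

public static class MockFileClient
{
    private static readonly HttpClient Http = new HttpClient();

    // Downloads the mock JSON; the caller can then parse it or write it to a field.
    public static async Task<string> DownloadJsonAsync(string url)
    {
        using var response = await Http.GetAsync(url);
        response.EnsureSuccessStatusCode();   // throw on 4xx/5xx
        return await response.Content.ReadAsStringAsync();
    }
}
```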


Best regards,
Daniel

In reply to: Daniel Krüger (Cosmo Consult)

Some additional information regarding timeouts, copied from my latest blog post:

What you need to know, though, is that the timeout actions are also created during path transitions. If you use a save path, the old timeout actions are removed and new ones are created. This does not happen when you use the save button.

If you wonder whether this is a problem, the answer is: it depends. Example: you want to escalate an outstanding task to a supervisor after five days. If the user only uses the save button, the escalation will be sent _five_ days after the workflow entered the step. If he uses a save path on the third day and again after an additional four days, the escalation will be triggered five days after the last save path was used. Instead of being sent after five days, it will be sent _twelve_ days after the workflow first entered the step.

Remark: If you are using a field as a start date and need to calculate its value, you have to do this OnPath or OnExit. You can't use OnEntry for this: the timeouts are created either in parallel with or before OnEntry is executed, so the field will either have no value or an old value. The first case (no value) caused me some problems because I wasn't aware of the limitation. The answer in a support ticket from 2020: this is by design and won't be changed.