
# FileCatalyst Workload Automation (updated May 2026)

```python
# Fragment: handles the result of a single transfer inside the script's per-file loop
success = run_fta(f, "/incoming/", "fc-server.company.com", "auto", "secret")
if success:
    logging.info(f"Success: {f}")
    # Post-processing: log the transfer to a database
    subprocess.run(["psql", "-c",
                    f"INSERT INTO transfers VALUES('{f}', '{original_hash}')"])
else:
    logging.error(f"Failed: {f}")
    time.sleep(30)  # Back off before retrying

if __name__ == "__main__":
    main()
```

## Summary Table: Choosing an Automation Method

| Requirement | Recommended Method |
|-------------|--------------------|
| Simple directory watching | Hotfolder |
| Scripted, scheduled transfers | CLI + cron/systemd timer |
| Complex workflow with multiple steps | CLI + Bash/Python logic |
| Integration with Airflow/Jenkins | REST API or BashOperator |
| Central management of many transfers | REST API + custom dashboard |

To run transfers on a fixed schedule, pair the FileCatalyst CLI with an OS scheduler such as cron or a systemd timer.
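As a minimal sketch (the schedule, binary path, file path, and log location are placeholders; the `fta-cli` flags mirror the retry example later in this article), a crontab entry for a nightly push might look like this:

```cron
# m h  dom mon dow   command
0 2 * * * /usr/local/bin/fta-cli --put /data/outgoing/critical_file.dat --target /incoming/ >> /var/log/fc_cron.log 2>&1
```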

```properties
hotfolder.watch.dir=/opt/fc/watch
hotfolder.target.server=192.168.1.100
hotfolder.target.port=11001
hotfolder.target.user=hotfold_user
hotfolder.target.password=encrypted_pass
hotfolder.target.directory=/uploads
# Delete the local copy after a successful transfer
hotfolder.post.delete=true
# On-the-fly compression
hotfolder.compress=true
```

Example: a monitoring system drops log files into the watch directory every hour, and FileCatalyst transfers them to a central archive.

### Method C: REST API – Best for Centralized Workload Orchestration

The FileCatalyst Server exposes a REST API (port 8080 by default) for managing transfers, users, and monitoring.

Since FileCatalyst itself is primarily a high-speed file transfer solution (using UDP acceleration), it does not have a native "Workload Automation" engine built into its core. Instead, automation is achieved through its command-line interface (CLI), REST API, and Hotfolders.

```python
import requests
import time

API_BASE = "http://fc-server:8080/api"
API_KEY = "your-api-key"

def run_transfer(local_path, remote_path):
    payload = {
        "source": local_path,
        "destination": remote_path,
        "server": "destination-host",
        "username": "transfer_user",
        "password": "secret",
    }
    # The endpoint path, auth header, and JSON response below are assumptions;
    # consult your FileCatalyst Server REST API documentation for the exact routes.
    response = requests.post(f"{API_BASE}/transfers", json=payload,
                             headers={"Authorization": f"Bearer {API_KEY}"},
                             timeout=30)
    response.raise_for_status()
    return response.json()
```
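A minimal usage sketch, assuming the `run_transfer` client above (and its `requests`/`time` imports); the retry count and backoff interval are arbitrary choices, not FileCatalyst defaults:

```python
def run_with_retries(local_path, remote_path, attempts=3, backoff_seconds=10):
    """Call run_transfer(), retrying on network/HTTP errors with a fixed backoff."""
    for attempt in range(1, attempts + 1):
        try:
            return run_transfer(local_path, remote_path)
        except requests.RequestException as exc:
            print(f"Attempt {attempt}/{attempts} failed: {exc}")
            if attempt < attempts:
                time.sleep(backoff_seconds)
    raise RuntimeError(f"Transfer of {local_path} failed after {attempts} attempts")

if __name__ == "__main__":
    result = run_with_retries("/data/outgoing/report.csv", "/incoming/report.csv")
    print("Server response:", result)
```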

```bash
# Retry up to 3 times
RETRIES=3
for i in $(seq 1 $RETRIES); do
    fta-cli --put critical_file.dat --target /incoming/ && break || sleep 10
done
```

| Tool | Integration Method |
|------|--------------------|
| Apache Airflow | Use BashOperator with fta-cli, or SimpleHttpOperator for the REST API (see the DAG sketch below) |
| Jenkins | Execute shell script step calling fta-cli |
| Rundeck | Create a job step: "Command" → fta-cli ... |
| Control-M | FileCatalyst provides a Control-M plugin (File Transfer Hub) |
| Apache NiFi | Use the ExecuteProcess processor to call fta-cli |
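To illustrate the Apache Airflow row, here is a minimal DAG sketch. It is a sketch only: the DAG id, schedule, and file paths are placeholder assumptions, the `fta-cli` flags mirror the retry example above, and the import path follows Airflow 2.x.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Placeholder command; reuses the fta-cli flags from the Bash retry example above.
FC_COMMAND = "fta-cli --put /data/outgoing/critical_file.dat --target /incoming/"

with DAG(
    dag_id="filecatalyst_nightly_push",   # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="0 2 * * *",                 # run nightly at 02:00
    catchup=False,
) as dag:
    push_file = BashOperator(
        task_id="push_file_via_filecatalyst",
        bash_command=FC_COMMAND,
        retries=3,                        # let Airflow handle retries instead of a shell loop
    )
```

Letting Airflow own the retries (rather than looping in the shell command) keeps each attempt visible in the task log and subject to the scheduler's alerting.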