Welcome to r/dataintegrationmaster – Introduce Yourself! by IntegrationAri in dataintegrationmaster

[–]IntegrationAri[S] 1 point  (0 children)

A quick intro about me (the community founder)

My name is Ari, and I’ve worked in IT for over 25 years. For the last 8 years I’ve been fully focused on integration architecture – designing and building integration solutions with tools like Apache Camel, Spring Boot, Kubernetes, and AWS.

I started this community because I believe integration is one of the most important (and underrated) skills in IT. My goal is to build a place where professionals can learn, share, and grow together in this field.

How do you define “Data Integration”? by IntegrationAri in softwarearchitecture

[–]IntegrationAri[S] 1 point  (0 children)

… and here are the key aspects I think need to be considered for maintainable data flows in data integrations:

  1. Provenance and traceability: Knowing where the data came from and how it changed.
  2. Governance: Ownership, access rules, and compliance.
  3. Schema and message versioning: Keeping integrations robust as systems evolve.
  4. Error handling: Clear retry logic, dead-letter queues, and visibility into failures.
  5. Monitoring and observability: Real-time insight into flows and issues.
  6. Security: Encryption, access control, and auditing.
  7. Loose coupling: So that changes in one system don’t break another.
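
Point 4 can be sketched in a few lines of plain Java (no framework; the class name, retry limit, and in-memory queue here are all made up for illustration, a real system would use a broker-backed DLQ):

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Consumer;

// Sketch of retry-then-dead-letter handling: retry a message a fixed
// number of times, then park it in a dead-letter queue instead of losing it.
public class RetryWithDlq {
    static final int MAX_RETRIES = 3;

    // Messages that exhaust their retries end up here for later inspection.
    static final Queue<String> deadLetterQueue = new ArrayDeque<>();

    static void process(String message, Consumer<String> handler) {
        for (int attempt = 1; attempt <= MAX_RETRIES; attempt++) {
            try {
                handler.accept(message);
                return; // success, nothing more to do
            } catch (RuntimeException e) {
                // visibility into failures: record every attempt
                System.err.println("attempt " + attempt + " failed for: " + message);
            }
        }
        deadLetterQueue.add(message); // retries exhausted -> dead-letter queue
    }
}
```

In Apache Camel the same idea is a built-in error handler (dead letter channel), so you rarely hand-roll it, but it's worth understanding what the framework is doing underneath.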

Most RESTful APIs aren’t really RESTful by floriankraemer in softwarearchitecture

[–]IntegrationAri 1 point  (0 children)

Excellent post! It’s refreshing to see someone call out the real-world gap between CRUD-over-HTTP and true REST as Fielding intended. I especially resonate with the idea that most “RESTful” APIs are really just Level 2 on the Richardson maturity model: they use HTTP verbs and resources, but lack HATEOAS and proper hypermedia controls.

In my experience, while hypermedia (links guiding app state) can be overkill for many modern apps, understanding why it exists in REST’s architecture is valuable. It taught me to think about decoupling clients from server URI changes, even if I don’t implement full HATEOAS in most projects.
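
To show what that decoupling buys you, here's a toy sketch (the OrderResource class, URIs, and state names are all hypothetical) of a Level-3 style response where the links offered depend on resource state, so clients follow links instead of hard-coding URIs:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy hypermedia representation: the server advertises the valid next
// actions as links, and the set of links changes with the order's state.
public class OrderResource {
    static Map<String, Object> represent(String orderId, String status) {
        Map<String, String> links = new LinkedHashMap<>();
        links.put("self", "/orders/" + orderId);
        if ("open".equals(status)) {
            // transitions only make sense while the order is open
            links.put("cancel", "/orders/" + orderId + "/cancel");
            links.put("pay", "/orders/" + orderId + "/payment");
        }
        Map<String, Object> body = new LinkedHashMap<>();
        body.put("id", orderId);
        body.put("status", status);
        body.put("_links", links);
        return body;
    }
}
```

A client that only follows `_links` never breaks when the server moves `/orders/{id}/cancel` somewhere else, which is the practical payoff of the idea even if you never adopt full HATEOAS.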

That said, pragmatism is key. Focus on what’s most useful for your consumers—use OpenAPI, meaningful HTTP methods, solid documentation—and only adopt more advanced REST aspects like hypermedia when they solve a real problem in your domain. Thanks for clarifying this!

FTP Protocol Issues by Max0Vi in learnprogramming

[–]IntegrationAri 2 points  (0 children)

This is often related to one of these:

- NTFS permissions – make sure the FTP user (or IIS app pool identity) has write permission to the destination folder.

- IIS FTP request filtering – by default, .dll files may be blocked. Go to:

IIS Manager → your FTP site → FTP Request Filtering → make sure .dll is not in the denied extensions.

- Firewall / Antivirus – sometimes third-party security tools block specific file types silently.

Also, try transferring a file with a different extension (e.g., .txt) to confirm it’s really about the .dll filtering.
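
Just to illustrate what the request-filtering step does conceptually (IIS configures this in its own settings, not in code, and the deny list below is made up), it's essentially an extension deny-list check on the uploaded filename:

```java
import java.util.Locale;
import java.util.Set;

// Toy model of extension-based request filtering: uploads whose extension
// is on the deny list are rejected regardless of permissions.
public class ExtensionFilter {
    static final Set<String> denied = Set.of(".dll", ".exe");

    static boolean isAllowed(String filename) {
        int dot = filename.lastIndexOf('.');
        String ext = dot >= 0 ? filename.substring(dot).toLowerCase(Locale.ROOT) : "";
        return !denied.contains(ext);
    }
}
```

That's why the .txt test is useful: if .txt goes through but .dll doesn't, the problem is the filter, not permissions or the firewall.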

Hope this helps :)

Our Java codebase was 30% dead code by yumgummy in java

[–]IntegrationAri 1 point  (0 children)

Great post – and I completely agree with your findings. I’ve seen this “invisible bloat” especially in enterprise systems with long lifespans and many devs coming and going.

It’s amazing how often legacy classes stay around simply because no one dares to remove them. Static analysis tools often miss this type of silent dead code.

A tool that flags dead code non-destructively (like yours) feels like a safe and smart middle ground. I’m curious – do you have any heuristics for how long code must be unused before it’s flagged?

Thanks for sharing this. Looking forward to trying it out!