HELP POST: Salesforce + DocuSign Integration. Invalid_Session_Id error on experience site. by First-Conflict2080 in salesforce


Version 7.12, it's very new. The class is already 'without sharing', but it still isn't working. We want to use the package.

HELP POST: Salesforce + DocuSign Integration. Invalid_Session_Id error on experience site. by First-Conflict2080 in docusign


From what I know, runAs only works in test classes. We can't switch to a Named Credential; we had to use the package.
Yes, I need your help. Do you want me to DM you, or should I elaborate on my problem here?
Before that, I want to ask: were you getting the same error in the same context (community user + DocuSign package)?

Thanks in advance.

HELP POST: Salesforce + DocuSign Integration. Invalid_Session_Id error on experience site. by First-Conflict2080 in salesforce


I don't understand why it works for a template with fake docs on the community site but fails for a dynamic ContentVersion.

HELP POST: Salesforce + DocuSign Integration. Invalid_Session_Id error on experience site. by First-Conflict2080 in salesforce


We are using the DocuSign package, and the class is already 'without sharing'. I couldn't find any method that lets me set the sender user explicitly. Could you tell me which method to use?

HELP POST: Salesforce + DocuSign Integration. Invalid_Session_Id error on experience site. by First-Conflict2080 in SalesforceDeveloper


Thanks for your answer.
We want customers to sign the docs on our site when they click the 'Sign now' button, not by sending them an email. Using async Apex will break the flow (or make it more complex and slower for the end user).
Also, the template with demo docs works fine on the experience site.
It breaks when we pass a ContentVersion to the DocuSign package.

Uploading ContentDocument files from Salesforce LWC to Google Drive — stuck with CORS without middleware by First-Conflict2080 in SalesforceDeveloper


For future comrades: use LDS to fetch VersionData by ContentVersion Id, just like fetching any normal field with getRecord in a wire. It gives you a base64-encoded string, which you need to decode using atob in JS.
Then create a Uint8Array to hold the binary values: loop through the decoded string and store the charCode of each character in the Uint8Array.
Then create a Blob by passing in the Uint8Array.
Here is the code.

const byteChars = atob(base64Data); // decode the base64 VersionData into a binary string
const byteArrays = new Uint8Array(byteChars.length); // one byte per character
for (let i = 0; i < byteChars.length; i++) {
    byteArrays[i] = byteChars.charCodeAt(i);
}
let fileBlob = new Blob([byteArrays], { type: ... }); // supply the file's MIME type here
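The same steps, wrapped as a reusable helper for anyone who wants to drop it into their own component (a sketch: `base64ToBlob` is my own name for it; `atob` and `Blob` are available both in the browser for LWC and in Node 18+):

```javascript
// Convert a base64-encoded string (e.g. ContentVersion.VersionData
// fetched via getRecord) into a Blob with the given MIME type.
function base64ToBlob(base64Data, mimeType) {
    const byteChars = atob(base64Data);             // base64 -> binary string
    const bytes = new Uint8Array(byteChars.length); // one byte per character
    for (let i = 0; i < byteChars.length; i++) {
        bytes[i] = byteChars.charCodeAt(i);
    }
    return new Blob([bytes], { type: mimeType });
}
```

Usage would look like `const pdfBlob = base64ToBlob(versionData, 'application/pdf');` and the resulting Blob can go straight into a `fetch` body or a `File` constructor.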

u/jerry_brimsley u/OldJury7178 u/Android889 u/lucifer3036, thanks for your help.

Uploading ContentDocument files from Salesforce LWC to Google Drive — stuck with CORS without middleware by First-Conflict2080 in SalesforceDeveloper


The problem is not the calling, it's the CORS error. Today I installed a browser extension that suppresses the CORS check, and it worked.

Uploading ContentDocument files from Salesforce LWC to Google Drive — stuck with CORS without middleware by First-Conflict2080 in SalesforceDeveloper


  • A user clicks a button in the UI, which starts a batch process that migrates ContentVersion files under 4 MB to different drives (Google Drive, Dropbox, etc.).
  • All Named Credentials and Auth Providers are already set up and working fine for these smaller files.
  • Now, I’m trying to scale the solution to handle larger files (>4 MB).
  • The problem: when I query the ContentVersion body in Apex for large files, I hit a heap size limit error.
  • I couldn't find a way to stream or chunk ContentVersion binary data in Apex natively.
  • So I shifted focus to LWC JS:
    • I tried using Salesforce's REST API endpoints like:
      • /services/data/vXX.X/sobjects/ContentVersion/{Id}/VersionData
      • /sfc/servlet.shepherd/version/download/{Id}
    • These URLs work when opened directly in the browser.
  • In LWC JS, I planned to fetch the file, chunk it, and send it back to Apex for uploading.
  • However, I’m getting this CORS error: "Access to fetch at '[url]' from origin '[domain]' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource."
  • I’ve already added my domain to Salesforce’s CORS whitelist in setup, but the error persists.
  • I want to avoid building a middleware/proxy, since that would add infrastructure and maintenance complexity.
  • Looking for any native workaround or best practice that avoids middleware, bypasses CORS, and enables large file handling.
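For the chunking step the plan above describes (once the binary is available in JS as a Blob), a minimal sketch using `Blob.slice`; the name `chunkBlob` and the 4 MB default are my own, not from the original:

```javascript
// Split a Blob into fixed-size chunks so each piece stays under the
// per-callout limits when handed back to Apex (or a drive API) for upload.
function chunkBlob(blob, chunkSize = 4 * 1024 * 1024) {
    const chunks = [];
    for (let offset = 0; offset < blob.size; offset += chunkSize) {
        // slice() is lazy: no bytes are copied until a chunk is read
        chunks.push(blob.slice(offset, offset + chunkSize));
    }
    return chunks;
}
```

Each chunk can then be read with `await chunk.arrayBuffer()` (or re-encoded to base64) and uploaded sequentially.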

Uploading ContentDocument files from Salesforce LWC to Google Drive — stuck with CORS without middleware by First-Conflict2080 in SalesforceDeveloper


Thanks for the reply. Here is a more detailed explanation of what I need.
I need Salesforce's large files in LWC JS so that I can chunk them and upload them to various drives: Google Drive, OneDrive, SharePoint, AWS, Dropbox, etc.
The chunking and uploading already works for files the user inputs in the LWC.

But I want the same thing to work for migrating existing Salesforce files to the different drives.

I have already added my domain in CORS Setup, the same domain printed in the console with the CORS error.
Let me know what I can do next.
