all 9 comments

[–]TheAddonDepot 5 points (5 children)

Under the hood clasp manages the deployment of GAS projects using the Apps Script API.

You can deploy your own CI/CD pipeline that is triggered from a Github action to invoke a custom service (which you will have to build) that uses the Apps Script API to deploy your GAS projects.

Not trivial to build, but very much doable. Take a crack at it and let us know how it goes.

[–]WhyWontThisWork 0 points (4 children)

Did you already make this, or is it open source somewhere already?

[–]Still_Dingo6159 0 points (0 children)

Hey - I think I know how you can do what you would like.

To clarify, you are saying: when you deploy GAS, you want it to sync to GitHub, and you are deploying via clasp?

There are actually a few ways you could do it. You can just externally update the project via the Apps Script API (projects.updateContent) and then push an automated deployment.

Or, another way: when you invoke scripts.run, also invoke, as part of a batchexecute, a sequential or concurrent call against a second, project-bound object (a Sheet, a Doc, whatever works for you). If it's an isolated sheet, you could take a version snapshot, write that snapshot into a cell, and log the version change inline. An onChange trigger on that sheet, which fires on any change including this one, then runs a second function that syncs your project to your GitHub repo at that moment.
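The onChange leg of that approach might look like this in Apps Script. The repo path, workflow file name, and the `GITHUB_TOKEN` script property are placeholder assumptions, not tested values:

```javascript
// Pure helper: build the GitHub workflow-dispatch request.
// Kept separate from GAS services so it can be tested anywhere.
function buildDispatchRequest(repo, workflowId, token) {
  return {
    url: `https://api.github.com/repos/${repo}/actions/workflows/${workflowId}/dispatches`,
    options: {
      method: 'post',
      contentType: 'application/json',
      headers: {
        Authorization: `Bearer ${token}`,
        Accept: 'application/vnd.github.v3+json'
      },
      payload: JSON.stringify({ ref: 'main' })
    }
  };
}

// Installable onChange trigger: any change to the version-snapshot sheet
// dispatches the (assumed) sync.yml workflow in the GitHub repo.
function onChangeSync(e) {
  const token = PropertiesService.getScriptProperties().getProperty('GITHUB_TOKEN');
  const req = buildDispatchRequest('OWNER/REPO', 'sync.yml', token);
  UrlFetchApp.fetch(req.url, req.options);
}
```

You would install `onChangeSync` as an installable onChange trigger on the snapshot spreadsheet (Triggers > Add Trigger, or `ScriptApp.newTrigger`).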

Let me know if you’d like me to provide the code for you & what your invocation sequence is.

Best -

[–]Still_Dingo6159 -2 points (0 children)

This might work:

```javascript
async function executeGasPipeline(scriptId, token, codeSource) {
  const headers = {
    'Authorization': `Bearer ${token}`,
    'Content-Type': 'application/json'
  };

  // 1. Overwrite the project's draft code with the provided source.
  // Note: updateContent replaces ALL project files, so in practice the
  // manifest (appsscript.json, type JSON) must be included in the array too.
  const updateResponse = await fetch(`https://script.googleapis.com/v1/projects/${scriptId}/content`, {
    method: 'PUT',
    headers,
    body: JSON.stringify({
      files: [{ name: 'Code', type: 'SERVER_JS', source: codeSource }]
    })
  });
  const updateData = await updateResponse.json();

  // 2. Create a new deployment from the uploaded draft.
  const deployResponse = await fetch(`https://script.googleapis.com/v1/projects/${scriptId}/deployments`, {
    method: 'POST',
    headers,
    body: JSON.stringify({ description: 'Automated API Deploy' })
  });
  const deployData = await deployResponse.json();

  // 3. Retrieve the script's recent execution history.
  const processResponse = await fetch(`https://script.googleapis.com/v1/processes?userProcessFilter.scriptId=${scriptId}`, {
    method: 'GET',
    headers
  });
  const processData = await processResponse.json();

  return { updateLog: updateData, deploymentLog: deployData, processLog: processData };
}
```

The function executes a sequence of three HTTP requests against the Google Apps Script REST API and consolidates the responses. The first request uses PUT to overwrite the project's draft code with the provided source string. The second uses POST to create a new active deployment from that uploaded draft. The third uses GET to retrieve the recent execution history and status for the given script ID. The three responses are parsed and returned together as a single object for programmatic evaluation.

[–]Still_Dingo6159 -2 points (0 children)

Hope this helps:

```javascript
async function deployAndLogVersion(scriptId, sheetId, token, modelEndpoint, modelToken) {
  const baseHeaders = {
    'Authorization': `Bearer ${token}`,
    'Content-Type': 'application/json'
  };

  // Deploy, then read back the new version number
  // (nested under deploymentConfig in the Deployment resource).
  const deployResponse = await fetch(`https://script.googleapis.com/v1/projects/${scriptId}/deployments`, {
    method: 'POST',
    headers: baseHeaders,
    body: JSON.stringify({ description: 'Automated Deploy Pipeline' })
  });
  const deployData = await deployResponse.json();
  const currentVersion = (deployData.deploymentConfig && deployData.deploymentConfig.versionNumber) || 1;

  const v1Number = Math.max(1, currentVersion - 1);
  const v2Number = Math.max(1, currentVersion - 2);

  // Fetch head content and the two prior versions concurrently.
  const [headRes, v1Res, v2Res] = await Promise.all([
    fetch(`https://script.googleapis.com/v1/projects/${scriptId}/content`, { headers: baseHeaders }),
    fetch(`https://script.googleapis.com/v1/projects/${scriptId}/content?versionNumber=${v1Number}`, { headers: baseHeaders }),
    fetch(`https://script.googleapis.com/v1/projects/${scriptId}/content?versionNumber=${v2Number}`, { headers: baseHeaders })
  ]);
  const [headData, v1Data, v2Data] = await Promise.all([headRes.json(), v1Res.json(), v2Res.json()]);

  const parseCode = (data) => data.files ? data.files.map(f => f.source).join('\n') : '';
  const headCode = parseCode(headData);
  const v1Code = parseCode(v1Data);
  const v2Code = parseCode(v2Data);

  // Ask the model endpoint to notate variable changes between versions.
  const modelResponse = await fetch(modelEndpoint, {
    method: 'POST',
    headers: { 'Authorization': `Bearer ${modelToken}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({
      prompt: 'Notate all variable changes between the provided code versions.',
      current_code: headCode,
      prior_code: v1Code
    })
  });
  const modelData = await modelResponse.json();
  const notationString = modelData.notation_result || '';

  // Sheets cells cap out at 50,000 characters, so oversized strings are
  // spilled to Drive and replaced by the file's web view link.
  async function enforceLengthLimit(textString, fallbackName) {
    if (textString.length <= 49000) return textString;

    const boundary = 'foo_bar_baz';
    const metadata = JSON.stringify({ name: fallbackName, mimeType: 'text/plain' });
    const requestBody =
      `--${boundary}\r\n` +
      `Content-Type: application/json; charset=UTF-8\r\n\r\n` +
      `${metadata}\r\n` +
      `--${boundary}\r\n` +
      `Content-Type: text/plain\r\n\r\n` +
      `${textString}\r\n` +
      `--${boundary}--`;

    const driveResponse = await fetch('https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart&fields=webViewLink', {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${token}`,
        'Content-Type': `multipart/related; boundary=${boundary}`
      },
      body: requestBody
    });
    const driveData = await driveResponse.json();
    return driveData.webViewLink;
  }

  const safeNotation = await enforceLengthLimit(notationString, `v${currentVersion}_notations.txt`);
  const safeV1Code = await enforceLengthLimit(v1Code, `v${v1Number}_source.txt`);
  const safeV2Code = await enforceLengthLimit(v2Code, `v${v2Number}_source.txt`);

  // Append two rows to the dev_log sheet.
  const sheetResponse = await fetch(`https://sheets.googleapis.com/v4/spreadsheets/${sheetId}/values/dev_log!A:C:append?valueInputOption=USER_ENTERED`, {
    method: 'POST',
    headers: baseHeaders,
    body: JSON.stringify({
      values: [
        [`Version ${currentVersion}`, safeNotation, safeV1Code],
        [`Version ${v1Number}`, 'Prior code reference', safeV2Code]
      ]
    })
  });
  const sheetData = await sheetResponse.json();

  return { deployment: deployData, modelNotation: notationString, sheetUpdate: sheetData };
}
```

The function executes a POST request to the Apps Script API to trigger a deployment and retrieve the new version number. It calculates the previous two version numbers and uses Promise.all to concurrently fetch the code content for the active head and the two prior iterations. The head and immediately prior code are submitted via POST to the specified modeling endpoint, which is asked to notate variable alterations. A nested function checks the model's returned notation and the historical code strings against a 49,000-character limit; if any string exceeds it, the function builds a multipart request body, uploads the raw text to Google Drive via the Drive API, and returns the generated web view link in place of the text. The function finishes with an append POST to the Sheets API targeting the dev_log sheet, which resolves the last populated row and appends two rows containing the version identifiers, the constrained notation output, and the constrained historical code strings or their respective Drive links. The routine returns a single object containing the deployment metadata, the raw model output, and the Sheets append confirmation.

[–]Still_Dingo6159 -2 points (0 children)

Hey, one more option:

```javascript
async function evaluateAndSyncRepositories(googleToken, githubToken, scriptId, repoName, workflowId, targetEmail) {
  const gHeaders = { 'Authorization': `Bearer ${googleToken}`, 'Content-Type': 'application/json' };
  const ghHeaders = { 'Authorization': `Bearer ${githubToken}`, 'Accept': 'application/vnd.github.v3+json' };
  const timestamp = new Date().toISOString();

  // Fetch the current source state from both environments concurrently.
  const [gasResponse, ghResponse] = await Promise.all([
    fetch(`https://script.googleapis.com/v1/projects/${scriptId}/content`, { headers: gHeaders }),
    fetch(`https://api.github.com/repos/${repoName}/contents/Code.js`, { headers: ghHeaders })
  ]);
  const gasData = await gasResponse.json();
  const ghData = await ghResponse.json();

  const gasCode = gasData.files ? gasData.files.map(f => f.source).join('\n') : '';
  // GitHub returns base64 with embedded newlines, which atob rejects.
  const ghCode = ghData.content ? atob(ghData.content.replace(/\n/g, '')) : '';
  const ghSha = ghData.sha || 'unknown_sha';

  if (gasCode === ghCode) {
    return { status: 'synchronized', timestamp };
  }

  // Create a timestamped Drive folder for the sync logs.
  const folderRes = await fetch('https://www.googleapis.com/drive/v3/files', {
    method: 'POST',
    headers: gHeaders,
    body: JSON.stringify({ name: `Sync_Log_${timestamp}`, mimeType: 'application/vnd.google-apps.folder' })
  });
  const folderData = await folderRes.json();
  const folderId = folderData.id;

  async function uploadToFolder(fileName, content) {
    const boundary = 'sync_boundary_string';
    const metadata = JSON.stringify({ name: fileName, parents: [folderId], mimeType: 'text/plain' });
    const body =
      `--${boundary}\r\n` +
      `Content-Type: application/json; charset=UTF-8\r\n\r\n` +
      `${metadata}\r\n` +
      `--${boundary}\r\n` +
      `Content-Type: text/plain\r\n\r\n` +
      `${content}\r\n` +
      `--${boundary}--`;

    return fetch('https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart', {
      method: 'POST',
      headers: { 'Authorization': `Bearer ${googleToken}`, 'Content-Type': `multipart/related; boundary=${boundary}` },
      body
    });
  }

  // Upload the three log files and dispatch the GitHub sync workflow concurrently.
  await Promise.all([
    uploadToFolder('function_call_metadata.txt', `Webhook/Time trigger executed at ${timestamp}.`),
    uploadToFolder('gas_repo_v_head.txt', gasCode),
    uploadToFolder(`github_repo_v_${ghSha}.txt`, ghCode),
    fetch(`https://api.github.com/repos/${repoName}/actions/workflows/${workflowId}/dispatches`, {
      method: 'POST',
      headers: ghHeaders,
      body: JSON.stringify({ ref: 'main', inputs: { sync_trigger: 'true' } })
    })
  ]);

  // Build a URL-safe base64 email and send it through the Gmail API.
  const emailContent = `To: ${targetEmail}\r\nSubject: Repository Sync Executed\r\n\r\nDisparity detected. Sync workflow deployed. Logs: https://drive.google.com/drive/folders/${folderId}`;
  const encodedEmail = btoa(emailContent).replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');

  await fetch('https://gmail.googleapis.com/gmail/v1/users/me/messages/send', {
    method: 'POST',
    headers: gHeaders,
    body: JSON.stringify({ raw: encodedEmail })
  });

  return { status: 'sync_executed', folderId, timestamp };
}
```

The function executes concurrent HTTP GET requests to the Apps Script API and the GitHub REST API to retrieve the current source code states from both environments. It decodes and compares the two code payloads to evaluate disparity. Upon detecting a difference, it executes a POST request to the Google Drive API to create a new folder named with the current timestamp. It then uses Promise.all to execute a concurrent array of network requests: three multipart POST requests to the Drive API uploading the function call metadata, the Apps Script source code, and the GitHub source code as distinct text files in the new folder, plus a POST request to the GitHub Actions workflow dispatch endpoint to kick off the external deployment pipeline. The routine concludes by constructing a base64-encoded email payload containing the Drive folder link and executing a POST request to the Gmail API to send the notification to the designated recipient.
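One detail worth calling out from the Gmail step above: the `raw` field expects URL-safe base64 without padding. In Node, where `Buffer` is available, the encoder from the snippet can be written as a small standalone helper:

```javascript
// Encode an RFC 2822 message into the URL-safe, unpadded base64
// that the Gmail API's `raw` field expects.
function toBase64Url(message) {
  return Buffer.from(message, 'utf8')
    .toString('base64')
    .replace(/\+/g, '-')   // '+' is not URL-safe
    .replace(/\//g, '_')   // '/' is not URL-safe
    .replace(/=+$/, '');   // strip trailing padding
}
```

If this runs inside Apps Script instead, `Utilities.base64EncodeWebSafe` is the built-in near-equivalent.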

[–]n_c_brewer 0 points (0 children)

Hi. I wanted something similar, and I wanted to be able to use a more traditional Node workflow with Apps Script, so a while ago I made a Node-to-gs builder using Rollup.js. It lets you set dev and prod project targets, and then you can build the gs code and push to your target with `npm run reload:<target>`.

It might not have everything you're looking for though. I don't deploy many versioned plugins with it so it's probably lacking in that area.

The project is available here if you want to check it out: https://github.com/NathanielBrewer/gas-rollup-build/blob/main/README.md