
Add Profound Agent Analytics Cloudflare Worker #2787

Merged

HarshMN2345 merged 5 commits into main from add-profound-cloudflare-worker on Mar 12, 2026

Conversation

@jalafgani
Contributor

@jalafgani jalafgani commented Mar 5, 2026

Summary

  • Adds a Cloudflare Worker (log-collector/) that captures HTTP request metadata on appwrite.io and forwards it to Profound's Agent Analytics API
  • Worker runs as transparent middleware — no DNS changes required, no impact on request handling
  • Uses Cloudflare Worker route binding for appwrite.io/*
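
A minimal sketch of this pass-through pattern, with the origin fetch and log sender injected so the flow can be shown outside the Workers runtime (names like `makeHandler` and `sendLog` are illustrative, not the actual implementation):

```typescript
// Sketch of the transparent-middleware flow: return the origin response
// unchanged and defer log delivery so it never blocks request handling.
type Fetcher = (request: Request) => Promise<Response>;

export function makeHandler(
    origin: Fetcher,
    sendLog: (record: unknown) => Promise<void>
) {
    return async (
        request: Request,
        waitUntil: (p: Promise<unknown>) => void // ctx.waitUntil in a real Worker
    ): Promise<Response> => {
        const response = await origin(request); // pass the request straight through
        const url = new URL(request.url);
        // Fire-and-forget: delivery failures are logged, never surfaced to the client.
        waitUntil(
            sendLog({ method: request.method, pathname: url.pathname }).catch(
                (error) => console.error('Failed to send logs:', error)
            )
        );
        return response; // caller receives the origin response untouched
    };
}
```

In the deployed worker, `waitUntil` would be `ctx.waitUntil` and `sendLog` a `fetch` POST to Profound's API.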

Setup after merge

  1. cd log-collector && npm install
  2. npx wrangler login
  3. npx wrangler secret put PROFOUND_API_KEY — enter the API key when prompted
  4. npx wrangler deploy

Test plan

  • Verify TypeScript compiles cleanly (npx tsc --noEmit in log-collector/)
  • Deploy to Cloudflare and confirm worker is active on the zone
  • Confirm logs appear in Profound Analytics dashboard

Made with Cursor

Summary by CodeRabbit

  • New Features

    • Added a server-side log-collector service that captures request metadata and sends structured logs to a remote endpoint.
  • Chores

    • Added project configuration and development tooling for the log-collector service.
    • Added .gitignore entries to exclude local dev files and dependency artifacts from version control.

Set up a Cloudflare Worker (log-collector) that captures HTTP request
metadata and forwards it to Profound's Agent Analytics API for
appwrite.io. The PROFOUND_API_KEY secret must be configured via
`npx wrangler secret put PROFOUND_API_KEY` before deploying.

Made-with: Cursor
@coderabbitai
Contributor

coderabbitai bot commented Mar 5, 2026

Walkthrough

Adds a new Cloudflare Workers log-collector project with files: .gitignore, package.json, tsconfig.json, wrangler.json, and src/index.ts. The worker implements a fetch handler that returns the original response and asynchronously posts a structured log record (timestamp, host, method, pathname, query params, client IP, user agent, referer, bytes sent, status) as JSON to PROFOUND_API_URL using PROFOUND_API_KEY. Project configuration includes Wrangler settings, TypeScript config, and devDependencies; .gitignore excludes node_modules, .wrangler, and .dev.vars.
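
The record the walkthrough describes could be typed roughly as follows (field names inferred from the summary above; the actual payload shape may differ):

```typescript
// Illustrative shape of the structured log record; inferred, not the actual type.
export interface ProfoundLogRecord {
    timestamp: string; // ISO-8601
    host: string;
    method: string;
    pathname: string;
    query_params: Record<string, string>;
    ip: string | null; // cf-connecting-ip header, absent outside Cloudflare
    user_agent: string | null;
    referer: string | null;
    bytes_sent: number; // headers plus body
    status: number;
}
```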

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage (⚠️ Warning): docstring coverage is 0.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.

✅ Passed checks (2 passed)

  • Description Check (✅ Passed): check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): the title accurately describes the main change: adding a Cloudflare Worker for Profound Agent Analytics in the log-collector directory.


Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🧹 Nitpick comments (3)
log-collector/package.json (1)

5-9: Consider adding a type-check script for CI validation.

The PR description mentions running npx tsc --noEmit for verification. Adding this as a named script would make the test plan easier to follow and enable CI integration.

💡 Suggested addition
   "scripts": {
     "dev": "wrangler dev",
     "deploy": "wrangler deploy",
-    "cf-typegen": "wrangler types"
+    "cf-typegen": "wrangler types",
+    "typecheck": "tsc --noEmit"
   },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@log-collector/package.json` around lines 5 - 9, Add a new NPM script named
(for example) "typecheck" inside the existing "scripts" object in package.json
that runs the TypeScript compiler with no emit (npx tsc --noEmit or tsc
--noEmit) so CI and contributors can run type-checking via npm run typecheck;
update the "scripts" block (alongside "dev", "deploy", "cf-typegen") to include
this new "typecheck" entry.
log-collector/src/index.ts (2)

41-48: Add response status checking for the analytics API call.

The current implementation only catches network errors. HTTP error responses (4xx/5xx) from the Profound API will not be logged, making it difficult to diagnose integration issues.

💡 Proposed improvement
-    await fetch(env.PROFOUND_API_URL, {
+    const apiResponse = await fetch(env.PROFOUND_API_URL, {
         method: 'POST',
         headers: {
             'Content-Type': 'application/json',
             'X-API-Key': env.PROFOUND_API_KEY
         },
         body: JSON.stringify([logData])
-    }).catch((error) => console.error('Failed to send logs:', error));
+    });
+    
+    if (!apiResponse.ok) {
+        console.error('Failed to send logs:', apiResponse.status, await apiResponse.text().catch(() => ''));
+    }
 }

You may also want to wrap the entire function body in a try-catch to ensure all errors are handled:

async function handleRequest(request: Request, response: Response, env: Env) {
    try {
        // ... existing logic ...
    } catch (error) {
        console.error('Failed to send logs:', error);
    }
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@log-collector/src/index.ts` around lines 41 - 48, Update the POST to the
Profound API to detect and log HTTP errors: after calling
fetch(env.PROFOUND_API_URL, { method: 'POST', headers: {...}, body:
JSON.stringify([logData]) }) check the returned Response for response.ok and if
false read and log status, statusText and optionally response.text() or json()
to include the API error details; also wrap the fetch call (or the entire
handleRequest function) in a try-catch to ensure any thrown exceptions are
caught and logged (use identifiers env.PROFOUND_API_URL, env.PROFOUND_API_KEY,
logData and the handleRequest function to locate where to add the response.ok
check and try-catch).

24-26: Consider memory impact when processing large response bodies.

Calling response.blob() loads the entire response body into memory. For large responses (e.g., file downloads), this could cause memory pressure. If precise byte counting isn't critical, consider using the Content-Length header when available.

💡 Alternative approach using Content-Length
-    const responseBody = await response.blob();
-    const bodySize = responseBody.size;
+    const contentLength = response.headers.get('content-length');
+    const bodySize = contentLength ? parseInt(contentLength, 10) : 0;
     const totalBytesSent = headerSize + bodySize;

Note: Content-Length may not always be present (e.g., chunked encoding), so this trades accuracy for efficiency.
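
A hedged sketch of that fallback: prefer Content-Length when present, and only stream-count when the header is absent, so large bodies are never buffered whole (the function name is illustrative):

```typescript
// Count response bytes without buffering the entire body in memory.
export async function countBodyBytes(response: Response): Promise<number> {
    // Fast path: trust Content-Length when the origin provides it.
    const contentLength = response.headers.get('content-length');
    if (contentLength !== null) {
        return parseInt(contentLength, 10);
    }
    // Fallback (e.g. chunked encoding): stream the body and sum chunk sizes.
    let bytes = 0;
    if (response.body) {
        const reader = response.body.getReader();
        for (;;) {
            const { done, value } = await reader.read();
            if (done) break;
            bytes += value.byteLength;
        }
    }
    return bytes;
}
```

Note that the streaming path consumes the body, so the worker would need to pass a `response.clone()` here if the original body must still be returned to the client.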

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@log-collector/src/index.ts` around lines 24 - 26, The current code calls
response.blob() which buffers the whole body into memory; change the size
calculation to first check response.headers.get('Content-Length') and use that
numeric value if present to compute totalBytesSent (headerSize + contentLength).
If Content-Length is absent, switch to streaming/iterating the response.body
reader to sum chunk.byteLength into bodySize (or only fall back to
response.blob() if streaming is not available), updating the variables
headerSize, bodySize and totalBytesSent accordingly; modify the logic around
response.blob(), response.headers, headerSize, bodySize, and totalBytesSent to
implement this fallback.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 0e0bff48-4160-45eb-a4dc-d9fb25b21f88

📥 Commits

Reviewing files that changed from the base of the PR and between 592e07a and 29829c3.

📒 Files selected for processing (5)
  • log-collector/.gitignore
  • log-collector/package.json
  • log-collector/src/index.ts
  • log-collector/tsconfig.json
  • log-collector/wrangler.json

Comment on lines +10 to +14
"devDependencies": {
"@cloudflare/workers-types": "^4.20250124.0",
"typescript": "^5.5.2",
"wrangler": "^3.101.0"
}
Contributor


⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Check if the specified package versions exist on npm

echo "Checking @cloudflare/workers-types..."
npm view @cloudflare/workers-types versions --json 2>/dev/null | tail -20

echo -e "\nChecking wrangler..."
npm view wrangler version

echo -e "\nChecking typescript..."
npm view typescript version

Repository: appwrite/website

Length of output: 732


Update outdated package versions to current releases.

The dependency versions are outdated:

  • @cloudflare/workers-types is pinned to January 2025 (4.20250124.0) when March 2026 versions exist (latest: 4.20260305.1)
  • wrangler is on major version 3 (3.101.0) while major version 4 (4.70.0) is available
  • typescript should be updated from 5.5.2 to at least 5.9.3

Update these dependencies to leverage bug fixes, security patches, and new features from current releases.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@log-collector/package.json` around lines 10 - 14, Update the devDependencies
in package.json for the three outdated packages: set "@cloudflare/workers-types"
to the current 4.20260305.1, bump "typescript" to at least 5.9.3 (or the latest
stable), and upgrade "wrangler" to the major-4 release (e.g., 4.70.0); then run
your package manager to refresh the lockfile (npm/yarn/pnpm) and run the
test/build scripts to verify compatibility, adjusting any TypeScript or Wrangler
config if compiler/schema errors appear.

method: request.method,
pathname: requestUrl.pathname,
query_params: Object.fromEntries(requestUrl.searchParams),
ip: request.headers.get('cf-connecting-ip'),
Contributor


⚠️ Potential issue | 🟠 Major

Verify compliance with privacy regulations before logging client IP addresses.

Logging cf-connecting-ip (client IP address) to a third-party analytics service constitutes processing of personal data under GDPR/CCPA. Ensure:

  1. This data collection is disclosed in the privacy policy
  2. Appropriate data processing agreements are in place with Profound
  3. Consider whether IP anonymization/hashing would suffice for your analytics needs
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@log-collector/src/index.ts` at line 34, The code reads the client IP via
request.headers.get('cf-connecting-ip') and sends it to Profound, which may
process personal data; before keeping this, ensure privacy compliance by (1)
adding disclosure in the privacy policy about collecting IPs, (2) establishing a
Data Processing Agreement with Profound, and (3) implementing a safer default in
the log pipeline: replace the direct use of
request.headers.get('cf-connecting-ip') with an anonymized/hashed value (or drop
the field) and gate sending via a configurable consent/feature flag so IPs are
only included when legal bases and agreements are in place.

"module": "ESNext",
"moduleResolution": "Bundler",
"lib": ["ESNext"],
"types": ["@cloudflare/workers-types/2023-07-01"],
Contributor


⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

Cloudflare Workers types API versions available 2024 2025

💡 Result:

Cloudflare’s TypeScript runtime typings for Workers are published as the npm package @cloudflare/workers-types, and the versioning is date-based (e.g. 4.YYYYMMDD.patch), with new releases published alongside workerd releases. [1]

Versions you’ll see in 2024 (examples):

  • 4.20240821.1 and 4.20240903.0 [2]
  • 4.20240909.0 and 4.20240919.0 [3]

Versions you’ll see in 2025 (examples):

  • 4.20250628.0 [4]
  • 4.20250816.0 [5]

Note (recommended approach): Cloudflare now recommends using wrangler types to generate types that match your Worker’s compatibility_date and flags, rather than relying on a pinned @cloudflare/workers-types version. [6]

Sources
[1] https://blog.cloudflare.com/improving-workers-types/
[2] https://npmdiff.dev/%40cloudflare%2Fworkers-types/4.20240821.1/4.20240903.0/package/2023-03-01/index.ts/
[3] https://npmdiff.dev/%40cloudflare%2Fworkers-types/4.20240909.0/4.20240919.0/package/package.json/
[4] https://cloudsmith.com/navigator/npm/%40cloudflare/workers-types
[5] https://medium.com/full-stack-engineer/how-to-build-a-typescript-npm-package-for-a-cloudflare-worker-653fef2fdfe2
[6] https://developers.cloudflare.com/workers/languages/typescript/


🏁 Script executed:

fd wrangler.toml log-collector/

Repository: appwrite/website

Length of output: 42


🏁 Script executed:

# Search for wrangler.toml anywhere in the repo
find . -name "wrangler.toml" -type f 2>/dev/null

# List files in log-collector directory
ls -la log-collector/

# Read the tsconfig.json
cat -n log-collector/tsconfig.json

Repository: appwrite/website

Length of output: 903


🏁 Script executed:

cat -n log-collector/wrangler.json

Repository: appwrite/website

Length of output: 554


Update TypeScript types to match Worker's compatibility_date or use wrangler types instead.

The tsconfig.json references @cloudflare/workers-types/2023-07-01, but wrangler.json sets compatibility_date to 2025-01-29. This mismatch means TypeScript won't recognize APIs available in the newer compatibility version. Cloudflare now recommends using wrangler types to auto-generate types that match your Worker's actual compatibility_date, rather than manually pinning a types version.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@log-collector/tsconfig.json` at line 7, tsconfig.json currently pins the
"types" entry to "@cloudflare/workers-types/2023-07-01" which mismatches the
Worker compatibility_date in wrangler.json; update the project to use types that
match the compatibility_date by either removing or replacing the hardcoded
"types" entry and generating/updating types via the recommended workflow (run
wrangler types to generate matching definitions for the compatibility_date in
wrangler.json) so the "types" array and wrangler.json's compatibility_date
remain consistent.

@jalafgani jalafgani requested a review from HarshMN2345 March 5, 2026 19:40
Contributor

@coderabbitai coderabbitai bot left a comment


♻️ Duplicate comments (1)
log-collector/package.json (1)

11-13: ⚠️ Potential issue | 🟠 Major

Refresh the Worker toolchain pins before merge.

These version pins are still the older ones already called out in the earlier review. Since this is a brand-new Worker being introduced to production traffic, I’d update the Cloudflare/TypeScript toolchain now instead of starting from known-stale versions.

#!/bin/bash
set -euo pipefail

for pkg in @cloudflare/workers-types typescript wrangler; do
  echo "=== $pkg ==="
  npm view "$pkg" version
  echo
done
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@log-collector/package.json` around lines 11 - 13, Update the pinned versions
for the Cloudflare worker toolchain packages in package.json: refresh the
version strings for "@cloudflare/workers-types", "typescript", and "wrangler" to
the current stable releases (run npm view <pkg> version or your preferred
registry query to get the latest versions) and replace the existing version
ranges in the dependencies/devDependencies block so the new Worker uses the
latest toolchain pins; ensure package.json remains valid JSON and run a quick
install (npm install) to verify there are no breaking changes.
🧹 Nitpick comments (1)
log-collector/package.json (1)

5-8: Add the documented compile check as an npm script.

The PR test plan relies on tsc --noEmit, but package.json does not expose it. Adding a typecheck script keeps local and CI verification aligned with the rollout steps.

♻️ Proposed change
     "scripts": {
         "dev": "wrangler dev",
         "deploy": "wrangler deploy",
-        "cf-typegen": "wrangler types"
+        "cf-typegen": "wrangler types",
+        "typecheck": "tsc --noEmit"
     },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@log-collector/package.json` around lines 5 - 8, package.json is missing an
npm script for the documented compile check; add a "typecheck" entry under the
"scripts" object that runs the TypeScript compiler in no-emit mode (i.e., "tsc
--noEmit") so local and CI verification match the PR test plan and rollout
steps, updating the existing "scripts" block where "dev", "deploy", and
"cf-typegen" are defined.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: ad154a5a-04b8-4ea7-8ea5-3925395383f5

📥 Commits

Reviewing files that changed from the base of the PR and between 29829c3 and 0afd148.

📒 Files selected for processing (3)
  • log-collector/package.json
  • log-collector/tsconfig.json
  • log-collector/wrangler.json
🚧 Files skipped from review as they are similar to previous changes (2)
  • log-collector/wrangler.json
  • log-collector/tsconfig.json

Contributor

@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (1)
log-collector/src/index.ts (1)

41-48: Consider adding response status checking and better error diagnostics.

The current error handling only catches network failures. If the API returns an error status (4xx/5xx) or if PROFOUND_API_KEY is not configured, the failure is silent. This could make debugging difficult when logs aren't appearing in Profound.

♻️ Proposed improvement for error handling
-    await fetch(env.PROFOUND_API_URL, {
+    if (!env.PROFOUND_API_KEY) {
+        console.error('PROFOUND_API_KEY is not configured');
+        return;
+    }
+
+    const apiResponse = await fetch(env.PROFOUND_API_URL, {
         method: 'POST',
         headers: {
             'Content-Type': 'application/json',
             'X-API-Key': env.PROFOUND_API_KEY
         },
         body: JSON.stringify([logData])
-    }).catch((error) => console.error('Failed to send logs:', error));
+    }).catch((error) => {
+        console.error('Failed to send logs:', error);
+        return null;
+    });
+
+    if (apiResponse && !apiResponse.ok) {
+        console.error(`Profound API error: ${apiResponse.status} ${apiResponse.statusText}`);
+    }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@log-collector/src/index.ts` around lines 41 - 48, Before calling fetch to
send logData to env.PROFOUND_API_URL, validate env.PROFOUND_API_KEY is set and
log a clear error if missing; then perform the POST, await the response, check
response.ok and when false log response.status, response.statusText and attempt
to read and include the response body (text/json) for diagnostics, and still
catch network exceptions to log the error and any request payload (logData) for
troubleshooting; update the existing anonymous fetch usage to the new flow so
failures from non-2xx responses are not silent.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 29789590-1e88-4154-b149-45185212e31b

📥 Commits

Reviewing files that changed from the base of the PR and between 0afd148 and 93acd35.

📒 Files selected for processing (1)
  • log-collector/src/index.ts

@HarshMN2345 HarshMN2345 merged commit 51fa8d7 into main Mar 12, 2026
6 checks passed
@HarshMN2345 HarshMN2345 deleted the add-profound-cloudflare-worker branch March 12, 2026 11:24
@jalafgani
Contributor Author

Hi @TorstenDittmann ,

We need some help getting this analytics tool set up. I just added you to Profound.

It's an Answer Engine Optimization tool. The goal is to get Appwrite recommended by ChatGPT/Codex over Supabase/Vercel.


Here are some of the things we tried, but I'm not sure I'm doing it right.

cd log-collector && npm install
npx wrangler login
npx wrangler secret put PROFOUND_API_KEY — enter the API key when prompted
npx wrangler deploy

PROFOUND_API_KEY=bot_a24aee5b-c3c4-475e-8b48-78f5a5b2049c
