improved logging
@@ -272,7 +272,9 @@ if (is_actor(some_value)) {
### log

-Logging functions: `log.console(msg)`, `log.error(msg)`, `log.system(msg)`.
+Channel-based logging. Any `log.X(value)` writes to channel `"X"`. Three channels are conventional: `log.console(msg)`, `log.error(msg)`, `log.system(msg)` — but any name works.
+
+Channels are routed to configurable **sinks** (console or file) defined in `.cell/log.toml`. See [Logging](/docs/logging/) for the full guide.

### use(path)
docs/cli.md
@@ -286,6 +286,84 @@ Clean build artifacts.

pit clean
```

## Logging

### pit log

Manage log sinks and read log files. See [Logging](/docs/logging/) for the full guide.

### pit log list

List configured sinks.

```bash
pit log list
```

### pit log add

Add a log sink.

```bash
pit log add <name> console [options]      # add a console sink
pit log add <name> file <path> [options]  # add a file sink
```

Options:

- `--format=pretty|bare|json` — output format (default: `pretty` for console, `json` for file)
- `--channels=ch1,ch2` — channels to subscribe (default: `console,error,system`). Use `'*'` for all channels (quote to prevent shell glob expansion).
- `--exclude=ch1,ch2` — channels to exclude (useful with `'*'`)

```bash
pit log add terminal console --format=bare --channels=console
pit log add errors file .cell/logs/errors.jsonl --channels=error
pit log add dump file .cell/logs/dump.jsonl '--channels=*' --exclude=console
```

### pit log remove

Remove a sink.

```bash
pit log remove <name>
```

### pit log read

Read entries from a file sink.

```bash
pit log read <sink> [options]
```

Options:

- `--lines=N` — show last N entries
- `--channel=X` — filter by channel
- `--since=timestamp` — only show entries after timestamp (seconds since epoch)

```bash
pit log read errors --lines=50
pit log read dump --channel=debug --lines=10
pit log read errors --since=1702656000
```

### pit log tail

Follow a file sink in real time.

```bash
pit log tail <sink> [--lines=N]
```

`--lines=N` controls how many existing entries to show on start (default: 10).
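
Tailing a file sink amounts to size-based polling: remember the last file size, and when the file grows, parse and print only the newly appended JSONL lines. A minimal Python sketch of that incremental step (illustrative only; `pit log tail` implements this natively, and the helper name here is hypothetical):

```python
import json

def new_entries(content: str, last_size: int) -> list:
    """Return parsed records from the portion of a JSONL file
    appended after the first `last_size` bytes."""
    added = content[last_size:]
    records = []
    for line in added.splitlines():
        if not line.strip():
            continue
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # skip partial or corrupt lines, as a tail loop must
    return records
```

A follow loop would call this each poll interval, then update `last_size` to the current file size.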

```bash
pit log tail dump
pit log tail errors --lines=20
```

## Developer Commands

Compiler pipeline tools, analysis, and testing. These are primarily useful for developing the ƿit compiler and runtime.
docs/logging.md (new file)
@@ -0,0 +1,202 @@
---
title: "Logging"
description: "Configurable channel-based logging with sinks"
weight: 25
type: "docs"
---

Logging in ƿit is channel-based. Any `log.X(value)` call writes to channel `"X"`. Channels are routed to **sinks** — named destinations that format and deliver log output to the console or to files.

## Channels

Three channels are conventional:

| Channel | Usage |
|---------|-------|
| `log.console(msg)` | General output |
| `log.error(msg)` | Errors and warnings |
| `log.system(msg)` | Internal system messages |

Any name works. `log.debug(msg)` creates channel `"debug"`, `log.perf(msg)` creates `"perf"`, and so on.

```javascript
log.console("server started on port 8080")
log.error("connection refused")
log.debug({query: "SELECT *", rows: 42})
```

Non-text values are JSON-encoded automatically.
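
That auto-encoding rule can be sketched in a few lines (Python for illustration; the runtime does the equivalent internally, and this function name is hypothetical): text passes through unchanged, anything else becomes its JSON encoding.

```python
import json

def encode_event(value):
    # Text passes through; non-text values are JSON-encoded,
    # mirroring the auto-encoding rule described above.
    if isinstance(value, str):
        return value
    return json.dumps(value, separators=(",", ":"))
```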

## Default Behavior

With no configuration, a default sink routes `console`, `error`, and `system` to the terminal in pretty format:

```
[a3f12] [console] main.ce:5 server started on port 8080
[a3f12] [error] main.ce:12 connection refused
```

The format is `[actor_id] [channel] file:line message`.

## Configuration

Logging is configured in `.cell/log.toml`. Each `[sink.NAME]` section defines a sink.

```toml
[sink.terminal]
type = "console"
format = "bare"
channels = ["console"]

[sink.errors]
type = "file"
path = ".cell/logs/errors.jsonl"
channels = ["error"]

[sink.everything]
type = "file"
path = ".cell/logs/all.jsonl"
channels = ["*"]
exclude = ["console"]
```

### Sink fields

| Field | Values | Description |
|-------|--------|-------------|
| `type` | `"console"`, `"file"` | Where output goes |
| `format` | `"pretty"`, `"bare"`, `"json"` | How output is formatted |
| `channels` | array of names, or `["*"]` | Which channels this sink receives |
| `exclude` | array of names | Channels to skip (useful with `"*"`) |
| `path` | file path | Output file (file sinks only) |
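
The routing rule these fields imply can be stated as a small predicate (illustrative Python, not the runtime's code): a sink receives a channel when the channel is listed (or the sink subscribes to `*`) and the channel is not excluded.

```python
def sink_receives(sink: dict, channel: str) -> bool:
    """Does this sink receive messages on `channel`?
    Exclusion wins over subscription."""
    channels = sink.get("channels", [])
    exclude = sink.get("exclude", [])
    if channel in exclude:
        return False
    return "*" in channels or channel in channels
```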

### Formats

**pretty** — human-readable, one line per message. Includes actor ID, channel, source location, and message.

```
[a3f12] [console] main.ce:5 server started
```

**bare** — minimal. Actor ID and message only.

```
[a3f12] server started
```

**json** — structured JSONL (one JSON object per line). Used for file sinks and machine consumption.

```json
{"actor_id":"a3f12...","timestamp":1702656000.5,"channel":"console","event":"server started","source":{"file":"main.ce","line":5,"column":3,"function":"init"}}
```

## Log Records

Every log call produces a record:

```javascript
{
  actor_id: "a3f12...",     // full actor GUID
  timestamp: 1702656000.5,  // seconds since epoch
  channel: "console",       // channel name
  event: "the message",     // value passed to log
  source: {
    file: "main.ce",
    line: 5,
    column: 3,
    function: "init"
  }
}
```

File sinks write one JSON-encoded record per line. Console sinks format the record according to their format setting.
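
Because a file sink is plain JSONL, any tool can consume it. A minimal reader in Python (a hypothetical helper, equivalent in spirit to what `pit log read` does with `--channel` and `--since`):

```python
import json

def read_sink(text: str, channel=None, since=0):
    """Parse a file sink's JSONL content, optionally filtering
    by channel name and minimum timestamp."""
    out = []
    for line in text.splitlines():
        if not line.strip():
            continue
        rec = json.loads(line)
        if channel and rec.get("channel") != channel:
            continue
        if since and rec.get("timestamp", 0) < since:
            continue
        out.append(rec)
    return out
```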

## CLI

The `pit log` command manages sinks and reads log files. See [CLI — pit log](/docs/cli/#pit-log) for the full reference.

```bash
pit log list                         # show sinks
pit log add terminal console --format=bare --channels=console
pit log add dump file .cell/logs/dump.jsonl --channels='*' --exclude=console
pit log remove terminal
pit log read dump --lines=20 --channel=error
pit log tail dump
```

## Examples

### Development setup

Route console output to the terminal with minimal formatting. Send everything else to a structured log file for debugging.

```toml
[sink.terminal]
type = "console"
format = "bare"
channels = ["console"]

[sink.debug]
type = "file"
path = ".cell/logs/debug.jsonl"
channels = ["*"]
exclude = ["console"]
```

```javascript
log.console("listening on :8080")  // -> terminal: [a3f12] listening on :8080
log.error("bad request")           // -> debug.jsonl only
log.debug({latency: 0.042})        // -> debug.jsonl only
```

### Separate error log

Keep a dedicated error log alongside a full dump.

```toml
[sink.terminal]
type = "console"
format = "pretty"
channels = ["console", "error", "system"]

[sink.errors]
type = "file"
path = ".cell/logs/errors.jsonl"
channels = ["error"]

[sink.all]
type = "file"
path = ".cell/logs/all.jsonl"
channels = ["*"]
```

### JSON console

Output structured JSON to the console for piping into other tools.

```toml
[sink.json_out]
type = "console"
format = "json"
channels = ["console", "error"]
```

```bash
pit run myapp.ce | jq '.event'
```

### Reading logs

```bash
# Last 50 error entries
pit log read errors --lines=50

# Errors since a timestamp
pit log read errors --since=1702656000

# Filter a wildcard sink to one channel
pit log read all --channel=debug --lines=10

# Follow a log file in real time
pit log tail all
```
@@ -316,31 +316,19 @@ var nota = use_core('nota')
var ENETSERVICE = 0.1
var REPLYTIMEOUT = 60 // seconds before replies are ignored

function caller_data(depth)
{
  return {file: "nofile", line: 0}
}
// --- Logging system (bootstrap phase) ---
// Early log: prints to console before toml/time/json are loaded.
// Upgraded to full sink-based system after config loads (see load_log_config below).

function console_rec(line, file, msg) {
  return `[${text(_cell.id, 0, 5)}] [${file}:${line}]: ${msg}\n`
  // time: [${time.text("mb d yyyy h:nn:ss")}]
}
var log_config = null
var channel_sinks = {}
var wildcard_sinks = []
var warned_channels = {}

function log(name, args) {
  var caller = caller_data(1)
  var msg = args[0]

  if (name == 'console') {
    os.print(console_rec(caller.line, caller.file, msg))
  } else if (name == 'error') {
    if (msg == null) msg = "error"
    os.print(console_rec(caller.line, caller.file, msg))
  } else if (name == 'system') {
    msg = "[SYSTEM] " + msg
    os.print(console_rec(caller.line, caller.file, msg))
  } else {
    log.console(`unknown log type: ${name}`)
  }
  if (msg == null) msg = ""
  os.print(`[${text(_cell.id, 0, 5)}] [${name}]: ${msg}\n`)
}

function actor_die(err)
@@ -428,6 +416,134 @@ core_extras.native_mode = native_mode
var shop = use_core('internal/shop')
if (native_mode) use_core('build')
var time = use_core('time')
var toml = use_core('toml')

// --- Logging system (full version) ---
// Now that toml, time, fd, and json are available, upgrade the log function
// from the bootstrap version to a configurable sink-based system.

function ensure_log_dir(path) {
  var parts = array(path, '/')
  var current = starts_with(path, '/') ? '/' : ''
  var i = 0
  // ensure parent dir (skip last element which is the filename)
  for (i = 0; i < length(parts) - 1; i++) {
    if (parts[i] == '') continue
    current = current + parts[i] + '/'
    if (!fd.is_dir(current)) fd.mkdir(current)
  }
}

function build_sink_routing() {
  channel_sinks = {}
  wildcard_sinks = []
  var names = array(log_config.sink)
  arrfor(names, function(name) {
    var sink = log_config.sink[name]
    sink._name = name
    if (!is_array(sink.channels)) sink.channels = []
    if (is_text(sink.exclude)) sink.exclude = [sink.exclude]
    if (!is_array(sink.exclude)) sink.exclude = []
    if (sink.type == "file" && sink.path) ensure_log_dir(sink.path)
    arrfor(sink.channels, function(ch) {
      if (ch == "*") {
        wildcard_sinks[] = sink
        return
      }
      if (!channel_sinks[ch]) channel_sinks[ch] = []
      channel_sinks[ch][] = sink
    })
  })
}

function load_log_config() {
  var log_path = null
  if (shop_path) {
    log_path = shop_path + '/log.toml'
    if (fd.is_file(log_path)) {
      log_config = toml.decode(text(fd.slurp(log_path)))
    }
  }
  if (!log_config || !log_config.sink) {
    log_config = {
      sink: {
        terminal: {
          type: "console",
          format: "pretty",
          channels: ["console", "error", "system"]
        }
      }
    }
  }
  build_sink_routing()
}

function pretty_format(rec) {
  var aid = text(rec.actor_id, 0, 5)
  var src = ""
  if (rec.source && rec.source.file)
    src = rec.source.file + ":" + text(rec.source.line)
  var ev = is_text(rec.event) ? rec.event : json.encode(rec.event, false)
  return `[${aid}] [${rec.channel}] ${src} ${ev}\n`
}

function bare_format(rec) {
  var aid = text(rec.actor_id, 0, 5)
  var ev = is_text(rec.event) ? rec.event : json.encode(rec.event, false)
  return `[${aid}] ${ev}\n`
}

function sink_excluded(sink, channel) {
  var excluded = false
  if (!sink.exclude || length(sink.exclude) == 0) return false
  arrfor(sink.exclude, function(ex) {
    if (ex == channel) excluded = true
  })
  return excluded
}

function dispatch_to_sink(sink, rec) {
  var line = null
  if (sink_excluded(sink, rec.channel)) return
  if (sink.type == "console") {
    if (sink.format == "json")
      os.print(json.encode(rec, false) + "\n")
    else if (sink.format == "bare")
      os.print(bare_format(rec))
    else
      os.print(pretty_format(rec))
  } else if (sink.type == "file") {
    line = json.encode(rec, false) + "\n"
    fd.slurpappend(sink.path, stone(blob(line)))
  }
}

load_log_config()

log = function(name, args) {
  var sinks = channel_sinks[name]
  var event = args[0]

  if (!sinks && length(wildcard_sinks) == 0) {
    if (!warned_channels[name]) {
      warned_channels[name] = true
      os.print(`[warn] log channel '${name}' has no sinks configured\n`)
    }
    return
  }

  var caller = caller_info(2)
  var rec = {
    actor_id: _cell.id,
    timestamp: time.number(),
    channel: name,
    event: event,
    source: caller
  }

  if (sinks) arrfor(sinks, function(sink) { dispatch_to_sink(sink, rec) })
  arrfor(wildcard_sinks, function(sink) { dispatch_to_sink(sink, rec) })
}

var pronto = use_core('pronto')
var fallback = pronto.fallback
@@ -438,6 +554,7 @@ var sequence = pronto.sequence
runtime_env.actor = actor
runtime_env.log = log
runtime_env.send = send
runtime_env.shop_path = shop_path
runtime_env.fallback = fallback
runtime_env.parallel = parallel
runtime_env.race = race
@@ -584,6 +584,37 @@ JSC_CCALL(fd_slurpwrite,
  return JS_NULL;
)

JSC_CCALL(fd_slurpappend,
  size_t len;
  const char *data = js_get_blob_data(js, &len, argv[1]);

  if (!data && len > 0)
    return JS_EXCEPTION;

  const char *str = JS_ToCString(js, argv[0]);

  if (!str) return JS_EXCEPTION;
  int fd = open(str, O_WRONLY | O_CREAT | O_APPEND, 0644);
  if (fd < 0) {
    ret = JS_ThrowInternalError(js, "open failed for %s: %s", str, strerror(errno));
    JS_FreeCString(js, str);
    return ret;
  }

  ssize_t written = write(fd, data, len);
  close(fd);

  if (written != (ssize_t)len) {
    ret = JS_ThrowInternalError(js, "write failed for %s: %s", str, strerror(errno));
    JS_FreeCString(js, str);
    return ret;
  }

  JS_FreeCString(js, str);

  return JS_NULL;
)

// Helper function for recursive enumeration
static void visit_directory(JSContext *js, JSValue *results, int *result_count, const char *curr_path, const char *rel_prefix, int recurse) {
  if (!curr_path) return;

@@ -733,6 +764,7 @@ static const JSCFunctionListEntry js_fd_funcs[] = {
  MIST_FUNC_DEF(fd, read, 2),
  MIST_FUNC_DEF(fd, slurp, 1),
  MIST_FUNC_DEF(fd, slurpwrite, 2),
  MIST_FUNC_DEF(fd, slurpappend, 2),
  MIST_FUNC_DEF(fd, lseek, 3),
  MIST_FUNC_DEF(fd, getcwd, 0),
  MIST_FUNC_DEF(fd, rmdir, 2),
log.ce (new file)
@@ -0,0 +1,343 @@
// cell log - Manage and read log sinks
//
// Usage:
//   cell log list                          List configured sinks
//   cell log add <name> console [opts]     Add a console sink
//   cell log add <name> file <path> [opts] Add a file sink
//   cell log remove <name>                 Remove a sink
//   cell log read <sink> [opts]            Read from a file sink
//   cell log tail <sink> [--lines=N]       Follow a file sink

var toml = use('toml')
var fd = use('fd')
var json = use('json')

var log_path = shop_path + '/log.toml'

function load_config() {
  if (fd.is_file(log_path)) {
    return toml.decode(text(fd.slurp(log_path)))
  }
  return null
}

function ensure_dir(path) {
  if (fd.is_dir(path)) return
  var parts = array(path, '/')
  var current = starts_with(path, '/') ? '/' : ''
  var i = 0
  for (i = 0; i < length(parts); i++) {
    if (parts[i] == '') continue
    current = current + parts[i] + '/'
    if (!fd.is_dir(current)) fd.mkdir(current)
  }
}

function save_config(config) {
  ensure_dir(shop_path)
  fd.slurpwrite(log_path, stone(blob(toml.encode(config))))
}

function print_help() {
  log.console("Usage: cell log <command> [options]")
  log.console("")
  log.console("Commands:")
  log.console("  list                           List configured sinks")
  log.console("  add <name> console [opts]      Add a console sink")
  log.console("  add <name> file <path> [opts]  Add a file sink")
  log.console("  remove <name>                  Remove a sink")
  log.console("  read <sink> [opts]             Read from a file sink")
  log.console("  tail <sink> [--lines=N]        Follow a file sink")
  log.console("")
  log.console("Options for add:")
  log.console("  --format=pretty|bare|json  Output format (default: pretty for console, json for file)")
  log.console("  --channels=ch1,ch2         Channels to subscribe (default: console,error,system)")
  log.console("  --exclude=ch1,ch2          Channels to exclude (for wildcard sinks)")
  log.console("")
  log.console("Options for read:")
  log.console("  --lines=N          Show last N lines (default: all)")
  log.console("  --channel=X        Filter by channel")
  log.console("  --since=timestamp  Only show entries after timestamp")
}

function parse_opt(arg, prefix) {
  var full = '--' + prefix + '='
  if (starts_with(arg, full))
    return text(arg, length(full), length(arg))
  return null
}

function format_entry(entry) {
  var aid = text(entry.actor_id, 0, 5)
  var src = ""
  var ev = null
  if (entry.source && entry.source.file)
    src = entry.source.file + ":" + text(entry.source.line)
  ev = is_text(entry.event) ? entry.event : json.encode(entry.event)
  return "[" + aid + "] [" + entry.channel + "] " + src + " " + ev
}

function do_list() {
  var config = load_config()
  var names = null
  if (!config || !config.sink) {
    log.console("No log sinks configured.")
    log.console("Default: console pretty for console/error/system")
    return
  }
  names = array(config.sink)
  arrfor(names, function(n) {
    var s = config.sink[n]
    var ch = is_array(s.channels) ? text(s.channels, ', ') : '(none)'
    var ex = is_array(s.exclude) ? " exclude=" + text(s.exclude, ',') : ""
    var fmt = s.format || (s.type == 'file' ? 'json' : 'pretty')
    if (s.type == 'file')
      log.console("  " + n + ": " + s.type + " -> " + s.path + " [" + ch + "] format=" + fmt + ex)
    else
      log.console("  " + n + ": " + s.type + " [" + ch + "] format=" + fmt + ex)
  })
}

function do_add() {
  var name = null
  var sink_type = null
  var path = null
  var format = null
  var channels = ["console", "error", "system"]
  var exclude = null
  var config = null
  var val = null
  var i = 0
  if (length(args) < 3) {
    log.error("Usage: cell log add <name> console|file [path] [options]")
    return
  }
  name = args[1]
  sink_type = args[2]

  if (sink_type == 'file') {
    if (length(args) < 4) {
      log.error("Usage: cell log add <name> file <path> [options]")
      return
    }
    path = args[3]
    format = "json"
    i = 4
  } else if (sink_type == 'console') {
    format = "pretty"
    i = 3
  } else {
    log.error("Unknown sink type: " + sink_type + " (use 'console' or 'file')")
    return
  }

  for (i = i; i < length(args); i++) {
    val = parse_opt(args[i], 'format')
    if (val) { format = val; continue }
    val = parse_opt(args[i], 'channels')
    if (val) { channels = array(val, ','); continue }
    val = parse_opt(args[i], 'exclude')
    if (val) { exclude = array(val, ','); continue }
  }

  config = load_config()
  if (!config) config = {}
  if (!config.sink) config.sink = {}

  config.sink[name] = {type: sink_type, format: format, channels: channels}
  if (path) config.sink[name].path = path
  if (exclude) config.sink[name].exclude = exclude

  save_config(config)
  log.console("Added sink: " + name)
}

function do_remove() {
  var name = null
  var config = null
  if (length(args) < 2) {
    log.error("Usage: cell log remove <name>")
    return
  }
  name = args[1]
  config = load_config()
  if (!config || !config.sink || !config.sink[name]) {
    log.error("Sink not found: " + name)
    return
  }
  config.sink[name] = null
  save_config(config)
  log.console("Removed sink: " + name)
}

function do_read() {
  var name = null
  var max_lines = 0
  var filter_channel = null
  var since = 0
  var config = null
  var sink = null
  var content = null
  var lines = null
  var entries = []
  var entry = null
  var val = null
  var i = 0

  if (length(args) < 2) {
    log.error("Usage: cell log read <sink_name> [options]")
    return
  }
  name = args[1]

  for (i = 2; i < length(args); i++) {
    val = parse_opt(args[i], 'lines')
    if (val) { max_lines = number(val); continue }
    val = parse_opt(args[i], 'channel')
    if (val) { filter_channel = val; continue }
    val = parse_opt(args[i], 'since')
    if (val) { since = number(val); continue }
  }

  config = load_config()
  if (!config || !config.sink || !config.sink[name]) {
    log.error("Sink not found: " + name)
    return
  }
  sink = config.sink[name]
  if (sink.type != 'file') {
    log.error("Can only read from file sinks")
    return
  }
  if (!fd.is_file(sink.path)) {
    log.console("Log file does not exist yet: " + sink.path)
    return
  }

  content = text(fd.slurp(sink.path))
  lines = array(content, '\n')

  arrfor(lines, function(line) {
    var parse_fn = null
    if (length(line) == 0) return
    parse_fn = function() {
      entry = json.decode(line)
    } disruption {
      entry = null
    }
    parse_fn()
    if (!entry) return
    if (filter_channel && entry.channel != filter_channel) return
    if (since > 0 && entry.timestamp < since) return
    entries[] = entry
  })

  if (max_lines > 0 && length(entries) > max_lines)
    entries = array(entries, length(entries) - max_lines, length(entries))

  arrfor(entries, function(e) {
    log.console(format_entry(e))
  })
}

function do_tail() {
  var name = null
  var tail_lines = 10
  var config = null
  var sink = null
  var last_size = 0
  var val = null
  var i = 0

  if (length(args) < 2) {
    log.error("Usage: cell log tail <sink_name> [--lines=N]")
    return
  }
  name = args[1]

  for (i = 2; i < length(args); i++) {
    val = parse_opt(args[i], 'lines')
    if (val) { tail_lines = number(val); continue }
  }

  config = load_config()
  if (!config || !config.sink || !config.sink[name]) {
    log.error("Sink not found: " + name)
    return
  }
  sink = config.sink[name]
  if (sink.type != 'file') {
    log.error("Can only tail file sinks")
    return
  }
  if (!fd.is_file(sink.path))
    log.console("Waiting for log file: " + sink.path)

  function poll() {
    var st = null
    var poll_content = null
    var poll_lines = null
    var start = 0
    var poll_entry = null
    var old_line_count = 0
    var idx = 0
    var parse_fn = null
    if (!fd.is_file(sink.path)) {
      $delay(poll, 1)
      return
    }
    st = fd.stat(sink.path)
    if (st.size == last_size) {
      $delay(poll, 1)
      return
    }

    poll_content = text(fd.slurp(sink.path))
    poll_lines = array(poll_content, '\n')

    if (last_size == 0 && length(poll_lines) > tail_lines) {
      start = length(poll_lines) - tail_lines
    } else if (last_size > 0) {
      old_line_count = length(array(text(poll_content, 0, last_size), '\n'))
      start = old_line_count
    }

    last_size = st.size
    for (idx = start; idx < length(poll_lines); idx++) {
      if (length(poll_lines[idx]) == 0) continue
      parse_fn = function() {
        poll_entry = json.decode(poll_lines[idx])
      } disruption {
        poll_entry = null
      }
      parse_fn()
      if (!poll_entry) continue
      os.print(format_entry(poll_entry) + "\n")
    }
    $delay(poll, 1)
  }

  poll()
}

// Main dispatch
if (length(args) == 0) {
  print_help()
} else if (args[0] == 'help' || args[0] == '-h' || args[0] == '--help') {
  print_help()
} else if (args[0] == 'list') {
  do_list()
} else if (args[0] == 'add') {
  do_add()
} else if (args[0] == 'remove') {
  do_remove()
} else if (args[0] == 'read') {
  do_read()
} else if (args[0] == 'tail') {
  do_tail()
} else {
  log.error("Unknown command: " + args[0])
  print_help()
}

$stop()
@@ -8143,6 +8143,56 @@ static JSValue js_stacktrace (JSContext *ctx, JSValue this_val, int argc, JSValu
  return JS_NULL;
}

static JSValue js_caller_info (JSContext *ctx, JSValue this_val, int argc, JSValue *argv) {
  int depth = 0;
  if (argc > 0) JS_ToInt32(ctx, &depth, argv[0]);

  /* Save frame pointer — JS_GetStack clears it */
  JSValue saved_frame = ctx->reg_current_frame;
  uint32_t saved_pc = ctx->current_register_pc;

  cJSON *stack = JS_GetStack(ctx);

  /* Restore so other callers still see the frame */
  ctx->reg_current_frame = saved_frame;
  ctx->current_register_pc = saved_pc;

  const char *fn_str = "<anonymous>";
  const char *file_str = "<unknown>";
  int line = 0, col = 0;

  if (stack) {
    int n = cJSON_GetArraySize(stack);
    /* depth 0 = immediate caller of caller_info, which is frame index 1
       (frame 0 is caller_info itself) */
    int idx = depth + 1;
    if (idx >= n) idx = n - 1;
    if (idx < 0) idx = 0;

    cJSON *fr = cJSON_GetArrayItem(stack, idx);
    const char *v;
    v = cJSON_GetStringValue(cJSON_GetObjectItemCaseSensitive(fr, "function"));
    if (v) fn_str = v;
    v = cJSON_GetStringValue(cJSON_GetObjectItemCaseSensitive(fr, "file"));
    if (v) file_str = v;
    line = (int)cJSON_GetNumberValue(cJSON_GetObjectItemCaseSensitive(fr, "line"));
    col = (int)cJSON_GetNumberValue(cJSON_GetObjectItemCaseSensitive(fr, "column"));
  }

  JSGCRef obj;
  JS_PushGCRef(ctx, &obj);
  obj.val = JS_NewObject(ctx);
  JS_SetPropertyStr(ctx, obj.val, "file", JS_NewString(ctx, file_str));
  JS_SetPropertyStr(ctx, obj.val, "line", JS_NewInt32(ctx, line));
  JS_SetPropertyStr(ctx, obj.val, "column", JS_NewInt32(ctx, col));
  JS_SetPropertyStr(ctx, obj.val, "function", JS_NewString(ctx, fn_str));
  JSValue result = obj.val;
  JS_PopGCRef(ctx, &obj);

  if (stack) cJSON_Delete(stack);
  return result;
}

/* ----------------------------------------------------------------------------
 * array function and sub-functions
 * ----------------------------------------------------------------------------

@@ -11452,6 +11502,7 @@ static void JS_AddIntrinsicBaseObjects (JSContext *ctx) {
    /* I/O functions */
    js_set_global_cfunc(ctx, "print", js_print, -1); /* variadic: length < 0 means no arg limit */
    js_set_global_cfunc(ctx, "stacktrace", js_stacktrace, 0);
    js_set_global_cfunc(ctx, "caller_info", js_caller_info, 1);
  }
}
@@ -7,6 +7,8 @@ sections:
    url: "/docs/actors/"
  - title: "Requestors"
    url: "/docs/requestors/"
  - title: "Logging"
    url: "/docs/logging/"
- title: "Reference"
  pages:
  - title: "Built-in Functions"