Compare commits
steam...a1b41d5ecf (992 Commits)

[Commit table omitted: only the SHA1 column survived extraction (992 rows, from a1b41d5ecf back to 709f2459e4); the Author and Date cells were empty.]
@@ -1,8 +0,0 @@
[dependencies]
extramath = "https://gitea.pockle.world/john/extramath@master"

[system]
ar_timer = 60       # seconds before idle actor reclamation
actor_memory = 0    # MB of memory an actor can use; 0 for unbounded
net_service = 0.1   # seconds per net service pull
reply_timeout = 60  # seconds to hold callback for reply messages; 0 for unbounded
@@ -1,6 +0,0 @@
[modules]
[modules.extramath]
hash = "MCLZT3JABTAENS4WVXKGWJ7JPBLZER4YQ5VN2PE7ZD2Z4WYGTIMA===="
url = "https://gitea.pockle.world/john/extramath@master"
downloaded = "Monday June 2 12:07:20.42 PM -5 2025 AD"
commit = "84d81a19a8455bcf8dc494739e9e6d545df6ff2c"
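The hash field in the deleted lockfile has the shape of a base32-encoded digest (56 characters ending in "===="). A sketch of how such a pinned-module hash could be produced; note the digest algorithm is an assumption, since the lockfile does not name one:

```python
import base64
import hashlib

def module_hash(data: bytes) -> str:
    """Base32-encode a digest of a module's contents (hypothetical helper).

    SHA-256 is an assumption: its 32-byte digest base32-encodes to 56
    characters ending in "====", matching the lockfile's hash field.
    """
    return base64.b32encode(hashlib.sha256(data).digest()).decode()

h = module_hash(b"example module contents")
print(len(h), h[-4:])  # a 32-byte digest always yields 56 chars, padded "===="
```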
@@ -1,9 +1,20 @@
BasedOnStyle: GNU
Language: C

IndentWidth: 2
TabWidth: 2
UseTab: Never
ContinuationIndentWidth: 2 # Indents continuation lines by 2 spaces

AllowShortFunctionsOnASingleLine: true
AllowShortBlocksOnASingleLine: true
AllowShortIfStatementsOnASingleLine: true
BreakBeforeBraces: Attach
ColumnLimit: 0
BreakFunctionDefinitionParameters: false
BinPackParameters: false
BinPackArguments: false

# --- Fix the "static T\nname(...)" style ---
AlwaysBreakAfterDefinitionReturnType: None
BreakAfterReturnType: None
1  .gitattributes  vendored  Normal file
@@ -0,0 +1 @@
*.mach binary merge=ours
30  .github/docker/Dockerfile.alpine  vendored
@@ -1,30 +0,0 @@
# Dockerfile.alpine
FROM alpine:edge

# Enable the edge and edge/community repositories.
# If you already have those in your base image, you might not need these echo lines.
RUN echo "https://dl-cdn.alpinelinux.org/alpine/edge/main" >> /etc/apk/repositories && \
    echo "https://dl-cdn.alpinelinux.org/alpine/edge/community" >> /etc/apk/repositories

# Update indexes and install packages
RUN apk update && \
    apk add --no-cache \
      build-base \
      binutils \
      mold \
      meson \
      cmake \
      ninja \
      git \
      pkgconf \
      ccache \
      nodejs \
      npm \
      zip \
      alsa-lib-dev \
      pulseaudio-dev \
      libudev-zero-dev \
      wayland-dev \
      wayland-protocols \
      mesa-dev \
      sdl3
32  .github/docker/Dockerfile.linux  vendored
@@ -1,32 +0,0 @@
FROM ubuntu:plucky

RUN apt-get update && apt-get install -y --no-install-recommends \
      python3 python3-pip \
      libasound2-dev \
      libpulse-dev \
      libudev-dev \
      libwayland-dev \
      wayland-protocols \
      libxkbcommon-dev \
      libx11-dev \
      libxext-dev \
      libxrandr-dev \
      libxcursor-dev \
      libxi-dev \
      libxinerama-dev \
      libxss-dev \
      libegl1-mesa-dev \
      libgl1-mesa-dev \
      cmake \
      ninja-build \
      git \
      build-essential \
      binutils \
      mold \
      pkg-config \
      meson \
      ccache \
      mingw-w64 \
      wine \
      npm nodejs zip && \
    rm -rf /var/lib/apt/lists/*
15  .github/docker/Dockerfile.mingw  vendored
@@ -1,15 +0,0 @@
FROM ubuntu:plucky

RUN apt-get update && \
    apt-get install -y --no-install-recommends \
      mingw-w64 \
      cmake \
      ninja-build \
      git \
      build-essential \
      binutils \
      pkg-config \
      zip \
      ccache \
      npm nodejs && \
    rm -rf /var/lib/apt/lists/*
304  .github/workflows/build.yml  vendored
@@ -1,304 +0,0 @@
name: Build and Deploy

on:
  push:
    branches: [ "*" ]
    tags: [ "v*" ]
  pull_request:

jobs:
  # ──────────────────────────────────────────────────────────────
  # LINUX BUILD
  # ──────────────────────────────────────────────────────────────
  build-linux:
    runs-on: ubuntu-latest
    container:
      image: gitea.pockle.world/john/prosperon/linux:latest

    steps:
      - name: Check Out Code
        uses: actions/checkout@v4
        with: { fetch-depth: 0 }

      - name: Build Prosperon (Linux)
        run: |
          meson setup build -Dbuildtype=release -Db_lto=true -Db_lto_mode=thin -Db_ndebug=true
          meson compile -C build

      - name: Test Prosperon (Linux)
        env: { TRACY_NO_INVARIANT_CHECK: 1 }
        run: |
          meson test --print-errorlogs -C build

      - name: Upload Test Log (Linux)
        if: ${{ always() }}
        uses: actions/upload-artifact@v3
        with:
          name: testlog-linux
          path: build/meson-logs/testlog.txt

      - name: Upload Artifact (Linux)
        if: startsWith(github.ref, 'refs/tags/v')
        uses: actions/upload-artifact@v3
        with:
          name: prosperon-artifacts-linux
          path: build/prosperon

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Log in to Gitea Registry
        uses: docker/login-action@v3
        with:
          registry: gitea.pockle.world
          username: ${{ secrets.USER_GITEA }}
          password: ${{ secrets.TOKEN_GITEA }}

      - name: Determine Docker Tag
        id: docker_tag
        run: |
          if [[ "${{ github.ref }}" =~ ^refs/tags/v.* ]]; then
            TAG=$(echo "${{ github.ref }}" | sed 's#refs/tags/##')
            echo "tag=$TAG" >> $GITHUB_OUTPUT
          else
            echo "tag=latest" >> $GITHUB_OUTPUT
          fi

      - name: Build and Push Docker Image
        uses: docker/build-push-action@v6
        with:
          context: .
          file: ./Dockerfile
          push: true
          tags: gitea.pockle.world/john/prosperon:${{ steps.docker_tag.outputs.tag }}
          platforms: linux/amd64

  # ──────────────────────────────────────────────────────────────
  # WINDOWS BUILD (MSYS2 / CLANG64)
  # ──────────────────────────────────────────────────────────────
  build-windows:
    runs-on: win-native
    strategy:
      matrix: { msystem: [ CLANG64 ] }

    steps:
      - name: Check Out Code
        uses: actions/checkout@v4

      - name: Setup MSYS2
        uses: msys2/setup-msys2@v2
        with:
          msystem: ${{ matrix.msystem }}
          update: true
          cache: true
          install: |
            git zip gzip tar base-devel
          pacboy: |
            meson
            cmake
            toolchain

      - name: Build Prosperon (Windows)
        shell: msys2 {0}
        run: |
          meson setup build -Dbuildtype=release -Db_lto=true -Db_lto_mode=thin -Db_ndebug=true -Dtracy:only_localhost=true -Dtracy:no_broadcast=true
          meson compile -C build

      - name: Test Prosperon (Windows)
        shell: msys2 {0}
        env:
          TRACY_NO_INVARIANT_CHECK: 1
        run: |
          meson test --print-errorlogs -C build

      - name: Upload Test Log (Windows)
        if: ${{ always() }}
        uses: actions/upload-artifact@v3
        with:
          name: testlog-windows
          path: build/meson-logs/testlog.txt

      - name: Upload Artifact (Windows)
        if: startsWith(github.ref, 'refs/tags/v')
        uses: actions/upload-artifact@v3
        with:
          name: prosperon-artifacts-windows
          path: build/prosperon.exe

  # ──────────────────────────────────────────────────────────────
  # MACOS BUILD
  # ──────────────────────────────────────────────────────────────
  build-macos:
    runs-on: macos-latest

    steps:
      - name: Check Out Code
        uses: actions/checkout@v4
        with: { fetch-depth: 0 }

      - name: Build Prosperon (macOS)
        run: |
          meson setup build -Dbuildtype=release -Db_lto=true -Db_lto_mode=thin -Db_ndebug=true
          meson compile -C build

      - name: Test Prosperon (macOS)
        run: |
          meson test --print-errorlogs -C build

      - name: Upload Test Log (macOS)
        if: ${{ always() }}
        uses: actions/upload-artifact@v3
        with:
          name: testlog-macos
          path: build/meson-logs/testlog.txt

      - name: Upload Artifact (macOS)
        if: startsWith(github.ref, 'refs/tags/v')
        uses: actions/upload-artifact@v3
        with:
          name: prosperon-artifacts-macos
          path: build/prosperon

  # ──────────────────────────────────────────────────────────────
  # PACKAGE CROSS-PLATFORM DIST
  # ──────────────────────────────────────────────────────────────
  package-dist:
    needs: [ build-linux, build-windows, build-macos ]
    if: startsWith(github.ref, 'refs/tags/v')
    runs-on: ubuntu-latest

    steps:
      - name: Check Out Code
        uses: actions/checkout@v3
        with: { fetch-depth: 0 }

      - name: Get Latest Tag
        id: get_tag
        run: |
          TAG=$(git describe --tags --abbrev=0)
          echo "tag=$TAG" >> $GITHUB_OUTPUT

      - name: Download Linux Artifacts
        uses: actions/download-artifact@v3
        with:
          name: prosperon-artifacts-linux
          path: linux_artifacts

      - name: Download Windows Artifacts
        uses: actions/download-artifact@v3
        with:
          name: prosperon-artifacts-windows
          path: windows_artifacts

      - name: Download macOS Artifacts
        uses: actions/download-artifact@v3
        with:
          name: prosperon-artifacts-macos
          path: mac_artifacts

      - name: Create Dist Folder
        run: |
          mkdir -p dist/linux dist/win dist/mac
          cp README.md dist/
          cp license.txt dist/
          cp -r examples dist/
          cp linux_artifacts/* dist/linux/
          cp windows_artifacts/* dist/win/
          cp mac_artifacts/* dist/mac/

      - name: Package Final Dist
        run: |
          TAG=${{ steps.get_tag.outputs.tag }}
          zip -r "prosperon-${TAG}.zip" dist
          echo "Created prosperon-${TAG}.zip"

      - name: Upload Final Dist
        uses: actions/upload-artifact@v3
        with:
          name: "prosperon-${{ steps.get_tag.outputs.tag }}"
          path: "prosperon-${{ steps.get_tag.outputs.tag }}.zip"

  # ──────────────────────────────────────────────────────────────
  # DEPLOY TO ITCH.IO (single ZIP containing all OSes)
  # ──────────────────────────────────────────────────────────────
  deploy-itch:
    needs: [ package-dist ]
    runs-on: ubuntu-latest

    steps:
      - name: Check Out Code
        uses: actions/checkout@v3
        with: { fetch-depth: 0 }

      - name: Get Latest Tag
        id: get_tag
        run: |
          TAG=$(git describe --tags --abbrev=0)
          echo "tag=$TAG" >> $GITHUB_OUTPUT

      - name: Download Final Distribution
        uses: actions/download-artifact@v3
        with:
          name: "prosperon-${{ steps.get_tag.outputs.tag }}"
          path: dist

      - name: Set up Butler
        uses: jdno/setup-butler@v1

      - name: Push to itch.io
        run: |
          butler push "dist/prosperon-${{ steps.get_tag.outputs.tag }}.zip" \
            ${{ secrets.ITCHIO_USERNAME }}/prosperon:universal \
            --userversion ${{ steps.get_tag.outputs.tag }}
        env:
          BUTLER_API_KEY: ${{ secrets.ITCHIO_API_KEY }}

  # ──────────────────────────────────────────────────────────────
  # DEPLOY TO SELF-HOSTED GITEA
  # ──────────────────────────────────────────────────────────────
  deploy-gitea:
    needs: [ package-dist ]
    runs-on: ubuntu-latest

    steps:
      - name: Check Out Code
        uses: actions/checkout@v3
        with: { fetch-depth: 0 }

      - name: Get Latest Tag & Commit Message
        id: get_tag
        run: |
          TAG=$(git describe --tags --abbrev=0)
          COMMIT_MSG=$(git log -1 --pretty=%B "$TAG")
          echo "tag=$TAG" >> $GITHUB_OUTPUT
          echo "commit_msg=$COMMIT_MSG" >> $GITHUB_OUTPUT

      - name: Download Final Distribution
        uses: actions/download-artifact@v3
        with:
          name: "prosperon-${{ steps.get_tag.outputs.tag }}"
          path: dist

      - name: Create / Update Gitea Release
        run: |
          TAG=${{ steps.get_tag.outputs.tag }}
          ZIP=dist/prosperon-${TAG}.zip
          BODY=$(echo "${{ steps.get_tag.outputs.commit_msg }}" | jq -R -s '.')
          RELEASE=$(curl -s -H "Authorization: token ${{ secrets.TOKEN_GITEA }}" \
            "https://gitea.pockle.world/api/v1/repos/john/prosperon/releases/tags/$TAG" | jq -r '.id')

          if [ "$RELEASE" = "null" ] || [ -z "$RELEASE" ]; then
            RELEASE=$(curl -X POST \
              -H "Authorization: token ${{ secrets.TOKEN_GITEA }}" \
              -H "Content-Type: application/json" \
              -d "{\"tag_name\":\"$TAG\",\"target_commitish\":\"${{ github.sha }}\",\"name\":\"$TAG\",\"body\":$BODY,\"draft\":false,\"prerelease\":false}" \
              "https://gitea.pockle.world/api/v1/repos/john/prosperon/releases" | jq -r '.id')
          fi

          curl -X POST \
            -H "Authorization: token ${{ secrets.TOKEN_GITEA }}" \
            -H "Content-Type: application/octet-stream" \
            --data-binary @"$ZIP" \
            "https://gitea.pockle.world/api/v1/repos/john/prosperon/releases/$RELEASE/assets?name=prosperon-${TAG}.zip"
        env:
          TOKEN_GITEA: ${{ secrets.TOKEN_GITEA }}
15 .gitignore (vendored)
@@ -1,5 +1,12 @@
.git/
.obj/
website/public/
website/site/
website/.hugo_build.lock
.cache
.cell
cell
libcell_runtime*
bin/
build/
*.zip
@@ -14,6 +21,7 @@ build/
source/shaders/*.h
.DS_Store
*.html
!website/themes/**/*.html
.vscode
*.icns
icon.ico
@@ -21,3 +29,10 @@ steam/
subprojects/*/
build_dbg/
modules/
sdk/
artifacts/
discord_social_sdk/
discord_partner_sdk/
steam_api64.dll
subprojects/.wraplock
.gemini
27 AGENTS.md
@@ -1,27 +0,0 @@
# AGENTS.md

## Project Overview
This is a game engine developed using a QuickJS fork as its scripting language. It is an actor-based system modeled on Douglas Crockford's Misty. It is a Meson-compiled project with a number of dependencies.

## File Structure
- `source/`: Contains the C source code
- `scripts/`: Contains script code that is loaded on executable start, and modules
- `shaders/`: Contains shaders that ship with the engine (for shader-based backends)
- `benchmarks/`: Benchmark programs for testing speed
- `tests/`: Unit tests
- `examples/`: Contains full game examples

## Coding Practices
- Use K&R style C
- Use as little whitespace as possible
- JavaScript style prefers objects and prototypical inheritance over ES6 classes, liberal use of closures, and `var` everywhere

## Instructions
- When generating code, adhere to the coding practices outlined above.
- When adding new features, ensure they align with the project's goals.
- When fixing bugs, review the code carefully before making changes.
- When writing unit tests, cover all important scenarios.

## Compiling, running, and testing
- To compile the code, run `make`, which generates a prosperon executable in `build_dbg/`; copy it into the root folder
- Run a test by passing it as the command: `./prosperon tests/overling.js` runs the test overling.js, and `./prosperon tests/nota.js` runs the nota benchmark
584 CLAUDE.md
@@ -1,405 +1,237 @@
# CLAUDE.md
# ƿit (pit) Language Project

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Building

## Build Commands
Build (or rebuild after changes): `make`
Install to system: `make install`
Run `cell --help` to see all CLI flags.

### Build variants
- `make` - Make and install debug version. Usually all that's needed.
- `make fast` - Build optimized version
- `make release` - Build release version with LTO and optimizations
- `make small` - Build minimal size version
- `make web` - Build for web/emscripten platform
- `make crosswin` - Cross-compile for Windows using mingw32
## Code Style

### Testing
After installing with `make`, just run `cell` and point it at the actor you want to launch. `cell tests/toml` runs the actor `tests/toml.js`.
All code uses 2 spaces for indentation. K&R style for C and JavaScript.

## Scripting language
This is called "cell"; it is a variant of JavaScript and extremely similar to it.
## ƿit Script Quick Reference

### Common development commands
- `meson setup build_<variant>` - Configure build directory
- `meson compile -C build_<variant>` - Compile in build directory
- `./build_dbg/prosperon examples/<example>` - Run example from build directory
- Copy prosperon to game directory and run: `cp build_dbg/prosperon <game-dir>/ && cd <game-dir> && ./prosperon`
ƿit script files: `.ce` (actors) and `.cm` (modules). The syntax is similar to JavaScript with important differences listed below.

## Architecture Overview
### Key Differences from JavaScript

Prosperon is an actor-based game engine inspired by Douglas Crockford's Misty system. Key architectural principles:
- `var` (mutable) and `def` (constant) — no `let` or `const`
- `==` and `!=` are strict (no `===` or `!==`)
- No `undefined` — only `null`
- No classes — only objects and prototypes (`meme()`, `proto()`, `isa()`)
- No `switch`/`case` — use record dispatch (a record keyed by case, values are functions or results) instead of if/else chains
- No `for...in`, `for...of`, spread (`...`), rest params, or default params
- Functions have a maximum of 4 parameters — use a record for more
- Variables must be declared at function body level only (not in if/while/for/blocks)
- All variables must be initialized at declaration (`var x` alone is an error; use `var x = null`)
- No `try`/`catch`/`throw` — use `disrupt`/`disruption`
- No arraybuffers — only `blob` (works with bits; must `stone(blob)` before reading)
- Identifiers can contain `?` and `!` (e.g., `nil?`, `set!`, `is?valid`)
- Prefer backticks for string interpolation; otherwise use `text()` to convert non-strings
- Everything should be lowercase
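
Record dispatch in place of `switch` can be sketched like this (a minimal illustration in the dialect described above; the message shape and handler bodies are hypothetical):

```javascript
// dispatch record: keys are the cases, values are functions
def handle = {
  greet: function (msg) { return "hello" },
  move: function (msg) { return msg.to },
  quit: function (msg) { return null }
}

var on_message = function (msg) {
  def fn = handle[msg.type]
  if (fn == null) {
    return null
  }
  return fn(msg)
}
```

An unknown `msg.type` simply misses the record and falls through to the `null` branch, which replaces the `default:` case of a `switch`.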

### Intrinsic Functions (always available, no `use()` needed)

The creator functions are **polymorphic** — behavior depends on argument types:

- `array(number)` — create array of size N filled with null
- `array(number, value_or_fn)` — create array with initial values
- `array(array)` — copy array
- `array(array, fn)` — map
- `array(array, array)` — concatenate
- `array(array, from, to)` — slice
- `array(record)` — get keys as array of text
- **`array(text)` — split text into individual characters** (e.g., `array("hello")` → `["h","e","l","l","o"]`)
- `array(text, separator)` — split by separator
- `array(text, length)` — split into chunks of length

- `text(array, separator)` — join array into text
- `text(number)` or `text(number, radix)` — number to text
- `text(text, from, to)` — substring

- `number(text)` or `number(text, radix)` — parse text to number
- `number(logical)` — boolean to number

- `record(record)` — copy
- `record(record, another)` — merge
- `record(array_of_keys)` — create record from keys
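
A few illustrative calls (a sketch; `array("hello")`'s result is given above, the others are inferred from the descriptions):

```javascript
var chars = array("hello")         // ["h","e","l","l","o"]
var parts = array("a,b,c", ",")    // split by separator
var zeros = array(3, 0)            // array of size 3 with initial value 0
var keys = array({x: 1, y: 2})     // keys as array of text
var joined = text(["a", "b"], "-") // join: "a-b"
var n = number("ff", 16)           // parse with radix 16
```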

Other key intrinsics: `object()`, `length()`, `stone()`, `is_stone()`, `filter()`, `find()`, `reduce()`, `sort()`, `reverse()`, `some()`, `every()`, `starts_with()`, `ends_with()`, `meme()`, `proto()`, `isa()`, `splat()`, `apply()`, `extract()`, `replace()`, `search()`, `format()`, `lower()`, `upper()`, `trim()`

Sensory functions: `is_array()`, `is_text()`, `is_number()`, `is_object()`, `is_function()`, `is_null()`, `is_logical()`, `is_integer()`, `is_stone()`, etc.

### Standard Library (loaded with `use()`)

- `blob` — binary data (bits, not bytes)
- `time` — time constants and conversions
- `math` — trig, logarithms, roots (`math/radians`, `math/turns`)
- `json` — JSON encoding/decoding
- `random` — random number generation

### Actor Model
- Each actor runs on its own thread
- Communication only through message passing (no shared JavaScript objects)
- Hierarchical actor system with spawning/killing
- Actor lifecycle: awake, update, draw, garbage collection

### JavaScript Style Guide
- Use the `use()` function for imports (Misty-style, not ES6 import/export)
- Prefer closures and JavaScript objects and prototypes over ES6-style classes
- Follow existing JavaScript patterns in the codebase
- Functions as first-class citizens
- Do not use const or let; only var
- `.ce` files are actors (independent execution units, don't return values)
- `.cm` files are modules (return a value, cached and frozen)
- Actors never share memory; communicate via `$send()` message passing
- Actor intrinsics start with `$`: `$me`, `$stop()`, `$send()`, `$start()`, `$delay()`, `$receiver()`, `$clock()`, `$portal()`, `$contact()`, `$couple()`, `$unneeded()`, `$connection()`, `$time_limit()`

### Core Systems
1. **Actor System** (scripts/core/engine.js)
   - Message passing via `send()`, `$_.receive()`
   - Actor spawning/management
   - Register-based component system (update, draw, gui, etc.)

2. **Module System**
   - `use()` function for loading modules
   - Module paths: `scripts/modules/`, `scripts/modules/ext/`
   - Custom QuickJS build with embedded C modules

3. **Build System**
   - Meson build configuration (the Makefile is a convenience wrapper)
   - Multiple platform targets (Windows, macOS, Linux, Web)
   - Custom QuickJS build in `subprojects/`
   - Uses SDL3 for cross-platform support

### Requestors (async composition)

`sequence()`, `parallel()`, `race()`, `fallback()` — compose asynchronous operations. See docs/requestors.md.
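
A hedged sketch of composing requestors, assuming Crockford-style signatures (factories take an array of requestors and return a requestor that is invoked with a callback; the `fetch_map`, `fetch_players`, and `start_game` names are hypothetical — see docs/requestors.md for the real API):

```javascript
// run two fetches in parallel, then process both results in sequence
var job = sequence([
  parallel([fetch_map, fetch_players]),
  process_world
])
job(function (value, reason) {
  if (value == null) {
    log.console(reason) // composition failed
    return null
  }
  return start_game(value)
})
```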

### Engine Entry Points
- `source/prosperon.c` - Main C entry point
- `scripts/core/engine.js` - JavaScript engine initialization for the system
- `scripts/core/base.js` has modifications to this JavaScript runtime (for example, additions to the base Array, String, etc.)

### Subprojects
- The C code has many subprojects, whose source and sometimes documentation can be found in `subprojects/`. `subprojects/quickjs/doc` has documentation for QuickJS.

### Resource System
- Scripts are bundled into `core.zip` during build
- Runtime module loading via PhysFS
- Resource paths checked in order: `/`, `scripts/modules/`, `scripts/modules/ext/`

### Notable Dependencies
- QuickJS (custom build) - JavaScript runtime
- SDL3 - Platform abstraction
- Chipmunk2D - Physics
- ENet - Networking
- Soloud - Audio
- Tracy - Profiling (when enabled)

## Development Tips

### Running Games
```bash
# Build first
make debug

# Run example from build directory
./build_dbg/prosperon examples/chess

# Or copy to game directory
cp build_dbg/prosperon examples/chess/
cd examples/chess
./prosperon
```

### Documentation
- Documentation is found in `docs/`
- Documentation for the JS modules loaded with `use` is in `docs/api/modules`
- The .md files directly in `docs/` give a high-level overview
- `docs/dull` describes what this specific JavaScript system is (including alterations from quickjs/es6)
### Shader Development
- Shaders are in the `shaders/` directory as HLSL
- Compile script: `shaders/compile.sh`
- Outputs to platform-specific formats: `dxil/`, `msl/`, `spv/`

### Example Games
Located in the `examples/` directory:
- `chess` - Chess implementation (has its own Makefile)
- `pong` - Classic pong game
- `snake` - Snake game
- `tetris` - Tetris clone
- `bunnymark` - Performance test

### Testing
```bash
# Run all tests
meson test -C build_dbg

# Run specific test
./build_dbg/prosperon tests/spawn_actor.js
```

### Debugging
- Use debug build: `make debug`
- Tracy profiler support when enabled
- Console logging available via `log.console()`, `log.error()`, etc.
- Log files written to `.prosperon/log.txt`

# Project Structure Notes

## Core JavaScript Modules

- JavaScript modules are defined using the MISTUSE macro in jsffi.c
- The `js_os_funcs`, `js_io_funcs`, etc. arrays define the available functions for each module
- New functions are added with MIST_FUNC_DEF(module, function, args_count)

## File I/O

- `io.slurp(path)` - Reads a file as text
- `io.slurpbytes(path)` - Reads a file as an ArrayBuffer
- `io.slurpwrite(path, data)` - Writes data (string or ArrayBuffer) to a file
- `io.exists(path)` - Checks if a file exists

## Script Loading

- The `use(path)` function in engine.js loads JavaScript modules
- Script loading happens in prosperon.c and the engine.js script
- jsffi.c contains the C hooks for the QuickJS JavaScript engine
- Added functionality for bytecode compilation and loading:
  - `os.compile_bytecode(source, filename)` - Compiles JS to bytecode, returns ArrayBuffer
  - `os.eval_bytecode(bytecode)` - Evaluates bytecode from an ArrayBuffer
  - `compile(scriptPath)` - Compiles a JS file to a .jso bytecode file
- Modified `use()` to check for .jso files before loading .js files

## QuickJS Bytecode API

- `JS_Eval` with JS_EVAL_FLAG_COMPILE_ONLY - Compiles without executing
- `JS_WriteObject` with JS_WRITE_OBJ_BYTECODE - Serializes to bytecode
- `JS_ReadObject` with JS_READ_OBJ_BYTECODE - Deserializes and loads bytecode
- Bytecode files use the .jso extension alongside .js files

## Available JavaScript APIs

### Core APIs
- `actor` - Base prototype for all actor objects
- `$_` - Special global for actor messaging
- `prosperon` - Global engine interface
- `console` - Logging and debugging interface

### Framework APIs
- `moth` - Higher-level game framework that simplifies Prosperon usage
  - Handles window creation, game loop, and event dispatching
  - Provides simple configuration via config.js
  - Auto-initializes systems like rendering and input
  - Manages camera, resolution, and FPS automatically

### Rendering
- `draw2d` - 2D drawing primitives
- `render` - Low-level rendering operations
- `graphics` - Higher-level graphics utilities
- `camera` - Camera controls and transformations
- `sprite` - Sprite rendering and management

### Physics and Math
- `math` - Mathematical utilities
- `geometry` - Geometric calculations and shapes
- `transform` - Object transformations

### Input and Events
- `input` - Mouse, keyboard, and touch handling
- `event` - Event management system

### Networking
- `enet` - Networking through the ENet library
- `http` - HTTP client capabilities

### Audio
- `sound` - Audio playback using SoLoud

### Utility Modules
- `time` - Time management and delays
  - **Must be imported with `use('time')`**
  - No `time.now()` function - use:
    - `time.number()` - Number representation of current time
    - `time.record()` - Struct representation of current time
    - `time.text()` - Text representation of current time
- `io` - File I/O operations
- `json` - JSON parsing and serialization
- `util` - General utilities
- `color` - Color manipulation
- `miniz` - Compression utilities
- `nota` - Structured data format
- `wota` - Serialization format
- `qr` - QR code generation/reading
- `tween` - Animation tweening
- `spline` - Spline calculations
- `imgui` - Immediate mode GUI
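
For example, the `time` accessors listed above might be used like this (a minimal sketch; it assumes `use('time')` returns the module value, and the exact fields of the record form are not specified here):

```javascript
var time = use('time');
var stamp = time.number();  // numeric current time
var parts = time.record();  // struct representation
log.console(time.text());   // textual current time
```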

## Game Development Patterns

### Project Structure
- Game config is typically in `config.js`
- Main entry point is `main.js`
- Resource loading through `resources.js`

### Actor Pattern Usage
- Create actors with `actor.spawn(script, config)`
- Start actors with `$_.start(callback, script)` - the system automatically sends a greeting; the callback receives `{type: 'greet', actor: actor_ref}`
- No need to manually send greetings - `$_.start` handles this automatically
- Manage actor hierarchy with overlings and underlings
- Schedule actor tasks with the `$_.delay()` method
- Clean up with `kill()` and `garbage()`

### Actor Messaging with Callbacks
When sending a message with a callback, respond by sending to the message itself:
```javascript
// Sender side:
send(actor, {type: 'status'}, response => {
  log.console(response); // Handle the response
});

// Receiver side:
$_.receiver(msg => {
  if (msg.type === 'status') {
    send(msg, {status: 'ok'}); // Send response to the message itself
  }
});
```

**Critical Rules for Message Callbacks**:
- **A message can only be used ONCE as a send target** - after sending a response to a message, it cannot be used again
- If you need to send multiple updates (like progress), only the download request message should be used for the final response
- Status requests should each get their own individual response
- Actor objects and message headers are completely opaque - never try to access internal properties
- Never access `msg.__HEADER__` or similar - the actor system handles routing internally
- Use `$_.delay()` to schedule work and avoid blocking the message receiver

### Game Loop Registration
- Register functions like `update`, `draw`, `gui`, etc.
- Set the function.layer property to control execution order
- Use the `Register` system to manage callbacks

### Program vs Module Pattern
- Programs are actor scripts that don't return values; they execute top-to-bottom
- Modules are files that return single values (usually objects) that get frozen
- Programs can spawn other programs as underlings
- Programs have lifecycle hooks: awake, update, draw, garbage, etc.

## Technical Capabilities

### Graphics Pipeline
- Supports multiple render backends (Direct3D, Metal, Vulkan via SDL3)
- Custom shader system with cross-platform compilation
- Sprite batching for efficient 2D rendering
- Camera systems for both 2D and 3D

### Asset Support
- Images: PNG, JPG, QOI, etc.
- Audio: Various formats through SoLoud
- Models: Basic 3D model support
- Custom formats: Aseprite animations, etc.

### Developer Tools
- Built-in documentation system with `cell.DOC`
- Tracy profiler integration for performance monitoring
- Imgui debugging tools
- Console logging with various severity levels

## Misty Networking Patterns

Prosperon implements the Misty actor networking model. Understanding these patterns is critical for building distributed applications.

### Portal Reply Pattern
Portals must reply with an actor object, not application data:
```javascript
// CORRECT: Portal replies with actor
$_.portal(e => {
  send(e, $_); // Reply with server actor
}, 5678);

// WRONG: Portal sends application data
$_.portal(e => {
  send(e, {type: 'game_start'}); // This breaks the pattern
}, 5678);
```

### Two-Phase Connection Protocol
Proper Misty networking follows a two-phase pattern:

**Phase 1: Actor Connection**
- Client contacts portal using `$_.contact()`
- Portal replies with an actor object
- This establishes the communication channel

**Phase 2: Application Communication**
- Client sends application messages to the received actor
- Normal bidirectional messaging begins
- Application logic handles game/service initialization

### Message Handling Best Practices
Messages should be treated as opaque objects with your application data:

```javascript
// CORRECT: Store actor references separately
var players = {};
$_.receiver(msg => {
  if (msg.type === 'join_game' && msg.player_id) {
    // Store the message for later response
    players[msg.player_id] = msg;
    // Later, respond to the stored message
    send(players[msg.player_id], {type: 'game_start'});
  }
});

// WRONG: Trying to access internal message properties
$_.receiver(msg => {
  var sender = msg.__HEADER__.replycc; // Never do this!
});
```

### Error Handling

```javascript
var fn = function() {
  disrupt // bare keyword, no value
} disruption {
  // handle error; can re-raise with disrupt
}
```

### Return ID Lifecycle
- Each reply callback gets a unique return ID
- Return IDs are consumed once and then deleted
- Reusing message objects with return headers causes "Could not find return function" errors
- Always create clean actor references for ongoing communication

### Actor Object Transparency
Actor objects must be completely opaque black boxes that work identically regardless of transport:

```javascript
// Actor objects work transparently for:
// - Same-process communication (fastest - uses mailbox)
// - Inter-process communication (uses mailbox)
// - Network communication (uses ENet)

// The actor shouldn't know or care about the transport mechanism
send(opponent, {type: 'move', from: [0,0], to: [1,1]});
```

### Push/Pop Syntax

```javascript
var a = [1, 2]
a[] = 3     // push: [1, 2, 3]
var v = a[] // pop: v is 3, a is [1, 2]
```

**Key Implementation Details:**
- `actor_send()` in `scripts/core/engine.js` handles routing based on available actor data
- Actor objects sent in message data automatically get address/port populated when received over the network
- Three communication pathways: `os.mailbox_exist()` check → mailbox send → network send
- Actor objects must contain all necessary routing information for transparent messaging

### Common Networking Bugs
1. **Portal sending application data**: Portal should only establish actor connections
2. **Return ID collision**: Reusing messages with return headers for multiple sends
3. **Mixed phases**: Trying to do application logic during connection establishment
4. **Header pollution**: Using received message objects as actor references
5. **Missing actor address info**: Actor objects in message data need network address population (fixed in engine.js:746-766)

## C Integration
- Declare everything `static` that can be
- Most files don't have headers; files in a package are not shared between packages
- No undefined in the C API: use `JS_IsNull` and `JS_NULL` only
- A C file with the correct macros (`CELL_USE_FUNCS` etc.) is loaded as a module by its name (e.g., `png.c` in a package → `use('<package>/png')`)
- C symbol naming: `js_<pkg>_<file>_use` (e.g., `js_core_math_radians_use` for `core/math/radians`)
- Core is the `core` package — its symbols follow the same `js_core_<name>_use` pattern as all other packages
- Package directories should contain only source files (no `.mach`/`.mcode` alongside source)
- Build cache files in `build/` are bare hashes (no extensions)

### Example: Correct Chess Networking
```javascript
// Server: Portal setup
$_.portal(e => {
  send(e, $_); // Just reply with actor
}, 5678);

// Client: Two-phase connection
$_.contact((actor, reason) => {
  if (actor) {
    opponent = actor;
    send(opponent, {type: 'join_game'}); // Phase 2: app messaging
  }
}, {address: "localhost", port: 5678});

// Server: Handle application messages
$_.receiver(e => {
  if (e.type === 'join_game') {
    opponent = e.__HEADER__.replycc;
    send(opponent, {type: 'game_start', your_color: 'black'});
  }
});
```

### MANDATORY: GC Rooting for C Functions

This project uses a **copying garbage collector**. ANY JS allocation (`JS_NewObject`, `JS_NewString`, `JS_NewArray`, `JS_NewInt32`, `JS_SetPropertyStr`, `js_new_blob_stoned_copy`, etc.) can trigger GC, which **invalidates all unrooted JSValue locals**. This is not theoretical — it causes real crashes.

**Before writing or modifying ANY C function**, apply this checklist:

1. Count the number of `JS_New*`, `JS_SetProperty*`, and `js_new_blob*` calls in the function
2. If there are 2 or more, the function MUST use `JS_FRAME`/`JS_ROOT`/`JS_RETURN`
3. Every JSValue that is held across an allocating call must be rooted

**Pattern — object with properties:**
```c
JS_FRAME(js);
JS_ROOT(obj, JS_NewObject(js));
JS_SetPropertyStr(js, obj.val, "x", JS_NewInt32(js, 42));
JSValue name = JS_NewString(js, "hello");
JS_SetPropertyStr(js, obj.val, "name", name);
JS_RETURN(obj.val);
```

**Pattern — array with loop (declare the root BEFORE the loop):**
```c
JS_FRAME(js);
JS_ROOT(arr, JS_NewArray(js));
JSGCRef item = { .val = JS_NULL, .prev = NULL };
JS_PushGCRef(js, &item);
for (int i = 0; i < count; i++) {
  item.val = JS_NewObject(js);
  JS_SetPropertyStr(js, item.val, "v", JS_NewInt32(js, i));
  JS_SetPropertyNumber(js, arr.val, i, item.val);
}
JS_RETURN(arr.val);
```

**Rules:**
- Access rooted values via `.val` (e.g., `obj.val`, not `obj`)
- NEVER put `JS_ROOT` inside a loop — it pushes the same stack address twice, corrupting the GC chain
- Error returns before `JS_FRAME` use plain `return`
- Error returns after `JS_FRAME` must use `JS_RETURN_EX()` or `JS_RETURN_NULL()`

**CRITICAL — C argument evaluation order bug:**

Allocating functions (`JS_NewString`, `JS_NewFloat64`, `js_new_blob_stoned_copy`, etc.) used as arguments to `JS_SetPropertyStr` can crash because C evaluates arguments in unspecified order. The compiler may read `obj.val` BEFORE the allocating call; GC then moves the object, leaving a stale pointer.

```c
// UNSAFE — intermittent crash:
JS_SetPropertyStr(js, obj.val, "format", JS_NewString(js, "rgba32"));
JS_SetPropertyStr(js, obj.val, "pixels", js_new_blob_stoned_copy(js, data, len));

// SAFE — separate the allocation:
JSValue fmt = JS_NewString(js, "rgba32");
JS_SetPropertyStr(js, obj.val, "format", fmt);
JSValue pixels = js_new_blob_stoned_copy(js, data, len);
JS_SetPropertyStr(js, obj.val, "pixels", pixels);
```

`JS_NewInt32`, `JS_NewUint32`, and `JS_NewBool` do NOT allocate and are safe inline.

See `docs/c-modules.md` for the full GC safety reference.

## Memory Management

- When working with a conversational AI system like Claude, it's important to maintain a clean and focused memory
- Regularly review and update memories to ensure they remain relevant and helpful
- Delete or modify memories that are no longer accurate or useful
- Prioritize information that can genuinely assist in future interactions

## Project Layout

- `source/` — C source for the cell runtime and CLI
- `docs/` — master documentation (Markdown), reflected on the website
- `website/` — Hugo site; theme at `website/themes/knr/`
- `internal/` — internal ƿit scripts (engine.cm etc.)
- `packages/` — core packages
- `Makefile` — build system (`make` to rebuild, `make bootstrap` for first build)

## Package Management (Shop CLI)

**Two shops:** `cell <cmd>` uses the global shop at `~/.cell/packages/`. `cell --dev <cmd>` uses the local shop at `.cell/packages/`. Linked packages (via `cell link`) are symlinked into the shop — edit the source directory directly.

```
cell add <path>           # add a package (local path or remote)
cell remove <path>        # remove a package (cleans lock, symlink, dylibs)
cell build <path>         # build C modules for a package
cell build <path> --force # force rebuild (ignore stat cache)
cell test package <path>  # run tests for a package
cell list                 # list installed packages
cell link                 # list linked packages
```

The build step compiles C files to content-addressed dylibs in `~/.cell/build/<hash>` and writes a per-package manifest so the runtime can find them. C files in `src/` are support files linked into module dylibs, not standalone modules.

## Debugging Compiler Issues

When investigating bugs in compiled output (wrong values, missing operations, incorrect comparisons), **start from the optimizer down, not the VM up**. The compiler inspection tools will usually identify the problem faster than adding C-level tracing:

```
./cell --dev streamline --types <file> # show inferred slot types — look for wrong types
./cell --dev ir_report --events <file> # show every optimization applied and why
./cell --dev ir_report --types <file>  # show type inference results per function
./cell --dev mcode --pretty <file>     # show raw IR before optimization
./cell --dev streamline --ir <file>    # show human-readable optimized IR
```
|
||||
|
||||
**Triage order:**
1. `streamline --types` — are slot types correct? Wrong type inference causes wrong optimizations.
2. `ir_report --events` — are type checks being incorrectly eliminated? Look for `known_type_eliminates_guard` on slots that shouldn't have known types.
3. `mcode --pretty` — is the raw IR correct before optimization? If so, the bug is in streamline.
4. Only dig into `source/mach.c` if the IR looks correct at all levels.

See `docs/compiler-tools.md` for the full tool reference and `docs/spec/streamline.md` for pass details.

## Testing

After any C runtime changes, run all three test suites before considering the work done:

```
make                     # rebuild
./cell --dev vm_suite    # VM-level tests (641 tests)
./cell --dev test suite  # language-level tests (493 tests)
./cell --dev fuzz        # fuzzer (100 iterations)
```

All three must pass with 0 failures.

## Documentation

The `docs/` folder is the single source of truth. The website at `website/` mounts it via Hugo. Key files:
- `docs/language.md` — language syntax reference
- `docs/functions.md` — all built-in intrinsic functions
- `docs/actors.md` — actor model and actor intrinsics
- `docs/requestors.md` — async requestor pattern
- `docs/library/*.md` — intrinsic type reference (text, number, array, object) and standard library modules

54 Dockerfile
@@ -1,54 +0,0 @@
# Builder stage
FROM ubuntu:plucky AS builder

RUN apt-get update && apt-get install -y --no-install-recommends \
    python3 python3-pip \
    libasound2-dev \
    libpulse-dev \
    libudev-dev \
    libwayland-dev \
    wayland-protocols \
    libxkbcommon-dev \
    libx11-dev \
    libxext-dev \
    libxrandr-dev \
    libxcursor-dev \
    libxi-dev \
    libxinerama-dev \
    libxss-dev \
    libegl1-mesa-dev \
    libgl1-mesa-dev \
    cmake \
    ninja-build \
    git \
    build-essential \
    binutils \
    pkg-config \
    meson \
    zip && \
    rm -rf /var/lib/apt/lists/*

WORKDIR /app
RUN git clone https://gitea.pockle.world/john/prosperon.git
WORKDIR /app/prosperon
RUN git checkout jsffi_refactor
RUN meson setup build -Dbuildtype=release -Db_lto=true -Db_lto_mode=thin -Db_ndebug=true
RUN meson compile -C build

# Runtime stage
FROM ubuntu:latest

# Install minimal runtime dependencies (e.g., for dynamically linked libraries)
RUN apt-get update && apt-get install -y libstdc++6 && rm -rf /var/lib/apt/lists/*

# Copy the compiled prosperon binary from the build stage
COPY --from=builder /app/prosperon/build/prosperon /usr/local/bin/prosperon

# Create an entrypoint script
RUN echo '#!/bin/bash' > /entrypoint.sh && \
    echo '/usr/local/bin/prosperon "$@" &' >> /entrypoint.sh && \
    echo 'tail -f /dev/null' >> /entrypoint.sh && \
    chmod +x /entrypoint.sh

WORKDIR /workdir
ENTRYPOINT ["/entrypoint.sh"]
16 Info.plist
@@ -1,16 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>CFBundleExecutable</key>
    <string>Prosperon</string>
    <key>CFBundleIdentifier</key>
    <string>pockle.world.prosperon</string>
    <key>CFBundleName</key>
    <string>Prosperon</string>
    <key>CFBundleVersion</key>
    <string>0.5</string>
    <key>NSHumanReadableCopyright</key>
    <string>Copyright © 2024 Pockle World. All rights reserved.</string>
</dict>
</plist>
26 LICENSE
@@ -1,26 +0,0 @@
Prosperon Game Engine

Copyright (c) 2019-2024 John Alanbrook

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

(1) The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

(2) Any games or other derivative software must display the "Prosperon" logo
at near the beginning of the software's startup, before the chief purpose
of the software is underway.


THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
64 Makefile
@@ -1,29 +1,47 @@
debug: FORCE
	meson setup build_dbg -Dbuildtype=debug
	meson install --only-changed -C build_dbg

fast: FORCE
	meson setup build_fast
	meson install -C build_fast
BUILD = build
BUILD_DBG = build_debug
INSTALL_BIN = /opt/homebrew/bin
INSTALL_LIB = /opt/homebrew/lib
INSTALL_INC = /opt/homebrew/include
CELL_SHOP = $(HOME)/.cell

release: FORCE
	meson setup -Dbuildtype=release -Db_lto=true -Db_lto_mode=thin -Db_ndebug=true build_release
	meson install -C build_release
all: $(BUILD)/build.ninja
	meson compile -C $(BUILD)
	cp $(BUILD)/libcell_runtime.dylib .
	cp $(BUILD)/cell .

sanitize: FORCE
	meson setup -Db_sanitize=address -Db_sanitize=memory -Db_sanitize=leak -Db_sanitize=undefined build_sani
	meson install -C build_sani
$(BUILD)/build.ninja:
	meson setup $(BUILD) -Dbuildtype=release

small: FORCE
	meson setup -Dbuildtype=minsize -Db_lto=true -Db_ndebug=true build_small
	meson install -C build_small
debug: $(BUILD_DBG)/build.ninja
	meson compile -C $(BUILD_DBG)
	cp $(BUILD_DBG)/libcell_runtime.dylib .
	cp $(BUILD_DBG)/cell .

web: FORCE
	meson setup -Deditor=false -Dbuildtype=minsize -Db_lto=true -Db_ndebug=true --cross-file emscripten.cross build_web
	meson compile -C build_web
$(BUILD_DBG)/build.ninja:
	meson setup $(BUILD_DBG) -Dbuildtype=debug -Db_sanitize=address

crosswin: FORCE
	meson setup -Dbuildtype=debugoptimized --cross-file mingw32.cross build_win
	meson compile -C build_win
install: all $(CELL_SHOP)
	cp cell $(INSTALL_BIN)/cell
	cp libcell_runtime.dylib $(INSTALL_LIB)/
	cp source/cell.h $(INSTALL_INC)/
	rm -rf $(CELL_SHOP)/packages/core
	ln -s $(CURDIR) $(CELL_SHOP)/packages/core
	@echo "Installed cell to $(INSTALL_BIN) and $(INSTALL_LIB)"

FORCE:
install_debug: debug $(CELL_SHOP)
	cp cell $(INSTALL_BIN)/cell
	cp libcell_runtime.dylib $(INSTALL_LIB)/
	cp source/cell.h $(INSTALL_INC)/
	rm -rf $(CELL_SHOP)/packages/core
	ln -s $(CURDIR) $(CELL_SHOP)/packages/core
	@echo "Installed cell (debug+asan) to $(INSTALL_BIN) and $(INSTALL_LIB)"

$(CELL_SHOP):
	mkdir -p $(CELL_SHOP)/packages $(CELL_SHOP)/cache $(CELL_SHOP)/build

clean:
	rm -rf $(BUILD) $(BUILD_DBG)
	rm -f cell libcell_runtime.dylib

.PHONY: all install debug install_debug clean

@@ -1,7 +1 @@
Thank you for using Prosperon!

Provided are prosperon builds for all available platforms. Simply run prosperon for your platform in a game folder to play!

To get started, take a dive into the provided example games in the examples folder. You can either copy the prosperon executable into an example directory and run it there, or run `prosperon path/to/example` from the project root.

You can take a look through the docs folder for the prosperon manual to learn all about it. The manual is available on the web at [docs.prosperon.dev](https://docs.prosperon.dev).
Read the docs to get started.

135 add.ce Normal file
@@ -0,0 +1,135 @@
// cell add <locator> [alias] - Add a dependency to the current package
//
// Usage:
//   cell add <locator>          Add a dependency using default alias
//   cell add <locator> <alias>  Add a dependency with custom alias
//   cell add -r <directory>     Recursively find and add all packages in directory
//
// This adds the dependency to cell.toml and installs it to the shop.

var shop = use('internal/shop')
var pkg = use('package')
var fd = use('fd')

var locator = null
var alias = null
var recursive = false
var cwd = fd.realpath('.')
var parts = null
var locators = null
var added = 0
var failed = 0
var _add_dep = null
var _install = null
var i = 0

var run = function() {
  for (i = 0; i < length(args); i++) {
    if (args[i] == '--help' || args[i] == '-h') {
      log.console("Usage: cell add <locator> [alias]")
      log.console("")
      log.console("Add a dependency to the current package.")
      log.console("")
      log.console("Examples:")
      log.console(" cell add gitea.pockle.world/john/prosperon")
      log.console(" cell add gitea.pockle.world/john/cell-image image")
      log.console(" cell add ../local-package")
      log.console(" cell add -r ../packages")
      return
    } else if (args[i] == '-r') {
      recursive = true
    } else if (!starts_with(args[i], '-')) {
      if (!locator) {
        locator = args[i]
      } else if (!alias) {
        alias = args[i]
      }
    }
  }

  if (!locator && !recursive) {
    log.console("Usage: cell add <locator> [alias]")
    return
  }

  if (locator)
    locator = shop.resolve_locator(locator)

  // Generate default alias from locator
  if (!alias && locator) {
    parts = array(locator, '/')
    alias = parts[length(parts) - 1]
    if (search(alias, '@') != null)
      alias = array(alias, '@')[0]
  }

  // Check we're in a package directory
  if (!fd.is_file(cwd + '/cell.toml')) {
    log.error("Not in a package directory (no cell.toml found)")
    return
  }

  // Recursive mode
  if (recursive) {
    if (!locator) locator = '.'
    locator = shop.resolve_locator(locator)
    if (!fd.is_dir(locator)) {
      log.error(`${locator} is not a directory`)
      return
    }
    locators = filter(pkg.find_packages(locator), function(p) {
      return p != cwd
    })
    if (length(locators) == 0) {
      log.console("No packages found in " + locator)
      return
    }
    log.console(`Found ${text(length(locators))} package(s) in ${locator}`)

    added = 0
    failed = 0
    arrfor(locators, function(loc) {
      var loc_parts = array(loc, '/')
      var loc_alias = loc_parts[length(loc_parts) - 1]
      log.console(" Adding " + loc + " as '" + loc_alias + "'...")
      var _add = function() {
        pkg.add_dependency(null, loc, loc_alias)
        shop.sync(loc)
        added = added + 1
      } disruption {
        log.console(` Warning: Failed to add ${loc}`)
        failed = failed + 1
      }
      _add()
    })

    log.console("Added " + text(added) + " package(s)." + (failed > 0 ? " Failed: " + text(failed) + "." : ""))
    return
  }

  // Single package add
  log.console("Adding " + locator + " as '" + alias + "'...")

  _add_dep = function() {
    pkg.add_dependency(null, locator, alias)
    log.console(" Added to cell.toml")
  } disruption {
    log.error("Failed to update cell.toml")
    return
  }
  _add_dep()

  _install = function() {
    shop.sync_with_deps(locator)
    log.console(" Installed to shop")
  } disruption {
    log.error("Failed to install")
    return
  }
  _install()

  log.console("Added " + alias + " (" + locator + ")")
}
run()

$stop()
144 analyze.cm Normal file
@@ -0,0 +1,144 @@
// analyze.cm — Static analysis over index data.
//
// All functions take an index object (from index.cm) and return structured results.
// Does not depend on streamline — operates purely on source-semantic data.

var analyze = {}

// Find all references to a name, with optional scope filter.
// scope: "top" (enclosing == null), "fn" (enclosing != null), null (all)
analyze.find_refs = function(idx, name, scope) {
  var hits = []
  var i = 0
  var ref = null
  while (i < length(idx.references)) {
    ref = idx.references[i]
    if (ref.name == name) {
      if (scope == null) {
        hits[] = ref
      } else if (scope == "top" && ref.enclosing == null) {
        hits[] = ref
      } else if (scope == "fn" && ref.enclosing != null) {
        hits[] = ref
      }
    }
    i = i + 1
  }
  return hits
}

// Find all <name>.<property> usage patterns (channel analysis).
// Only counts unshadowed uses (name not declared as local var in scope).
analyze.channels = function(idx, name) {
  var channels = {}
  var summary = {}
  var i = 0
  var cs = null
  var callee = null
  var prop = null
  var prefix_dot = name + "."
  while (i < length(idx.call_sites)) {
    cs = idx.call_sites[i]
    callee = cs.callee
    if (callee != null && starts_with(callee, prefix_dot)) {
      prop = text(callee, length(prefix_dot), length(callee))
      if (channels[prop] == null) {
        channels[prop] = []
      }
      channels[prop][] = {span: cs.span}
      if (summary[prop] == null) {
        summary[prop] = 0
      }
      summary[prop] = summary[prop] + 1
    }
    i = i + 1
  }
  return {channels: channels, summary: summary}
}

// Find declarations by name, with optional kind filter.
// kind: "var", "def", "fn", "param", or null (any)
analyze.find_decls = function(idx, name, kind) {
  var hits = []
  var i = 0
  var sym = null
  while (i < length(idx.symbols)) {
    sym = idx.symbols[i]
    if (sym.name == name) {
      if (kind == null || sym.kind == kind) {
        hits[] = sym
      }
    }
    i = i + 1
  }
  return hits
}

// Find intrinsic usage by name.
analyze.find_intrinsic = function(idx, name) {
  var hits = []
  var i = 0
  var ref = null
  if (idx.intrinsic_refs == null) return hits
  while (i < length(idx.intrinsic_refs)) {
    ref = idx.intrinsic_refs[i]
    if (ref.name == name) {
      hits[] = ref
    }
    i = i + 1
  }
  return hits
}

// Call sites with >4 args — always a compile error (max arity is 4).
analyze.excess_args = function(idx) {
  var hits = []
  var i = 0
  var cs = null
  while (i < length(idx.call_sites)) {
    cs = idx.call_sites[i]
    if (cs.args_count > 4) {
      hits[] = {span: cs.span, callee: cs.callee, args_count: cs.args_count}
    }
    i = i + 1
  }
  return hits
}

// Extract module export shape from index data (for cross-module analysis).
analyze.module_summary = function(idx) {
  var exports = {}
  var i = 0
  var j = 0
  var exp = null
  var sym = null
  var found = false
  if (idx.exports == null) return {exports: exports}
  while (i < length(idx.exports)) {
    exp = idx.exports[i]
    found = false
    if (exp.symbol_id != null) {
      j = 0
      while (j < length(idx.symbols)) {
        sym = idx.symbols[j]
        if (sym.symbol_id == exp.symbol_id) {
          if (sym.kind == "fn" && sym.params != null) {
            exports[exp.name] = {type: "function", arity: length(sym.params)}
          } else {
            exports[exp.name] = {type: sym.kind}
          }
          found = true
          break
        }
        j = j + 1
      }
    }
    if (!found) {
      exports[exp.name] = {type: "unknown"}
    }
    i = i + 1
  }
  return {exports: exports}
}

return analyze
@@ -1,6 +1,5 @@
#include "quickjs.h"
#include "cell.h"
#include "miniz.h"
#include "qjs_blob.h"

static JSClassID js_reader_class_id;
static JSClassID js_writer_class_id;
@@ -8,14 +7,14 @@ static JSClassID js_writer_class_id;
static void js_reader_finalizer(JSRuntime *rt, JSValue val) {
  mz_zip_archive *zip = JS_GetOpaque(val, js_reader_class_id);
  mz_zip_reader_end(zip);
  js_free_rt(rt,zip);
  js_free_rt(zip);
}

static void js_writer_finalizer(JSRuntime *rt, JSValue val) {
  mz_zip_archive *zip = JS_GetOpaque(val, js_writer_class_id);
  mz_zip_writer_finalize_archive(zip);
  mz_zip_writer_end(zip);
  js_free_rt(rt,zip);
  js_free_rt(zip);
}

static JSClassDef js_reader_class = {
@@ -30,26 +29,32 @@ static JSClassDef js_writer_class = {

static mz_zip_archive *js2reader(JSContext *js, JSValue v)
{
  return JS_GetOpaque2(js, v, js_reader_class_id);
  return JS_GetOpaque(v, js_reader_class_id);
}

static mz_zip_archive *js2writer(JSContext *js, JSValue v)
{
  return JS_GetOpaque2(js, v, js_writer_class_id);
  return JS_GetOpaque(v, js_writer_class_id);
}

static JSValue js_miniz_read(JSContext *js, JSValue self, int argc, JSValue *argv)
{
  size_t len;
  void *data = js_get_blob_data(js, &len, argv[0]);
  if (!data)
    return JS_ThrowReferenceError(js, "Could not create data.\n");
  if (data == (void *)-1)
    return JS_EXCEPTION;

  mz_zip_archive *zip = calloc(sizeof(*zip),1);
  int success = mz_zip_reader_init_mem(zip, data, len, 0);
  int err = mz_zip_get_last_error(zip);
  if (err)
    return JS_ThrowInternalError(js, "miniz error: %s\n", mz_zip_get_error_string(err));
  mz_zip_archive *zip = calloc(sizeof(*zip), 1);
  if (!zip)
    return JS_RaiseOOM(js);

  mz_bool success = mz_zip_reader_init_mem(zip, data, len, 0);

  if (!success) {
    int err = mz_zip_get_last_error(zip);
    free(zip);
    return JS_RaiseDisrupt(js, "Failed to initialize zip reader: %s", mz_zip_get_error_string(err));
  }

  JSValue jszip = JS_NewObjectClass(js, js_reader_class_id);
  JS_SetOpaque(jszip, zip);
@@ -65,7 +70,7 @@ static JSValue js_miniz_write(JSContext *js, JSValue self, int argc, JSValue *ar
  mz_zip_archive *zip = calloc(sizeof(*zip), 1);
  if (!zip) {
    JS_FreeCString(js, file);
    return JS_ThrowOutOfMemory(js);
    return JS_RaiseOOM(js);
  }

  mz_bool success = mz_zip_writer_init_file(zip, file, 0);
@@ -75,7 +80,7 @@ static JSValue js_miniz_write(JSContext *js, JSValue self, int argc, JSValue *ar
    int err = mz_zip_get_last_error(zip);
    mz_zip_writer_end(zip);
    free(zip);
    return JS_ThrowInternalError(js, "Failed to initialize zip writer: %s", mz_zip_get_error_string(err));
    return JS_RaiseDisrupt(js, "Failed to initialize zip writer: %s", mz_zip_get_error_string(err));
  }

  JSValue jszip = JS_NewObjectClass(js, js_writer_class_id);
@@ -87,7 +92,7 @@ static JSValue js_miniz_compress(JSContext *js, JSValue this_val,
    int argc, JSValueConst *argv)
{
  if (argc < 1)
    return JS_ThrowTypeError(js,
    return JS_RaiseDisrupt(js,
      "compress needs a string or ArrayBuffer");

  /* ─── 1. Grab the input data ──────────────────────────────── */
@@ -95,44 +100,46 @@ static JSValue js_miniz_compress(JSContext *js, JSValue this_val,
  size_t in_len = 0;
  const void *in_ptr = NULL;

  if (JS_IsString(argv[0])) {
  if (JS_IsText(argv[0])) {
    /* String → UTF-8 bytes without the terminating NUL */
    cstring = JS_ToCStringLen(js, &in_len, argv[0]);
    if (!cstring)
      return JS_EXCEPTION;
    in_ptr = cstring;
  } else { /* assume ArrayBuffer / TypedArray */
  } else {
    in_ptr = js_get_blob_data(js, &in_len, argv[0]);
    if (!in_ptr)
      return JS_ThrowTypeError(js,
        "Argument must be a string or ArrayBuffer");
    if (in_ptr == (const void *)-1)
      return JS_EXCEPTION;
  }

  /* ─── 2. Allocate an output buffer big enough ────────────── */
  /* ─── 2. Allocate output blob (before getting blob input ptr) ── */
  mz_ulong out_len_est = mz_compressBound(in_len);
  void *out_buf = js_malloc(js, out_len_est);
  if (!out_buf) {
  void *out_ptr;
  JSValue abuf = js_new_blob_alloc(js, (size_t)out_len_est, &out_ptr);
  if (JS_IsException(abuf)) {
    if (cstring) JS_FreeCString(js, cstring);
    return JS_EXCEPTION;
    return abuf;
  }

  /* Re-derive blob input pointer after alloc (GC may have moved it) */
  if (!cstring) {
    in_ptr = js_get_blob_data(js, &in_len, argv[0]);
  }

  /* ─── 3. Do the compression (MZ_DEFAULT_COMPRESSION = level 6) */
  mz_ulong out_len = out_len_est;
  int st = mz_compress2(out_buf, &out_len,
  int st = mz_compress2(out_ptr, &out_len,
    in_ptr, in_len, MZ_DEFAULT_COMPRESSION);

  /* clean-up for string input */
  if (cstring) JS_FreeCString(js, cstring);

  if (st != MZ_OK) {
    js_free(js, out_buf);
    return JS_ThrowInternalError(js,
  if (st != MZ_OK)
    return JS_RaiseDisrupt(js,
      "miniz: compression failed (%d)", st);
  }

  /* ─── 4. Hand JavaScript a copy of the compressed data ────── */
  JSValue abuf = js_new_blob_stoned_copy(js, out_buf, out_len);
  js_free(js, out_buf);
  /* ─── 4. Stone with actual compressed size ────────────────── */
  js_blob_stone(abuf, (size_t)out_len);
  return abuf;
}

@@ -142,15 +149,14 @@ static JSValue js_miniz_decompress(JSContext *js,
    JSValueConst *argv)
{
  if (argc < 1)
    return JS_ThrowTypeError(js,
    return JS_RaiseDisrupt(js,
      "decompress: need compressed ArrayBuffer");

  /* grab compressed data */
  size_t in_len;
  void *in_ptr = js_get_blob_data(js, &in_len, argv[0]);
  if (!in_ptr)
    return JS_ThrowTypeError(js,
      "decompress: first arg must be an ArrayBuffer");
  if (in_ptr == (void *)-1)
    return JS_EXCEPTION;

  /* zlib header present → tell tinfl to parse it */
  size_t out_len = 0;
@@ -159,18 +165,11 @@ static JSValue js_miniz_decompress(JSContext *js,
    TINFL_FLAG_PARSE_ZLIB_HEADER);

  if (!out_ptr)
    return JS_ThrowInternalError(js,
    return JS_RaiseDisrupt(js,
      "miniz: decompression failed");

  /* wrap for JS */
  JSValue ret;
  int asString = (argc > 1) && JS_ToBool(js, argv[1]);

  if (asString)
    ret = JS_NewStringLen(js, (const char *)out_ptr, out_len);
  else
    ret = JS_NewArrayBufferCopy(js, out_ptr, out_len);

#ifdef MZ_FREE
  MZ_FREE(out_ptr);
#else
@@ -190,27 +189,27 @@ static const JSCFunctionListEntry js_miniz_funcs[] = {
JSValue js_writer_add_file(JSContext *js, JSValue self, int argc, JSValue *argv)
{
  if (argc < 2)
    return JS_ThrowTypeError(js, "add_file requires (path, arrayBuffer)");
    return JS_RaiseDisrupt(js, "add_file requires (path, arrayBuffer)");

  mz_zip_archive *zip = js2writer(js, self);
  const char *pathInZip = JS_ToCString(js, argv[0]);
  if (!pathInZip)
    return JS_ThrowTypeError(js, "Could not parse path argument");
    return JS_RaiseDisrupt(js, "Could not parse path argument");

  size_t dataLen;
  void *data = js_get_blob_data(js, &dataLen, argv[1]);
  if (!data) {
  if (data == (void *)-1) {
    JS_FreeCString(js, pathInZip);
    return JS_ThrowTypeError(js, "Second argument must be an ArrayBuffer");
    return JS_EXCEPTION;
  }

  int success = mz_zip_writer_add_mem(zip, pathInZip, data, dataLen, MZ_DEFAULT_COMPRESSION);
  JS_FreeCString(js, pathInZip);

  if (!success)
    return JS_ThrowInternalError(js, "Failed to add memory to zip");
    return JS_RaiseDisrupt(js, "Failed to add memory to zip");

  return JS_UNDEFINED;
  return JS_NULL;
}


@@ -220,6 +219,7 @@ static const JSCFunctionListEntry js_writer_funcs[] = {

JSValue js_reader_mod(JSContext *js, JSValue self, int argc, JSValue *argv)
{
#ifndef MINIZ_NO_TIME
  const char *file = JS_ToCString(js,argv[0]);
  if (!file)
    return JS_EXCEPTION;
@@ -227,7 +227,7 @@ JSValue js_reader_mod(JSContext *js, JSValue self, int argc, JSValue *argv)
  mz_zip_archive *zip = js2reader(js, self);
  if (!zip) {
    JS_FreeCString(js, file);
    return JS_ThrowInternalError(js, "Invalid zip reader");
    return JS_RaiseDisrupt(js, "Invalid zip reader");
  }

  mz_zip_archive_file_stat pstat;
@@ -235,17 +235,20 @@ JSValue js_reader_mod(JSContext *js, JSValue self, int argc, JSValue *argv)

  if (index == (mz_uint)-1) {
    JS_FreeCString(js, file);
    return JS_ThrowReferenceError(js, "File '%s' not found in archive", file);
    return JS_RaiseDisrupt(js, "File '%s' not found in archive", file);
  }

  JS_FreeCString(js, file);

  if (!mz_zip_reader_file_stat(zip, index, &pstat)) {
    int err = mz_zip_get_last_error(zip);
    return JS_ThrowInternalError(js, "Failed to get file stats: %s", mz_zip_get_error_string(err));
    return JS_RaiseDisrupt(js, "Failed to get file stats: %s", mz_zip_get_error_string(err));
  }

  return JS_NewFloat64(js, pstat.m_time);
#else
  return JS_RaiseDisrupt(js, "MINIZ_NO_TIME is defined");
#endif
}

JSValue js_reader_exists(JSContext *js, JSValue self, int argc, JSValue *argv)
@@ -257,7 +260,7 @@ JSValue js_reader_exists(JSContext *js, JSValue self, int argc, JSValue *argv)
  mz_zip_archive *zip = js2reader(js, self);
  if (!zip) {
    JS_FreeCString(js, file);
    return JS_ThrowInternalError(js, "Invalid zip reader");
    return JS_RaiseDisrupt(js, "Invalid zip reader");
  }

  mz_uint index = mz_zip_reader_locate_file(zip, file, NULL, 0);
@@ -275,7 +278,7 @@ JSValue js_reader_slurp(JSContext *js, JSValue self, int argc, JSValue *argv)
  mz_zip_archive *zip = js2reader(js, self);
  if (!zip) {
    JS_FreeCString(js, file);
    return JS_ThrowInternalError(js, "Invalid zip reader");
    return JS_RaiseDisrupt(js, "Invalid zip reader");
  }

  size_t len;
@@ -285,7 +288,7 @@ JSValue js_reader_slurp(JSContext *js, JSValue self, int argc, JSValue *argv)
    int err = mz_zip_get_last_error(zip);
    const char *filename = file;
    JS_FreeCString(js, file);
    return JS_ThrowInternalError(js, "Failed to extract file '%s': %s", filename, mz_zip_get_error_string(err));
    return JS_RaiseDisrupt(js, "Failed to extract file '%s': %s", filename, mz_zip_get_error_string(err));
  }

  JS_FreeCString(js, file);
@@ -299,7 +302,7 @@ JSValue js_reader_list(JSContext *js, JSValue self, int argc, JSValue *argv)
{
  mz_zip_archive *zip = js2reader(js, self);
  if (!zip)
    return JS_ThrowInternalError(js, "Invalid zip reader");
    return JS_RaiseDisrupt(js, "Invalid zip reader");

  mz_uint num_files = mz_zip_reader_get_num_files(zip);

@@ -315,10 +318,9 @@ JSValue js_reader_list(JSContext *js, JSValue self, int argc, JSValue *argv)

    JSValue filename = JS_NewString(js, file_stat.m_filename);
    if (JS_IsException(filename)) {
      JS_FreeValue(js, arr);
      return filename;
    }
    JS_SetPropertyUint32(js, arr, arr_index++, filename);
    JS_SetPropertyNumber(js, arr, arr_index++, filename);
  }

  return arr;
@@ -327,7 +329,7 @@ JSValue js_reader_list(JSContext *js, JSValue self, int argc, JSValue *argv)
JSValue js_reader_is_directory(JSContext *js, JSValue self, int argc, JSValue *argv)
{
  if (argc < 1)
    return JS_ThrowTypeError(js, "is_directory requires a file index");
    return JS_RaiseDisrupt(js, "is_directory requires a file index");

  int32_t index;
  if (JS_ToInt32(js, &index, argv[0]))
@@ -335,7 +337,7 @@ JSValue js_reader_is_directory(JSContext *js, JSValue self, int argc, JSValue *a

  mz_zip_archive *zip = js2reader(js, self);
  if (!zip)
    return JS_ThrowInternalError(js, "Invalid zip reader");
    return JS_RaiseDisrupt(js, "Invalid zip reader");

  return JS_NewBool(js, mz_zip_reader_is_file_a_directory(zip, index));
}
@@ -343,7 +345,7 @@ JSValue js_reader_is_directory(JSContext *js, JSValue self, int argc, JSValue *a
JSValue js_reader_get_filename(JSContext *js, JSValue self, int argc, JSValue *argv)
{
  if (argc < 1)
    return JS_ThrowTypeError(js, "get_filename requires a file index");
    return JS_RaiseDisrupt(js, "get_filename requires a file index");

  int32_t index;
  if (JS_ToInt32(js, &index, argv[0]))
@@ -351,11 +353,11 @@ JSValue js_reader_get_filename(JSContext *js, JSValue self, int argc, JSValue *a

  mz_zip_archive *zip = js2reader(js, self);
  if (!zip)
    return JS_ThrowInternalError(js, "Invalid zip reader");
    return JS_RaiseDisrupt(js, "Invalid zip reader");

  mz_zip_archive_file_stat file_stat;
  if (!mz_zip_reader_file_stat(zip, index, &file_stat))
    return JS_ThrowInternalError(js, "Failed to get file stats");
    return JS_RaiseDisrupt(js, "Failed to get file stats");

  return JS_NewString(js, file_stat.m_filename);
}
@@ -364,8 +366,7 @@ JSValue js_reader_count(JSContext *js, JSValue self, int argc, JSValue *argv)
{
  mz_zip_archive *zip = js2reader(js, self);
  if (!zip)
    return JS_ThrowInternalError(js, "Invalid zip reader");

    return JS_RaiseDisrupt(js, "Invalid zip reader");
  return JS_NewUint32(js, mz_zip_reader_get_num_files(zip));
}

@@ -379,39 +380,23 @@ static const JSCFunctionListEntry js_reader_funcs[] = {
  JS_CFUNC_DEF("count", 0, js_reader_count),
};

JSValue js_miniz_use(JSContext *js)
JSValue js_core_miniz_use(JSContext *js)
{
  JS_FRAME(js);

  JS_NewClassID(&js_reader_class_id);
  JS_NewClass(JS_GetRuntime(js), js_reader_class_id, &js_reader_class);
  JSValue reader_proto = JS_NewObject(js);
  JS_SetPropertyFunctionList(js, reader_proto, js_reader_funcs, sizeof(js_reader_funcs) / sizeof(JSCFunctionListEntry));
  JS_SetClassProto(js, js_reader_class_id, reader_proto);
  JS_NewClass(js, js_reader_class_id, &js_reader_class);
  JS_ROOT(reader_proto, JS_NewObject(js));
  JS_SetPropertyFunctionList(js, reader_proto.val, js_reader_funcs, sizeof(js_reader_funcs) / sizeof(JSCFunctionListEntry));
  JS_SetClassProto(js, js_reader_class_id, reader_proto.val);

  JS_NewClassID(&js_writer_class_id);
  JS_NewClass(JS_GetRuntime(js), js_writer_class_id, &js_writer_class);
  JSValue writer_proto = JS_NewObject(js);
  JS_SetPropertyFunctionList(js, writer_proto, js_writer_funcs, sizeof(js_writer_funcs) / sizeof(JSCFunctionListEntry));
  JS_SetClassProto(js, js_writer_class_id, writer_proto);

  JSValue export = JS_NewObject(js);
  JS_SetPropertyFunctionList(js, export, js_miniz_funcs, sizeof(js_miniz_funcs)/sizeof(JSCFunctionListEntry));
  return export;
}
  JS_NewClass(js, js_writer_class_id, &js_writer_class);
  JS_ROOT(writer_proto, JS_NewObject(js));
  JS_SetPropertyFunctionList(js, writer_proto.val, js_writer_funcs, sizeof(js_writer_funcs) / sizeof(JSCFunctionListEntry));
  JS_SetClassProto(js, js_writer_class_id, writer_proto.val);

static int js_miniz_init(JSContext *ctx, JSModuleDef *m) {
  JS_SetModuleExport(ctx, m, "default",js_miniz_use(ctx));
  return 0;
}

#ifdef JS_SHARED_LIBRARY
#define JS_INIT_MODULE js_init_module
#else
#define JS_INIT_MODULE js_init_module_miniz
#endif

JSModuleDef *JS_INIT_MODULE(JSContext *ctx, const char *module_name) {
  JSModuleDef *m = JS_NewCModule(ctx, module_name, js_miniz_init);
  if (!m) return NULL;
  JS_AddModuleExport(ctx, m, "default");
  return m;
  JS_ROOT(export, JS_NewObject(js));
  JS_SetPropertyFunctionList(js, export.val, js_miniz_funcs, sizeof(js_miniz_funcs)/sizeof(JSCFunctionListEntry));
  JS_RETURN(export.val);
}
149
audit.ce
Normal file
@@ -0,0 +1,149 @@
// cell audit [<locator>] - Test-compile all .ce and .cm scripts
//
// Usage:
//   cell audit                               Audit all packages
//   cell audit <locator>                     Audit specific package
//   cell audit .                             Audit current directory package
//   cell audit --function-hoist [<locator>]  Report function hoisting usage
//
// Compiles every script in the package(s) to check for errors.
// Continues past failures and reports all issues at the end.

var shop = use('internal/shop')
var pkg = use('package')
var fd = use('fd')

var target_package = null
var function_hoist = false
var i = 0

var run = function() {
  var packages = null
  var tokenize_mod = null
  var parse_mod = null
  var hoist_files = 0
  var hoist_refs = 0
  var total_ok = 0
  var total_errors = 0
  var total_scripts = 0
  var all_failures = []
  var all_unresolved = []
  var summary = null

  for (i = 0; i < length(args); i++) {
    if (args[i] == '--help' || args[i] == '-h') {
      log.console("Usage: cell audit [--function-hoist] [<locator>]")
      log.console("")
      log.console("Test-compile all .ce and .cm scripts in package(s).")
      log.console("Reports all errors without stopping at the first failure.")
      log.console("")
      log.console("Flags:")
      log.console(" --function-hoist Report files that rely on function hoisting")
      return
    } else if (args[i] == '--function-hoist') {
      function_hoist = true
    } else if (!starts_with(args[i], '-')) {
      target_package = args[i]
    }
  }

  // Resolve local paths
  if (target_package) {
    target_package = shop.resolve_locator(target_package)
  }

  if (target_package) {
    packages = [target_package]
  } else {
    packages = shop.list_packages()
  }

  if (function_hoist) {
    tokenize_mod = use('tokenize')
    parse_mod = use('parse')

    arrfor(packages, function(p) {
      var scripts = shop.get_package_scripts(p)
      var pkg_dir = shop.get_package_dir(p)
      if (length(scripts) == 0) return

      arrfor(scripts, function(script) {
        var src_path = pkg_dir + '/' + script
        var src = null
        var tok_result = null
        var ast = null
        var scan = function() {
          if (!fd.is_file(src_path)) return
          src = text(fd.slurp(src_path))
          tok_result = tokenize_mod(src, script)
          ast = parse_mod(tok_result.tokens, src, script, tokenize_mod)
          if (ast._hoisted_fns != null && length(ast._hoisted_fns) > 0) {
            log.console(p + '/' + script + ":")
            hoist_files = hoist_files + 1
            arrfor(ast._hoisted_fns, function(ref) {
              var msg = " " + ref.name
              if (ref.line != null) msg = msg + " (ref line " + text(ref.line)
              if (ref.decl_line != null) msg = msg + ", declared line " + text(ref.decl_line)
              if (ref.line != null) msg = msg + ")"
              log.console(msg)
              hoist_refs = hoist_refs + 1
            })
          }
        } disruption {
          // skip files that fail to parse
        }
        scan()
      })
    })

    log.console("")
    log.console("Summary: " + text(hoist_files) + " files with function hoisting, " + text(hoist_refs) + " total forward references")
    return
  }

  arrfor(packages, function(p) {
    var scripts = shop.get_package_scripts(p)
    var result = null
    var resolution = null
    if (length(scripts) == 0) return

    log.console("Auditing " + p + " (" + text(length(scripts)) + " scripts)...")
    result = shop.build_package_scripts(p)
    total_ok = total_ok + result.ok
    total_errors = total_errors + length(result.errors)
    total_scripts = total_scripts + result.total

    arrfor(result.errors, function(e) {
      all_failures[] = p + ": " + e
    })

    // Check use() resolution
    resolution = shop.audit_use_resolution(p)
    arrfor(resolution.unresolved, function(u) {
      all_unresolved[] = p + '/' + u.script + ": use('" + u.module + "') cannot be resolved"
    })
  })

  log.console("")
  if (length(all_failures) > 0) {
    log.console("Failed scripts:")
    arrfor(all_failures, function(f) {
      log.console(" " + f)
    })
    log.console("")
  }

  if (length(all_unresolved) > 0) {
    log.console("Unresolved modules:")
    arrfor(all_unresolved, function(u) {
      log.console(" " + u)
    })
    log.console("")
  }

  summary = "Audit complete: " + text(total_ok) + "/" + text(total_scripts) + " scripts compiled"
  if (total_errors > 0) summary = summary + ", " + text(total_errors) + " failed"
  if (length(all_unresolved) > 0) summary = summary + ", " + text(length(all_unresolved)) + " unresolved use() calls"
  log.console(summary)
}
run()
701
bench.ce
Normal file
@@ -0,0 +1,701 @@
// cell bench - Run benchmarks with statistical analysis

var shop = use('internal/shop')
var pkg = use('package')
var fd = use('fd')
var time = use('time')
var json = use('json')
var blob = use('blob')
var os = use('internal/os')
var testlib = use('internal/testlib')
var math = use('math/radians')

var _args = args == null ? [] : args

var target_pkg = null    // null = current package
var target_bench = null  // null = all benchmarks, otherwise specific bench file
var all_pkgs = false
var bench_mode = "bytecode"  // "bytecode", "native", or "compare"

// Strip mode flags from args before parsing
function strip_mode_flags() {
  var filtered = []
  arrfor(_args, function(a) {
    if (a == '--native') {
      bench_mode = "native"
    } else if (a == '--bytecode') {
      bench_mode = "bytecode"
    } else if (a == '--compare') {
      bench_mode = "compare"
    } else {
      filtered[] = a
    }
  })
  _args = filtered
}
strip_mode_flags()

// Benchmark configuration
def WARMUP_BATCHES = 3
def SAMPLES = 11                 // Number of timing samples to collect
def TARGET_SAMPLE_NS = 20000000  // 20ms per sample (fast mode)
def MIN_SAMPLE_NS = 2000000      // 2ms minimum sample duration
def MIN_BATCH_SIZE = 1
def MAX_BATCH_SIZE = 100000000   // 100M iterations max per batch

// Statistical functions
function median(arr) {
  if (length(arr) == 0) return 0
  var sorted = sort(arr)
  var mid = floor(length(arr) / 2)
  if (length(arr) % 2 == 0) {
    return (sorted[mid - 1] + sorted[mid]) / 2
  }
  return sorted[mid]
}

function mean(arr) {
  if (length(arr) == 0) return 0
  var sum = 0
  arrfor(arr, function(val) {
    sum += val
  })
  return sum / length(arr)
}

function stddev(arr, mean_val) {
  if (length(arr) < 2) return 0
  var sum_sq_diff = 0
  arrfor(arr, function(val) {
    var diff = val - mean_val
    sum_sq_diff += diff * diff
  })
  return math.sqrt(sum_sq_diff / (length(arr) - 1))
}

function percentile(arr, p) {
  if (length(arr) == 0) return 0
  var sorted = sort(arr)
  var idx = floor(length(arr) * p / 100)
  if (idx >= length(arr)) idx = length(arr) - 1
  return sorted[idx]
}
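The statistics above have three details worth noting: `median` averages the two middle samples for even-length input, `stddev` uses the sample (n−1) form, and `percentile` takes a floor index clamped to the last element. A Python sketch (for illustration only, not part of the repo) mirrors those definitions:

```python
import math

def median(arr):
    if not arr:
        return 0
    s = sorted(arr)
    mid = len(s) // 2
    if len(s) % 2 == 0:
        return (s[mid - 1] + s[mid]) / 2  # average the two middle samples
    return s[mid]

def stddev(arr, mean_val):
    if len(arr) < 2:
        return 0
    # Sample standard deviation: divide by n - 1, not n
    return math.sqrt(sum((v - mean_val) ** 2 for v in arr) / (len(arr) - 1))

def percentile(arr, p):
    if not arr:
        return 0
    s = sorted(arr)
    idx = min(len(s) * p // 100, len(s) - 1)  # floor index, clamped
    return s[idx]
```

Note that with only 11 samples, p95 and p99 both land on the largest sample (index 10), so they mostly report the worst observed run.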
// Parse arguments similar to test.ce
function parse_args() {
  var name = null
  var lock = null
  var resolved = null
  var bench_path = null

  if (length(_args) == 0) {
    if (!testlib.is_valid_package('.')) {
      log.console('No cell.toml found in current directory')
      return false
    }
    target_pkg = null
    return true
  }

  if (_args[0] == 'all') {
    if (!testlib.is_valid_package('.')) {
      log.console('No cell.toml found in current directory')
      return false
    }
    target_pkg = null
    return true
  }

  if (_args[0] == 'package') {
    if (length(_args) < 2) {
      log.console('Usage: cell bench package <name> [bench]')
      log.console('       cell bench package all')
      return false
    }

    if (_args[1] == 'all') {
      all_pkgs = true
      log.console('Benchmarking all packages...')
      return true
    }

    name = _args[1]
    lock = shop.load_lock()
    if (lock[name]) {
      target_pkg = name
    } else if (starts_with(name, '/') && testlib.is_valid_package(name)) {
      target_pkg = name
    } else {
      if (testlib.is_valid_package('.')) {
        resolved = pkg.alias_to_package(null, name)
        if (resolved) {
          target_pkg = resolved
        } else {
          log.console(`Package not found: ${name}`)
          return false
        }
      } else {
        log.console(`Package not found: ${name}`)
        return false
      }
    }

    if (length(_args) >= 3) {
      target_bench = _args[2]
    }

    log.console(`Benchmarking package: ${target_pkg}`)
    return true
  }

  // cell bench benches/suite or cell bench <path>
  bench_path = _args[0]

  // Normalize path - add benches/ prefix if not present
  if (!starts_with(bench_path, 'benches/') && !starts_with(bench_path, '/')) {
    if (!fd.is_file(bench_path + '.cm') && !fd.is_file(bench_path)) {
      if (fd.is_file('benches/' + bench_path + '.cm') || fd.is_file('benches/' + bench_path)) {
        bench_path = 'benches/' + bench_path
      }
    }
  }

  target_bench = bench_path
  target_pkg = null

  if (!testlib.is_valid_package('.')) {
    log.console('No cell.toml found in current directory')
    return false
  }

  return true
}

if (!parse_args()) {
  $stop()
  return
}

// Collect benchmark files from a package
function collect_benches(package_name, specific_bench) {
  var prefix = testlib.get_pkg_dir(package_name)
  var benches_dir = prefix + '/benches'

  if (!fd.is_dir(benches_dir)) return []

  var files = pkg.list_files(package_name)
  var bench_files = []
  arrfor(files, function(f) {
    var bench_name = null
    var match_name = null
    var match_base = null
    if (starts_with(f, "benches/") && ends_with(f, ".cm")) {
      if (specific_bench) {
        bench_name = text(f, 0, -3)
        match_name = specific_bench
        if (!starts_with(match_name, 'benches/')) match_name = 'benches/' + match_name
        match_base = ends_with(match_name, '.cm') ? text(match_name, 0, -3) : match_name
        if (bench_name != match_base) return
      }
      bench_files[] = f
    }
  })
  return bench_files
}

// Calibrate batch size for a benchmark
function calibrate_batch_size(bench_fn, is_batch) {
  if (!is_batch) return 1

  var n = MIN_BATCH_SIZE
  var dt = 0
  var start = 0
  var new_n = 0
  var calc = 0
  var target_n = 0

  // Find a batch size that takes at least MIN_SAMPLE_NS
  while (n < MAX_BATCH_SIZE) {
    if (!is_number(n) || n < 1) {
      n = 1
      break
    }

    start = os.now()
    bench_fn(n)
    dt = os.now() - start

    if (dt >= MIN_SAMPLE_NS) break

    new_n = n * 2
    if (!is_number(new_n) || new_n > MAX_BATCH_SIZE) {
      n = MAX_BATCH_SIZE
      break
    }
    n = new_n
  }

  // Adjust to target sample duration
  if (dt > 0 && dt < TARGET_SAMPLE_NS && is_number(n) && is_number(dt)) {
    calc = n * TARGET_SAMPLE_NS / dt
    if (is_number(calc) && calc > 0) {
      target_n = floor(calc)
      if (is_number(target_n) && target_n > 0) {
        if (target_n > MAX_BATCH_SIZE) target_n = MAX_BATCH_SIZE
        if (target_n < MIN_BATCH_SIZE) target_n = MIN_BATCH_SIZE
        n = target_n
      }
    }
  }

  if (!is_number(n) || n < 1) {
    n = 1
  }

  return n
}
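The calibration policy above has two phases: double the batch size until one batch takes at least the 2ms minimum, then scale linearly toward the 20ms target. A minimal Python sketch of that core loop (an illustration under stated assumptions: the `clock` callable stands in for `os.now()`, and the cell script's extra `is_number` guards are omitted):

```python
MIN_SAMPLE_NS = 2_000_000       # 2ms minimum sample duration
TARGET_SAMPLE_NS = 20_000_000   # 20ms target per sample
MAX_BATCH_SIZE = 100_000_000

def calibrate(bench_fn, clock):
    """Double n until one batch takes >= MIN_SAMPLE_NS, then scale to the target."""
    n, dt = 1, 0
    while n < MAX_BATCH_SIZE:
        start = clock()
        bench_fn(n)
        dt = clock() - start
        if dt >= MIN_SAMPLE_NS:
            break
        n = min(n * 2, MAX_BATCH_SIZE)
        if n == MAX_BATCH_SIZE:
            break
    if 0 < dt < TARGET_SAMPLE_NS:
        # Linear extrapolation: keep per-sample duration near the target
        n = max(1, min(n * TARGET_SAMPLE_NS // dt, MAX_BATCH_SIZE))
    return n
```

For an operation costing ~100ns this settles near 200,000 iterations per batch, i.e. one ~20ms sample.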

// Run a single benchmark function
function run_single_bench(bench_fn, bench_name) {
  var timings_per_op = []
  var is_structured = is_object(bench_fn) && bench_fn.run
  var is_batch = false
  var batch_size = 1
  var setup_fn = null
  var run_fn = null
  var teardown_fn = null
  var calibrate_fn = null
  var _detect = null
  var i = 0
  var state = null
  var start = 0
  var duration = 0
  var ns_per_op = 0

  if (is_structured) {
    setup_fn = bench_fn.setup || function() { return null }
    run_fn = bench_fn.run
    teardown_fn = bench_fn.teardown || function(s) {}

    // Check if run function accepts batch size
    _detect = function() {
      var test_state = setup_fn()
      run_fn(1, test_state)
      is_batch = true
      if (teardown_fn) teardown_fn(test_state)
    } disruption {
      is_batch = false
    }
    _detect()

    calibrate_fn = function(n) {
      var s = setup_fn()
      run_fn(n, s)
      if (teardown_fn) teardown_fn(s)
    }
    batch_size = calibrate_batch_size(calibrate_fn, is_batch)

    if (!is_number(batch_size) || batch_size < 1) {
      batch_size = 1
    }
  } else {
    // Simple function format
    _detect = function() {
      bench_fn(1)
      is_batch = true
    } disruption {
      is_batch = false
    }
    _detect()
    batch_size = calibrate_batch_size(bench_fn, is_batch)
  }

  if (!batch_size || batch_size < 1) {
    batch_size = 1
  }

  // Warmup phase
  for (i = 0; i < WARMUP_BATCHES; i++) {
    if (!is_number(batch_size) || batch_size < 1) {
      batch_size = 1
    }

    if (is_structured) {
      state = setup_fn()
      if (is_batch) {
        run_fn(batch_size, state)
      } else {
        run_fn(state)
      }
      if (teardown_fn) teardown_fn(state)
    } else {
      if (is_batch) {
        bench_fn(batch_size)
      } else {
        bench_fn()
      }
    }
  }

  // Measurement phase - collect SAMPLES timing samples
  for (i = 0; i < SAMPLES; i++) {
    if (!is_number(batch_size) || batch_size < 1) {
      batch_size = 1
    }

    if (is_structured) {
      state = setup_fn()
      start = os.now()
      if (is_batch) {
        run_fn(batch_size, state)
      } else {
        run_fn(state)
      }
      duration = os.now() - start
      if (teardown_fn) teardown_fn(state)

      ns_per_op = is_batch ? duration / batch_size : duration
      timings_per_op[] = ns_per_op
    } else {
      start = os.now()
      if (is_batch) {
        bench_fn(batch_size)
      } else {
        bench_fn()
      }
      duration = os.now() - start

      ns_per_op = is_batch ? duration / batch_size : duration
      timings_per_op[] = ns_per_op
    }
  }

  // Calculate statistics
  var mean_ns = mean(timings_per_op)
  var median_ns = median(timings_per_op)
  var min_ns = reduce(timings_per_op, min)
  var max_ns = reduce(timings_per_op, max)
  var stddev_ns = stddev(timings_per_op, mean_ns)
  var p95_ns = percentile(timings_per_op, 95)
  var p99_ns = percentile(timings_per_op, 99)

  var ops_per_sec = 0
  if (median_ns > 0) {
    ops_per_sec = floor(1000000000 / median_ns)
  }

  return {
    name: bench_name,
    batch_size: batch_size,
    samples: SAMPLES,
    mean_ns: round(mean_ns),
    median_ns: round(median_ns),
    min_ns: round(min_ns),
    max_ns: round(max_ns),
    stddev_ns: round(stddev_ns),
    p95_ns: round(p95_ns),
    p99_ns: round(p99_ns),
    ops_per_sec: ops_per_sec
  }
}

// Format nanoseconds for display
function format_ns(ns) {
  if (ns < 1000) return `${ns}ns`
  if (ns < 1000000) return `${round(ns / 1000 * 100) / 100}µs`
  if (ns < 1000000000) return `${round(ns / 1000000 * 100) / 100}ms`
  return `${round(ns / 1000000000 * 100) / 100}s`
}

// Format ops/sec for display
function format_ops(ops) {
  if (ops < 1000) return `${ops} ops/s`
  if (ops < 1000000) return `${round(ops / 1000 * 100) / 100}K ops/s`
  if (ops < 1000000000) return `${round(ops / 1000000 * 100) / 100}M ops/s`
  return `${round(ops / 1000000000 * 100) / 100}G ops/s`
}
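Both formatters pick a unit by powers of 1000 and keep two decimals via the `round(x * 100) / 100` idiom. A Python mirror of `format_ns` (a sketch for illustration; it assumes cell's `round` behaves like Python's for non-tie values):

```python
def format_ns(ns):
    # Two-decimal rounding, mirroring round(x * 100) / 100 in the script
    r2 = lambda x: round(x * 100) / 100
    if ns < 1_000:
        return f"{ns}ns"
    if ns < 1_000_000:
        return f"{r2(ns / 1_000)}µs"
    if ns < 1_000_000_000:
        return f"{r2(ns / 1_000_000)}ms"
    return f"{r2(ns / 1_000_000_000)}s"
```

So 250 stays `250ns`, 12500 becomes `12.5µs`, and 1500000 becomes `1.5ms`.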

// Load a module for benchmarking in the given mode
// Returns the module value, or null on failure
function resolve_bench_load(f, package_name) {
  var mod_path = text(f, 0, -3)
  var use_pkg = package_name ? package_name : fd.realpath('.')
  var prefix = testlib.get_pkg_dir(package_name)
  var src_path = prefix + '/' + f
  return {mod_path, use_pkg, src_path}
}

function load_bench_module_native(f, package_name) {
  var r = resolve_bench_load(f, package_name)
  return shop.use_native(r.src_path, r.use_pkg)
}

function load_bench_module(f, package_name, mode) {
  var r = resolve_bench_load(f, package_name)
  if (mode == "native") {
    return load_bench_module_native(f, package_name)
  }
  return shop.use(r.mod_path, r.use_pkg)
}

// Collect benchmark functions from a loaded module
function collect_bench_fns(bench_mod) {
  var benches = []
  if (is_function(bench_mod)) {
    benches[] = {name: 'main', fn: bench_mod}
  } else if (is_object(bench_mod)) {
    arrfor(array(bench_mod), function(k) {
      if (is_function(bench_mod[k]))
        benches[] = {name: k, fn: bench_mod[k]}
    })
  }
  return benches
}

// Print results for a single benchmark
function print_bench_result(result, label) {
  var prefix = label ? `[${label}] ` : ''
  log.console(` ${prefix}${format_ns(result.median_ns)}/op ${format_ops(result.ops_per_sec)}`)
  log.console(` ${prefix}min: ${format_ns(result.min_ns)} max: ${format_ns(result.max_ns)} stddev: ${format_ns(result.stddev_ns)}`)
  if (result.batch_size > 1) {
    log.console(` ${prefix}batch: ${result.batch_size} samples: ${result.samples}`)
  }
}

// Run benchmarks for a package
function run_benchmarks(package_name, specific_bench) {
  var bench_files = collect_benches(package_name, specific_bench)

  var pkg_result = {
    package: package_name || "local",
    files: [],
    total: 0
  }

  if (length(bench_files) == 0) return pkg_result

  var mode_label = bench_mode == "compare" ? "bytecode vs native" : bench_mode
  if (package_name) log.console(`Running benchmarks for ${package_name} (${mode_label})`)
  else log.console(`Running benchmarks for local package (${mode_label})`)

  arrfor(bench_files, function(f) {
    var load_error = false
    var benches = []
    var native_benches = []
    var bench_mod = null
    var native_mod = null
    var error_result = null

    var file_result = {
      name: f,
      benchmarks: []
    }

    var _load_file = function() {
      var _load_native = null
      if (bench_mode == "compare") {
        bench_mod = load_bench_module(f, package_name, "bytecode")
        benches = collect_bench_fns(bench_mod)
        _load_native = function() {
          native_mod = load_bench_module(f, package_name, "native")
          native_benches = collect_bench_fns(native_mod)
        } disruption {
          log.console(` ${f}: native compilation failed, comparing skipped`)
          native_benches = []
        }
        _load_native()
      } else {
        bench_mod = load_bench_module(f, package_name, bench_mode)
        benches = collect_bench_fns(bench_mod)
      }

      if (length(benches) > 0) {
        log.console(` ${f}`)
        arrfor(benches, function(b) {
          var bench_error = false
          var result = null
          var nat_b = null
          var nat_error = false
          var nat_result = null

          var _run_bench = function() {
            var speedup = 0
            var _run_nat = null
            result = run_single_bench(b.fn, b.name)
            result.package = pkg_result.package
            result.mode = bench_mode == "compare" ? "bytecode" : bench_mode
            file_result.benchmarks[] = result
            pkg_result.total++

            log.console(` ${result.name}`)
            if (bench_mode == "compare") {
              print_bench_result(result, "bytecode")

              // Find matching native bench and run it
              nat_b = find(native_benches, function(nb) { return nb.name == b.name })
              if (nat_b != null) {
                _run_nat = function() {
                  nat_result = run_single_bench(native_benches[nat_b].fn, b.name)
                  nat_result.package = pkg_result.package
                  nat_result.mode = "native"
                  file_result.benchmarks[] = nat_result
                  pkg_result.total++
                  print_bench_result(nat_result, "native ")

                  if (nat_result.median_ns > 0) {
                    speedup = result.median_ns / nat_result.median_ns
                    log.console(` speedup: ${round(speedup * 100) / 100}x`)
                  }
                } disruption {
                  nat_error = true
                }
                _run_nat()
                if (nat_error) {
                  log.console(` [native ] ERROR`)
                }
              } else {
                log.console(` [native ] (no matching function)`)
              }
            } else {
              print_bench_result(result, null)
            }
          } disruption {
            bench_error = true
          }
          _run_bench()
          if (bench_error) {
            log.console(` ERROR ${b.name}`)
            error_result = {
              package: pkg_result.package,
              name: b.name,
              error: "benchmark disrupted"
            }
            file_result.benchmarks[] = error_result
            pkg_result.total++
          }
        })
      }
    } disruption {
      load_error = true
    }
    _load_file()
    if (load_error) {
      log.console(` Error loading ${f}`)
      error_result = {
        package: pkg_result.package,
        name: "load_module",
        error: "error loading module"
      }
      file_result.benchmarks[] = error_result
      pkg_result.total++
    }

    if (length(file_result.benchmarks) > 0) {
      pkg_result.files[] = file_result
    }
  })

  return pkg_result
}

// Run all benchmarks
var all_results = []
var packages = null

if (all_pkgs) {
  if (testlib.is_valid_package('.')) {
    all_results[] = run_benchmarks(null, null)
  }

  packages = shop.list_packages()
  arrfor(packages, function(p) {
    all_results[] = run_benchmarks(p, null)
  })
} else {
  all_results[] = run_benchmarks(target_pkg, target_bench)
}

// Calculate totals
var total_benches = 0
arrfor(all_results, function(result) {
  total_benches += result.total
})

log.console(`----------------------------------------`)
log.console(`Benchmarks: ${total_benches} total`)

// Generate reports
function generate_reports() {
  var timestamp = text(floor(time.number()))
  var report_dir = shop.get_reports_dir() + '/bench_' + timestamp
  testlib.ensure_dir(report_dir)

  var mode_str = bench_mode == "compare" ? "bytecode vs native" : bench_mode
  var txt_report = `BENCHMARK REPORT
Date: ${time.text(time.number())}
Mode: ${mode_str}
Total benchmarks: ${total_benches}

=== SUMMARY ===
`
  arrfor(all_results, function(pkg_res) {
    if (pkg_res.total == 0) return
    txt_report += `Package: ${pkg_res.package}\n`
    arrfor(pkg_res.files, function(f) {
      txt_report += ` ${f.name}\n`
      arrfor(f.benchmarks, function(b) {
        var mode_tag = b.mode ? ` [${b.mode}]` : ''
        if (b.error) {
          txt_report += ` ERROR ${b.name}: ${b.error}\n`
        } else {
          txt_report += ` ${b.name}${mode_tag}: ${format_ns(b.median_ns)}/op (${format_ops(b.ops_per_sec)})\n`
        }
      })
    })
  })

  txt_report += `\n=== DETAILED RESULTS ===\n`
  arrfor(all_results, function(pkg_res) {
    if (pkg_res.total == 0) return

    arrfor(pkg_res.files, function(f) {
      arrfor(f.benchmarks, function(b) {
        if (b.error) return

        var detail_mode = b.mode ? ` [${b.mode}]` : ''
        txt_report += `\n${pkg_res.package}::${b.name}${detail_mode}\n`
        txt_report += ` batch_size: ${b.batch_size} samples: ${b.samples}\n`
        txt_report += ` median: ${format_ns(b.median_ns)}/op\n`
        txt_report += ` mean: ${format_ns(b.mean_ns)}/op\n`
        txt_report += ` min: ${format_ns(b.min_ns)}\n`
        txt_report += ` max: ${format_ns(b.max_ns)}\n`
        txt_report += ` stddev: ${format_ns(b.stddev_ns)}\n`
        txt_report += ` p95: ${format_ns(b.p95_ns)}\n`
        txt_report += ` p99: ${format_ns(b.p99_ns)}\n`
        txt_report += ` ops/s: ${format_ops(b.ops_per_sec)}\n`
      })
    })
  })

  testlib.ensure_dir(report_dir)
  fd.slurpwrite(`${report_dir}/bench.txt`, stone(blob(txt_report)))
  log.console(`Report written to ${report_dir}/bench.txt`)

  // Generate JSON per package
  arrfor(all_results, function(pkg_res) {
    if (pkg_res.total == 0) return

    var pkg_benches = []
    arrfor(pkg_res.files, function(f) {
      arrfor(f.benchmarks, function(benchmark) {
        pkg_benches[] = benchmark
      })
    })

    var json_path = `${report_dir}/${replace(pkg_res.package, /\//, '_')}.json`
    fd.slurpwrite(json_path, stone(blob(json.encode(pkg_benches))))
  })
}

generate_reports()
$stop()
232
benches/actor_patterns.cm
Normal file
@@ -0,0 +1,232 @@
// actor_patterns.cm — Actor concurrency benchmarks
// Message passing, fan-out/fan-in, mailbox throughput.
// These use structured benchmarks with setup/run/teardown.

// Note: actor benchmarks are measured differently from pure compute.
// Each iteration sends messages and waits for results, so they're
// inherently slower but test real concurrency costs.

// Simple ping-pong: two actors sending messages back and forth
// Since we can't create real actors from a module, we simulate
// the message-passing patterns with function call overhead that
// mirrors what the actor dispatch does.

// Simulate message dispatch overhead
function make_mailbox() {
  return {
    queue: [],
    delivered: 0
  }
}

function send(mailbox, msg) {
  push(mailbox.queue, msg)
  return null
}

function receive(mailbox) {
  if (length(mailbox.queue) == 0) return null
  mailbox.delivered++
  return mailbox.queue[]
}

function drain(mailbox) {
  var count = 0
  while (length(mailbox.queue) > 0) {
    mailbox.queue[]
    count++
  }
  return count
}
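The mailbox is just a FIFO queue plus a delivered-message counter. The same simulation can be sketched in Python (an illustration, assuming `mailbox.queue[]` pops a message off the queue); note that in the ping-pong pattern below, each round delivers exactly one message per mailbox, so the delivered total is 2 × rounds:

```python
from collections import deque

class Mailbox:
    """FIFO queue plus a delivered-message counter, mirroring make_mailbox()."""
    def __init__(self):
        self.queue = deque()
        self.delivered = 0

    def send(self, msg):
        self.queue.append(msg)

    def receive(self):
        if not self.queue:
            return None
        self.delivered += 1
        return self.queue.popleft()

def ping_pong(rounds):
    # Two simulated actors exchanging one message per direction per round
    a, b = Mailbox(), Mailbox()
    a.send({"type": "ping", "val": 0})
    for _ in range(rounds):
        msg = a.receive()
        if msg:
            b.send({"type": "pong", "val": msg["val"] + 1})
        msg = b.receive()
        if msg:
            a.send({"type": "ping", "val": msg["val"] + 1})
    return a.delivered + b.delivered
```
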

// Ping-pong: simulate two actors exchanging messages
function ping_pong(rounds) {
  var box_a = make_mailbox()
  var box_b = make_mailbox()
  var i = 0
  var msg = null

  send(box_a, {type: "ping", val: 0})

  for (i = 0; i < rounds; i++) {
    // A receives and sends to B
    msg = receive(box_a)
    if (msg) {
      send(box_b, {type: "pong", val: msg.val + 1})
    }
    // B receives and sends to A
    msg = receive(box_b)
    if (msg) {
      send(box_a, {type: "ping", val: msg.val + 1})
    }
  }

  return box_a.delivered + box_b.delivered
}

// Fan-out: one sender, N receivers
function fan_out(n_receivers, messages_per) {
  var receivers = []
  var i = 0
  var j = 0
  for (i = 0; i < n_receivers; i++) {
    push(receivers, make_mailbox())
  }

  // Send messages to all receivers
  for (j = 0; j < messages_per; j++) {
    for (i = 0; i < n_receivers; i++) {
      send(receivers[i], {seq: j, data: j * 17})
    }
  }

  // All receivers drain
  var total = 0
  for (i = 0; i < n_receivers; i++) {
    total += drain(receivers[i])
  }

  return total
}

// Fan-in: N senders, one receiver
function fan_in(n_senders, messages_per) {
  var inbox = make_mailbox()
  var i = 0
  var j = 0

  // Each sender sends messages
  for (i = 0; i < n_senders; i++) {
    for (j = 0; j < messages_per; j++) {
      send(inbox, {sender: i, seq: j, data: i * 100 + j})
    }
  }

  // Receiver processes all
  var total = 0
  var msg = null
  msg = receive(inbox)
  while (msg) {
    total += msg.data
    msg = receive(inbox)
  }

  return total
}

// Pipeline: chain of processors
function pipeline(stages, items) {
  var boxes = []
  var i = 0
  var j = 0
  var msg = null

  for (i = 0; i <= stages; i++) {
    push(boxes, make_mailbox())
  }

  // Feed input
  for (i = 0; i < items; i++) {
    send(boxes[0], {val: i})
  }

  // Process each stage
  for (j = 0; j < stages; j++) {
    msg = receive(boxes[j])
    while (msg) {
      send(boxes[j + 1], {val: msg.val * 2 + 1})
      msg = receive(boxes[j])
    }
  }

  // Drain output
  var total = 0
  msg = receive(boxes[stages])
  while (msg) {
    total += msg.val
    msg = receive(boxes[stages])
  }

  return total
}
|
||||
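The pipeline kernel moves every item through `stages` mailboxes, applying v -> 2*v + 1 at each hop before the caller sums the final box. A minimal Python sketch of the same dataflow (plain deques standing in for the mailboxes; the deque substitution is an assumption of this sketch, not the benchmark's runtime):

```python
from collections import deque

def pipeline(stages, items):
    # Stage j drains box j, applies v -> 2*v + 1, and feeds box j+1;
    # the result is the sum of whatever reaches the last box.
    boxes = [deque() for _ in range(stages + 1)]
    for i in range(items):
        boxes[0].append(i)
    for j in range(stages):
        while boxes[j]:
            boxes[j + 1].append(boxes[j].popleft() * 2 + 1)
    return sum(boxes[stages])

# After k stages an item v becomes v*2**k + (2**k - 1), so
# pipeline(2, 3) -> (0->3) + (1->7) + (2->11) = 21
```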
// Request-response pattern (simulates RPC)
function request_response(n_requests) {
    var client_box = make_mailbox()
    var server_box = make_mailbox()
    var i = 0
    var req = null
    var resp = null
    var total = 0

    for (i = 0; i < n_requests; i++) {
        // Client sends request
        send(server_box, {id: i, payload: i * 3, reply_to: client_box})

        // Server processes
        req = receive(server_box)
        if (req) {
            send(req.reply_to, {id: req.id, result: req.payload * 2 + 1})
        }

        // Client receives response
        resp = receive(client_box)
        if (resp) {
            total += resp.result
        }
    }

    return total
}

return {
    // Ping-pong: 10K rounds
    ping_pong_10k: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            x += ping_pong(10000)
        }
        return x
    },

    // Fan-out: 100 receivers, 100 messages each
    fan_out_100x100: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            x += fan_out(100, 100)
        }
        return x
    },

    // Fan-in: 100 senders, 100 messages each
    fan_in_100x100: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            x += fan_in(100, 100)
        }
        return x
    },

    // Pipeline: 10 stages, 1000 items
    pipeline_10x1k: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            x += pipeline(10, 1000)
        }
        return x
    },

    // Request-response: 5K requests
    rpc_5k: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            x += request_response(5000)
        }
        return x
    }
}
141
benches/cli_tool.cm
Normal file
@@ -0,0 +1,141 @@
// cli_tool.cm — CLI tool simulation (macro benchmark)
// Parse args + process data + transform + format output.
// Simulates a realistic small utility program.

var json = use('json')

// Generate fake records
function generate_records(n) {
    var records = []
    var x = 42
    var i = 0
    var status_vals = ["active", "inactive", "pending", "archived"]
    var dept_vals = ["eng", "sales", "ops", "hr", "marketing"]
    for (i = 0; i < n; i++) {
        x = ((x * 1103515245 + 12345) & 0x7FFFFFFF) | 0
        records[] = {
            id: i + 1,
            name: `user_${i}`,
            score: (x % 1000) / 10,
            status: status_vals[i % 4],
            department: dept_vals[i % 5]
        }
    }
    return records
}

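The generator's PRNG step is a linear congruential generator with the classic glibc-style constants, masked to 31 bits; a Python rendering of just that step shows the deterministic score stream it produces (function name `scores` is illustrative, not from the benchmark):

```python
def scores(seed=42, count=5):
    # Same step as generate_records: x = (x*1103515245 + 12345) & 0x7FFFFFFF.
    # Scores land in [0, 99.9] in steps of 0.1.
    x = seed
    out = []
    for _ in range(count):
        x = (x * 1103515245 + 12345) & 0x7FFFFFFF
        out.append((x % 1000) / 10)
    return out

# With the benchmark's seed of 42, the first score is 2.7.
```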
// Filter records by field value
function filter_records(records, field, value) {
    var result = []
    var i = 0
    for (i = 0; i < length(records); i++) {
        if (records[i][field] == value) {
            result[] = records[i]
        }
    }
    return result
}

// Group by a field
function group_by(records, field) {
    var groups = {}
    var i = 0
    var key = null
    for (i = 0; i < length(records); i++) {
        key = records[i][field]
        if (!key) key = "unknown"
        if (!groups[key]) groups[key] = []
        groups[key][] = records[i]
    }
    return groups
}

// Aggregate: compute stats per group
function aggregate(groups) {
    var keys = array(groups)
    var result = []
    var i = 0
    var j = 0
    var grp = null
    var total = 0
    var mn = 0
    var mx = 0
    for (i = 0; i < length(keys); i++) {
        grp = groups[keys[i]]
        total = 0
        mn = 999999
        mx = 0
        for (j = 0; j < length(grp); j++) {
            total += grp[j].score
            if (grp[j].score < mn) mn = grp[j].score
            if (grp[j].score > mx) mx = grp[j].score
        }
        result[] = {
            group: keys[i],
            count: length(grp),
            average: total / length(grp),
            low: mn,
            high: mx
        }
    }
    return result
}

// Full pipeline: load → filter → sort → group → aggregate → encode
function run_pipeline(n_records) {
    // Generate data
    var records = generate_records(n_records)

    // Filter to active records
    var filtered = filter_records(records, "status", "active")

    // Sort by score
    filtered = sort(filtered, "score")

    // Limit to first 50
    if (length(filtered) > 50) {
        filtered = array(filtered, 0, 50)
    }

    // Group and aggregate
    var groups = group_by(filtered, "department")
    var stats = aggregate(groups)
    stats = sort(stats, "average")

    // Encode as JSON
    var output = json.encode(stats)

    return length(output)
}

return {
    // Small dataset (100 records)
    cli_pipeline_100: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            x += run_pipeline(100)
        }
        return x
    },

    // Medium dataset (1000 records)
    cli_pipeline_1k: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            x += run_pipeline(1000)
        }
        return x
    },

    // Large dataset (10K records)
    cli_pipeline_10k: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            x += run_pipeline(10000)
        }
        return x
    }
}
162
benches/deltablue.cm
Normal file
@@ -0,0 +1,162 @@
// deltablue.cm — Constraint solver kernel (DeltaBlue-inspired)
// Dynamic dispatch, pointer chasing, object-heavy workload.

def REQUIRED = 0
def STRONG = 1
def NORMAL = 2
def WEAK = 3
def WEAKEST = 4

function make_variable(name, value) {
    return {
        name: name,
        value: value,
        constraints: [],
        determined_by: null,
        stay: true,
        mark: 0
    }
}

function make_constraint(strength, variables, satisfy_fn) {
    return {
        strength: strength,
        variables: variables,
        satisfy: satisfy_fn,
        output: null
    }
}

// Constraint propagation: simple forward solver
function propagate(vars, constraints) {
    var changed = true
    var passes = 0
    var max_passes = length(constraints) * 3
    var i = 0
    var c = null
    var old_val = 0

    while (changed && passes < max_passes) {
        changed = false
        passes++
        for (i = 0; i < length(constraints); i++) {
            c = constraints[i]
            old_val = c.output ? c.output.value : null
            c.satisfy(c)
            if (c.output && c.output.value != old_val) {
                changed = true
            }
        }
    }
    return passes
}

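The solver is a fixed-point iteration: re-run every constraint until a full pass changes no output, bounded by a pass budget. A hedged Python sketch of the same loop (the `(fn, cell)` pair representation is an assumption of this sketch; the benchmark uses richer constraint records):

```python
def propagate(constraints, max_passes):
    # A constraint is a pair (fn, cell): fn() computes the new value
    # for cell, a 1-item list used as a mutable box.
    passes = 0
    changed = True
    while changed and passes < max_passes:
        changed = False
        passes += 1
        for fn, cell in constraints:
            new = fn()
            if new != cell[0]:
                cell[0] = new
                changed = True
    return passes

# A 5-variable chain v[i] = v[i-1] + 1 seeded with v[0] = 1:
cells = [[0] for _ in range(5)]
cells[0][0] = 1
chain = [(lambda i=i: cells[i - 1][0] + 1, cells[i]) for i in range(1, 5)]
passes = propagate(chain, 3 * len(chain))
# One pass settles the chain (in-order evaluation), one confirms quiescence.
```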
// Build a chain of equality constraints: v[i] = v[i-1] + 1
function build_chain(n) {
    var vars = []
    var constraints = []
    var i = 0
    for (i = 0; i < n; i++) {
        vars[] = make_variable(`v${i}`, 0)
    }

    // Set first variable
    vars[0].value = 1

    var c = null
    for (i = 1; i < n; i++) {
        c = make_constraint(NORMAL, [vars[i - 1], vars[i]], function(self) {
            self.variables[1].value = self.variables[0].value + 1
            self.output = self.variables[1]
        })
        constraints[] = c
        vars[i].constraints[] = c
    }

    return {vars: vars, constraints: constraints}
}

// Build a projection: pairs of variables with scaling constraints
function build_projection(n) {
    var src = []
    var dst = []
    var constraints = []
    var i = 0
    for (i = 0; i < n; i++) {
        src[] = make_variable(`src${i}`, i * 10)
        dst[] = make_variable(`dst${i}`, 0)
    }

    var scale_c = null
    for (i = 0; i < n; i++) {
        scale_c = make_constraint(STRONG, [src[i], dst[i]], function(self) {
            self.variables[1].value = self.variables[0].value * 2 + 1
            self.output = self.variables[1]
        })
        constraints[] = scale_c
        dst[i].constraints[] = scale_c
    }

    return {src: src, dst: dst, constraints: constraints}
}

// Edit constraint: change a source, re-propagate
function run_edits(system, edits) {
    var i = 0
    var total_passes = 0
    for (i = 0; i < edits; i++) {
        system.vars[0].value = i
        total_passes += propagate(system.vars, system.constraints)
    }
    return total_passes
}

return {
    // Chain of 100 variables, propagate
    chain_100: function(n) {
        var i = 0
        var chain = null
        var x = 0
        for (i = 0; i < n; i++) {
            chain = build_chain(100)
            x += propagate(chain.vars, chain.constraints)
        }
        return x
    },

    // Chain of 500 variables, propagate
    chain_500: function(n) {
        var i = 0
        var chain = null
        var x = 0
        for (i = 0; i < n; i++) {
            chain = build_chain(500)
            x += propagate(chain.vars, chain.constraints)
        }
        return x
    },

    // Projection of 100 pairs
    projection_100: function(n) {
        var i = 0
        var proj = null
        var x = 0
        for (i = 0; i < n; i++) {
            proj = build_projection(100)
            x += propagate(proj.src, proj.constraints)
        }
        return x
    },

    // Edit and re-propagate (incremental update)
    chain_edit_100: function(n) {
        var chain = build_chain(100)
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            chain.vars[0].value = i
            x += propagate(chain.vars, chain.constraints)
        }
        return x
    }
}
405
benches/encoders.cm
Normal file
@@ -0,0 +1,405 @@
// encoders.cm — nota/wota/json encode+decode benchmark
// Isolates per-type bottlenecks across all three serializers.

var nota = use('internal/nota')
var wota = use('internal/wota')
var json = use('json')

// --- Test data shapes ---

// Small integers: fast path for all encoders
var integers_small = array(100, function(i) { return i + 1 })

// Floats: stresses nota's snprintf path
var floats_array = array(100, function(i) {
    return 3.14159 * (i + 1) + 0.00001 * i
})

// Short strings in records: per-string overhead, property enumeration
var strings_short = array(50, function(i) {
    var r = {}
    r[`k${i}`] = `value_${i}`
    return r
})

// Single long string: throughput test (wota byte loop, nota kim)
var long_str = ""
var li = 0
for (li = 0; li < 100; li++) {
    long_str = `${long_str}abcdefghijklmnopqrstuvwxyz0123456789ABCDEFGHIJKLMN`
}
var strings_long = long_str

// Unicode text: nota's kim encoding, wota's byte packing
var strings_unicode = "こんにちは世界 🌍🌎🌏 Ñoño café résumé naïve Ω∑∏ 你好世界"

// Nested records: cycle detection, property enumeration
function make_nested(depth, breadth) {
    var obj = {}
    var i = 0
    var k = null
    if (depth <= 0) {
        for (i = 0; i < breadth; i++) {
            k = `v${i}`
            obj[k] = i * 2.5
        }
        return obj
    }
    for (i = 0; i < breadth; i++) {
        k = `n${i}`
        obj[k] = make_nested(depth - 1, breadth)
    }
    return obj
}
var nested_records = make_nested(3, 4)

// Flat record: property enumeration cost
var flat_record = {}
var fi = 0
for (fi = 0; fi < 50; fi++) {
    flat_record[`prop_${fi}`] = fi * 1.1
}

// Mixed payload: realistic workload
var mixed_payload = array(50, function(i) {
    var r = {}
    r.id = i
    r.name = `item_${i}`
    r.active = i % 2 == 0
    r.score = i * 3.14
    r.tags = [`t${i % 5}`, `t${(i + 1) % 5}`]
    return r
})

// --- Pre-encode for decode benchmarks ---

var nota_enc_integers = nota.encode(integers_small)
var nota_enc_floats = nota.encode(floats_array)
var nota_enc_strings_short = nota.encode(strings_short)
var nota_enc_strings_long = nota.encode(strings_long)
var nota_enc_strings_unicode = nota.encode(strings_unicode)
var nota_enc_nested = nota.encode(nested_records)
var nota_enc_flat = nota.encode(flat_record)
var nota_enc_mixed = nota.encode(mixed_payload)

var wota_enc_integers = wota.encode(integers_small)
var wota_enc_floats = wota.encode(floats_array)
var wota_enc_strings_short = wota.encode(strings_short)
var wota_enc_strings_long = wota.encode(strings_long)
var wota_enc_strings_unicode = wota.encode(strings_unicode)
var wota_enc_nested = wota.encode(nested_records)
var wota_enc_flat = wota.encode(flat_record)
var wota_enc_mixed = wota.encode(mixed_payload)

var json_enc_integers = json.encode(integers_small)
var json_enc_floats = json.encode(floats_array)
var json_enc_strings_short = json.encode(strings_short)
var json_enc_strings_long = json.encode(strings_long)
var json_enc_strings_unicode = json.encode(strings_unicode)
var json_enc_nested = json.encode(nested_records)
var json_enc_flat = json.encode(flat_record)
var json_enc_mixed = json.encode(mixed_payload)

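The decode benchmarks below run against these pre-encoded buffers so that only decoding is on the clock. A hedged Python analogue of the harness shape using only the stdlib `json` module (`nota`/`wota` have no Python counterpart here), including the lossless round-trip check the comparison implicitly relies on:

```python
import json

def bench(fn, n):
    # Same shape as the benchmark entries: call fn n times and
    # return the last result so the work cannot be optimized away.
    r = None
    for _ in range(n):
        r = fn()
    return r

mixed = [{"id": i, "name": f"item_{i}", "active": i % 2 == 0,
          "score": i * 3.14, "tags": [f"t{i % 5}", f"t{(i + 1) % 5}"]}
         for i in range(50)]

enc = json.dumps(mixed)          # pre-encode once, outside the timed loop
decoded = bench(lambda: json.loads(enc), 10)
assert decoded == mixed          # round-trip is lossless for this payload
```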
// --- Benchmark functions ---

return {
    // NOTA encode
    nota_encode_integers: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.encode(integers_small) }
        return r
    },
    nota_encode_floats: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.encode(floats_array) }
        return r
    },
    nota_encode_strings_short: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.encode(strings_short) }
        return r
    },
    nota_encode_strings_long: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.encode(strings_long) }
        return r
    },
    nota_encode_strings_unicode: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.encode(strings_unicode) }
        return r
    },
    nota_encode_nested: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.encode(nested_records) }
        return r
    },
    nota_encode_flat: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.encode(flat_record) }
        return r
    },
    nota_encode_mixed: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.encode(mixed_payload) }
        return r
    },

    // NOTA decode
    nota_decode_integers: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.decode(nota_enc_integers) }
        return r
    },
    nota_decode_floats: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.decode(nota_enc_floats) }
        return r
    },
    nota_decode_strings_short: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.decode(nota_enc_strings_short) }
        return r
    },
    nota_decode_strings_long: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.decode(nota_enc_strings_long) }
        return r
    },
    nota_decode_strings_unicode: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.decode(nota_enc_strings_unicode) }
        return r
    },
    nota_decode_nested: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.decode(nota_enc_nested) }
        return r
    },
    nota_decode_flat: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.decode(nota_enc_flat) }
        return r
    },
    nota_decode_mixed: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = nota.decode(nota_enc_mixed) }
        return r
    },

    // WOTA encode
    wota_encode_integers: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.encode(integers_small) }
        return r
    },
    wota_encode_floats: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.encode(floats_array) }
        return r
    },
    wota_encode_strings_short: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.encode(strings_short) }
        return r
    },
    wota_encode_strings_long: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.encode(strings_long) }
        return r
    },
    wota_encode_strings_unicode: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.encode(strings_unicode) }
        return r
    },
    wota_encode_nested: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.encode(nested_records) }
        return r
    },
    wota_encode_flat: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.encode(flat_record) }
        return r
    },
    wota_encode_mixed: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.encode(mixed_payload) }
        return r
    },

    // WOTA decode
    wota_decode_integers: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.decode(wota_enc_integers) }
        return r
    },
    wota_decode_floats: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.decode(wota_enc_floats) }
        return r
    },
    wota_decode_strings_short: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.decode(wota_enc_strings_short) }
        return r
    },
    wota_decode_strings_long: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.decode(wota_enc_strings_long) }
        return r
    },
    wota_decode_strings_unicode: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.decode(wota_enc_strings_unicode) }
        return r
    },
    wota_decode_nested: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.decode(wota_enc_nested) }
        return r
    },
    wota_decode_flat: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.decode(wota_enc_flat) }
        return r
    },
    wota_decode_mixed: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = wota.decode(wota_enc_mixed) }
        return r
    },

    // JSON encode
    json_encode_integers: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.encode(integers_small) }
        return r
    },
    json_encode_floats: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.encode(floats_array) }
        return r
    },
    json_encode_strings_short: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.encode(strings_short) }
        return r
    },
    json_encode_strings_long: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.encode(strings_long) }
        return r
    },
    json_encode_strings_unicode: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.encode(strings_unicode) }
        return r
    },
    json_encode_nested: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.encode(nested_records) }
        return r
    },
    json_encode_flat: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.encode(flat_record) }
        return r
    },
    json_encode_mixed: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.encode(mixed_payload) }
        return r
    },

    // JSON decode
    json_decode_integers: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.decode(json_enc_integers) }
        return r
    },
    json_decode_floats: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.decode(json_enc_floats) }
        return r
    },
    json_decode_strings_short: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.decode(json_enc_strings_short) }
        return r
    },
    json_decode_strings_long: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.decode(json_enc_strings_long) }
        return r
    },
    json_decode_strings_unicode: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.decode(json_enc_strings_unicode) }
        return r
    },
    json_decode_nested: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.decode(json_enc_nested) }
        return r
    },
    json_decode_flat: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.decode(json_enc_flat) }
        return r
    },
    json_decode_mixed: function(n) {
        var i = 0
        var r = null
        for (i = 0; i < n; i++) { r = json.decode(json_enc_mixed) }
        return r
    }
}
126
benches/fibonacci.cm
Normal file
@@ -0,0 +1,126 @@
// fibonacci.cm — Fibonacci variants kernel
// Tests recursion overhead, memoization patterns, iteration vs recursion.

// Naive recursive (exponential) — measures call overhead
function fib_naive(n) {
    if (n <= 1) return n
    return fib_naive(n - 1) + fib_naive(n - 2)
}

// Iterative (linear)
function fib_iter(n) {
    var a = 0
    var b = 1
    var i = 0
    var tmp = 0
    for (i = 0; i < n; i++) {
        tmp = a + b
        a = b
        b = tmp
    }
    return a
}

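For reference, `fib_iter` computes F(n) with F(0) = 0; the same loop in Python (exact integers) pins down the expected values. One caveat worth noting: F(79) and above exceed 2^53, so if the host runtime stores numbers as IEEE doubles (an assumption, not stated in the file), the `fib_iter_80` result is necessarily approximate.

```python
def fib_iter(n):
    # Identical linear iteration: returns F(n) with F(0) = 0, F(1) = 1.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# fib_iter(80) == 23416728348467685, which is larger than 2**53,
# so an IEEE-double runtime cannot represent it exactly.
```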
// Memoized recursive (tests object property lookup + recursion)
function make_memo_fib() {
    var cache = {}
    var fib = function(n) {
        var key = text(n)
        if (cache[key]) return cache[key]
        var result = null
        if (n <= 1) {
            result = n
        } else {
            result = fib(n - 1) + fib(n - 2)
        }
        cache[key] = result
        return result
    }
    return fib
}

// CPS (continuation passing style) — tests closure creation
function fib_cps(n, cont) {
    if (n <= 1) return cont(n)
    return fib_cps(n - 1, function(a) {
        return fib_cps(n - 2, function(b) {
            return cont(a + b)
        })
    })
}

// Matrix exponentiation style (accumulator)
function fib_matrix(n) {
    var a = 1
    var b = 0
    var c = 0
    var d = 1
    var ta = 0
    var tb = 0
    var m = n
    while (m > 0) {
        if (m % 2 == 1) {
            ta = a * d + b * c // wrong but stresses numeric ops
            tb = b * d + a * c
            a = ta
            b = tb
        }
        ta = c * c + d * d
        tb = d * (2 * c + d)
        c = ta
        d = tb
        m = floor(m / 2)
    }
    return b
}

return {
    fib_naive_25: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) x += fib_naive(25)
        return x
    },

    fib_naive_30: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) x += fib_naive(30)
        return x
    },

    fib_iter_80: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) x += fib_iter(80)
        return x
    },

    fib_memo_100: function(n) {
        var i = 0
        var x = 0
        var fib = null
        for (i = 0; i < n; i++) {
            fib = make_memo_fib()
            x += fib(100)
        }
        return x
    },

    fib_cps_20: function(n) {
        var i = 0
        var x = 0
        var identity = function(v) { return v }
        for (i = 0; i < n; i++) {
            x += fib_cps(20, identity)
        }
        return x
    },

    fib_matrix_80: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) x += fib_matrix(80)
        return x
    }
}
159
benches/hash_workload.cm
Normal file
@@ -0,0 +1,159 @@
// hash_workload.cm — Hash-heavy / word-count / map-reduce kernel
// Stresses record (object) creation, property access, and string handling.

function make_words(count) {
    // Generate a repeating word list to simulate text processing
    var base_words = [
        "the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog",
        "and", "cat", "sat", "on", "mat", "with", "hat", "bat",
        "alpha", "beta", "gamma", "delta", "epsilon", "zeta", "eta", "theta",
        "hello", "world", "foo", "bar", "baz", "qux", "quux", "corge"
    ]
    var words = []
    var i = 0
    for (i = 0; i < count; i++) {
        words[] = base_words[i % length(base_words)]
    }
    return words
}

// Word frequency count
function word_count(words) {
    var freq = {}
    var i = 0
    var w = null
    for (i = 0; i < length(words); i++) {
        w = words[i]
        if (freq[w]) {
            freq[w] = freq[w] + 1
        } else {
            freq[w] = 1
        }
    }
    return freq
}

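`word_count` plus the `top_n` helper below together compute a most-frequent list; in Python the same pair collapses to `collections.Counter` (shown here as a reference implementation, not the benchmark's code path):

```python
from collections import Counter

def top_words(words, n):
    # Frequency map + top-n extraction in one step, ordered
    # most-frequent first (the benchmark sorts ascending and
    # takes the tail, which yields the same set).
    return Counter(words).most_common(n)
```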
// Find top-N words by frequency
function top_n(freq, n) {
    var keys = array(freq)
    var pairs = []
    var i = 0
    for (i = 0; i < length(keys); i++) {
        pairs[] = {word: keys[i], count: freq[keys[i]]}
    }
    var sorted = sort(pairs, "count")
    // Return last N (highest counts)
    var result = []
    var start = length(sorted) - n
    if (start < 0) start = 0
    for (i = start; i < length(sorted); i++) {
        result[] = sorted[i]
    }
    return result
}

// Histogram: group words by length
function group_by_length(words) {
    var groups = {}
    var i = 0
    var w = null
    var k = null
    for (i = 0; i < length(words); i++) {
        w = words[i]
        k = text(length(w))
        if (!groups[k]) groups[k] = []
        groups[k][] = w
    }
    return groups
}

// Simple hash table workload (stresses property access patterns)
function hash_table_ops(n) {
    var table = {}
    var i = 0
    var k = null
    var collisions = 0

    // Insert phase
    for (i = 0; i < n; i++) {
        k = `key_${i % 512}`
        if (table[k]) collisions++
        table[k] = i
    }

    // Lookup phase
    var found = 0
    for (i = 0; i < n; i++) {
        k = `key_${i % 512}`
        if (table[k]) found++
    }

    // Delete phase
    var deleted = 0
    for (i = 0; i < n; i += 3) {
        k = `key_${i % 512}`
        if (table[k]) {
            delete table[k]
            deleted++
        }
    }

    return found - deleted + collisions
}

var words_1k = make_words(1000)
var words_10k = make_words(10000)

return {
    // Word count on 1K words
    wordcount_1k: function(n) {
        var i = 0
        var freq = null
        for (i = 0; i < n; i++) {
            freq = word_count(words_1k)
        }
        return freq
    },

    // Word count on 10K words
    wordcount_10k: function(n) {
        var i = 0
        var freq = null
        for (i = 0; i < n; i++) {
            freq = word_count(words_10k)
        }
        return freq
    },

    // Word count + top-10 extraction
    wordcount_top10: function(n) {
        var i = 0
        var freq = null
        var top = null
        for (i = 0; i < n; i++) {
            freq = word_count(words_10k)
            top = top_n(freq, 10)
        }
        return top
    },

    // Group words by length
    group_by_len: function(n) {
        var i = 0
        var groups = null
        for (i = 0; i < n; i++) {
            groups = group_by_length(words_10k)
        }
        return groups
    },

    // Hash table insert/lookup/delete
    hash_table: function(n) {
        var i = 0
        var x = 0
        for (i = 0; i < n; i++) {
            x += hash_table_ops(2048)
        }
        return x
    }
}
167
benches/json_walk.cm
Normal file
@@ -0,0 +1,167 @@
// json_walk.cm — JSON parse + walk + serialize kernel
// Stresses strings, records, arrays, and recursive traversal.

var json = use('json')

function make_nested_object(depth, breadth) {
    var obj = {}
    var i = 0
    var k = null
    if (depth <= 0) {
        for (i = 0; i < breadth; i++) {
            k = `key_${i}`
            obj[k] = i * 3.14
        }
        return obj
    }
    for (i = 0; i < breadth; i++) {
        k = `node_${i}`
        obj[k] = make_nested_object(depth - 1, breadth)
    }
    obj.value = depth
    obj.name = `level_${depth}`
    return obj
}

function make_array_data(size) {
    var arr = []
    var i = 0
    for (i = 0; i < size; i++) {
        arr[] = {
            id: i,
            name: `item_${i}`,
            active: i % 2 == 0,
            score: i * 1.5,
            tags: [`tag_${i % 5}`, `tag_${(i + 1) % 5}`]
        }
    }
    return arr
}

// Walk an object tree, counting nodes
function walk_count(obj) {
    var count = 1
    var keys = null
    var i = 0
    var v = null
    if (is_object(obj)) {
        keys = array(obj)
        for (i = 0; i < length(keys); i++) {
            v = obj[keys[i]]
            if (is_object(v) || is_array(v)) {
                count += walk_count(v)
            }
        }
    } else if (is_array(obj)) {
        for (i = 0; i < length(obj); i++) {
            v = obj[i]
            if (is_object(v) || is_array(v)) {
                count += walk_count(v)
            }
        }
    }
    return count
}

// Walk and sum all numbers
function walk_sum(obj) {
    var sum = 0
    var keys = null
    var i = 0
    var v = null
    if (is_object(obj)) {
        keys = array(obj)
        for (i = 0; i < length(keys); i++) {
            v = obj[keys[i]]
            if (is_number(v)) {
                sum += v
            } else if (is_object(v) || is_array(v)) {
                sum += walk_sum(v)
            }
        }
    } else if (is_array(obj)) {
        for (i = 0; i < length(obj); i++) {
            v = obj[i]
            if (is_number(v)) {
                sum += v
            } else if (is_object(v) || is_array(v)) {
                sum += walk_sum(v)
            }
        }
    }
    return sum
}

// Pre-build test data strings
|
||||
var nested_obj = make_nested_object(3, 4)
|
||||
var nested_json = json.encode(nested_obj)
|
||||
var array_data = make_array_data(200)
|
||||
var array_json = json.encode(array_data)
|
||||
|
||||
return {
|
||||
// Parse nested JSON
|
||||
json_parse_nested: function(n) {
|
||||
var i = 0
|
||||
var obj = null
|
||||
for (i = 0; i < n; i++) {
|
||||
obj = json.decode(nested_json)
|
||||
}
|
||||
return obj
|
||||
},
|
||||
|
||||
// Parse array-of-records JSON
|
||||
json_parse_array: function(n) {
|
||||
var i = 0
|
||||
var arr = null
|
||||
for (i = 0; i < n; i++) {
|
||||
arr = json.decode(array_json)
|
||||
}
|
||||
return arr
|
||||
},
|
||||
|
||||
// Encode nested object to JSON
|
||||
json_encode_nested: function(n) {
|
||||
var i = 0
|
||||
var s = null
|
||||
for (i = 0; i < n; i++) {
|
||||
s = json.encode(nested_obj)
|
||||
}
|
||||
return s
|
||||
},
|
||||
|
||||
// Encode array to JSON
|
||||
json_encode_array: function(n) {
|
||||
var i = 0
|
||||
var s = null
|
||||
for (i = 0; i < n; i++) {
|
||||
s = json.encode(array_data)
|
||||
}
|
||||
return s
|
||||
},
|
||||
|
||||
// Parse + walk + count
|
||||
json_roundtrip_walk: function(n) {
|
||||
var i = 0
|
||||
var obj = null
|
||||
var count = 0
|
||||
for (i = 0; i < n; i++) {
|
||||
obj = json.decode(nested_json)
|
||||
count += walk_count(obj)
|
||||
}
|
||||
return count
|
||||
},
|
||||
|
||||
// Parse + sum all numbers + re-encode
|
||||
json_roundtrip_full: function(n) {
|
||||
var i = 0
|
||||
var obj = null
|
||||
var sum = 0
|
||||
var out = null
|
||||
for (i = 0; i < n; i++) {
|
||||
obj = json.decode(array_json)
|
||||
sum += walk_sum(obj)
|
||||
out = json.encode(obj)
|
||||
}
|
||||
return sum
|
||||
}
|
||||
}
|
||||
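The recursive `walk_sum` traversal above can be cross-checked with a short Python sketch. This is an illustration only, not part of the diff: it assumes JSON-like nested dicts and lists, and (unlike the `.cm` kernel, whose `is_number` semantics are not specified here) it explicitly skips booleans.

```python
def walk_sum(node):
    """Recursively sum every number in a nested dict/list structure,
    mirroring the walk_sum traversal in json_walk.cm."""
    if isinstance(node, bool):
        # bool is an int subclass in Python; exclude flags like "active"
        return 0
    if isinstance(node, (int, float)):
        return node
    if isinstance(node, dict):
        return sum(walk_sum(v) for v in node.values())
    if isinstance(node, list):
        return sum(walk_sum(v) for v in node)
    return 0  # strings, None, etc. contribute nothing
```

Because every branch either returns a number or recurses into a strictly smaller container, the walk terminates on any acyclic JSON value.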
302
benches/micro_core.cm
Normal file
@@ -0,0 +1,302 @@
// micro_core.cm — direct microbenchmarks for core ops

function blackhole(sink, x) {
    return (sink + (x | 0)) | 0
}

function make_obj_xy(x, y) {
    return {x: x, y: y}
}

function make_obj_yx(x, y) {
    // Different insertion order to force a different shape
    return {y: y, x: x}
}

function make_packed_array(n) {
    var a = []
    var i = 0
    for (i = 0; i < n; i++) a[] = i
    return a
}

function make_holey_array(n) {
    var a = []
    var i = 0
    for (i = 0; i < n; i += 2) a[i] = i
    return a
}

return {
    loop_empty: function(n) {
        var sink = 0
        var i = 0
        for (i = 0; i < n; i++) {}
        return blackhole(sink, n)
    },

    i32_add: function(n) {
        var sink = 0
        var x = 1
        var i = 0
        for (i = 0; i < n; i++) x = (x + 3) | 0
        return blackhole(sink, x)
    },

    f64_add: function(n) {
        var sink = 0
        var x = 1.0
        var i = 0
        for (i = 0; i < n; i++) x = x + 3.14159
        return blackhole(sink, x | 0)
    },

    mixed_add: function(n) {
        var sink = 0
        var x = 1
        var i = 0
        for (i = 0; i < n; i++) x = x + 0.25
        return blackhole(sink, x | 0)
    },

    bit_ops: function(n) {
        var sink = 0
        var x = 0x12345678
        var i = 0
        for (i = 0; i < n; i++) x = ((x << 5) ^ (x >>> 3)) | 0
        return blackhole(sink, x)
    },

    overflow_path: function(n) {
        var sink = 0
        var x = 0x70000000
        var i = 0
        for (i = 0; i < n; i++) x = (x + 0x10000000) | 0
        return blackhole(sink, x)
    },

    call_direct: function(n) {
        var sink = 0
        var f = function(a) { return (a + 1) | 0 }
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = f(x)
        return blackhole(sink, x)
    },

    call_indirect: function(n) {
        var sink = 0
        var f = function(a) { return (a + 1) | 0 }
        var g = f
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = g(x)
        return blackhole(sink, x)
    },

    call_closure: function(n) {
        var sink = 0
        var make_adder = function(k) {
            return function(a) { return (a + k) | 0 }
        }
        var add3 = make_adder(3)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = add3(x)
        return blackhole(sink, x)
    },

    array_read_packed: function(n) {
        var sink = 0
        var a = make_packed_array(1024)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = (x + a[i & 1023]) | 0
        return blackhole(sink, x)
    },

    array_write_packed: function(n) {
        var sink = 0
        var a = make_packed_array(1024)
        var i = 0
        for (i = 0; i < n; i++) a[i & 1023] = i
        return blackhole(sink, a[17] | 0)
    },

    array_read_holey: function(n) {
        var sink = 0
        var a = make_holey_array(2048)
        var x = 0
        var i = 0
        var v = null
        for (i = 0; i < n; i++) {
            v = a[(i & 2047)]
            if (v) x = (x + v) | 0
        }
        return blackhole(sink, x)
    },

    array_push_steady: function(n) {
        var sink = 0
        var x = 0
        var j = 0
        var i = 0
        var a = null
        for (j = 0; j < n; j++) {
            a = []
            for (i = 0; i < 256; i++) a[] = i
            x = (x + length(a)) | 0
        }
        return blackhole(sink, x)
    },

    array_indexed_sum: function(n) {
        var sink = 0
        var a = make_packed_array(1024)
        var x = 0
        var j = 0
        var i = 0
        for (j = 0; j < n; j++) {
            x = 0
            for (i = 0; i < 1024; i++) {
                x = (x + a[i]) | 0
            }
        }
        return blackhole(sink, x)
    },

    prop_read_mono: function(n) {
        var sink = 0
        var o = make_obj_xy(1, 2)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = (x + o.x) | 0
        return blackhole(sink, x)
    },

    prop_read_poly_2: function(n) {
        var sink = 0
        var a = make_obj_xy(1, 2)
        var b = make_obj_yx(1, 2)
        var x = 0
        var i = 0
        var o = null
        for (i = 0; i < n; i++) {
            o = (i & 1) == 0 ? a : b
            x = (x + o.x) | 0
        }
        return blackhole(sink, x)
    },

    prop_read_poly_4: function(n) {
        var sink = 0
        var shapes = [
            {x: 1, y: 2},
            {y: 2, x: 1},
            {x: 1, z: 3, y: 2},
            {w: 0, x: 1, y: 2}
        ]
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            x = (x + shapes[i & 3].x) | 0
        }
        return blackhole(sink, x)
    },

    string_concat_small: function(n) {
        var sink = 0
        var x = 0
        var j = 0
        var i = 0
        var s = null
        for (j = 0; j < n; j++) {
            s = ""
            for (i = 0; i < 16; i++) s = s + "x"
            x = (x + length(s)) | 0
        }
        return blackhole(sink, x)
    },

    string_concat_medium: function(n) {
        var sink = 0
        var x = 0
        var j = 0
        var i = 0
        var s = null
        for (j = 0; j < n; j++) {
            s = ""
            for (i = 0; i < 100; i++) s = s + "abcdefghij"
            x = (x + length(s)) | 0
        }
        return blackhole(sink, x)
    },

    string_slice: function(n) {
        var sink = 0
        var base = "the quick brown fox jumps over the lazy dog"
        var x = 0
        var i = 0
        var s = null
        for (i = 0; i < n; i++) {
            s = text(base, i % 10, i % 10 + 10)
            x = (x + length(s)) | 0
        }
        return blackhole(sink, x)
    },

    guard_hot_number: function(n) {
        var sink = 0
        var x = 1
        var i = 0
        for (i = 0; i < n; i++) x = x + 1
        return blackhole(sink, x | 0)
    },

    guard_mixed_types: function(n) {
        var sink = 0
        var vals = [1, "a", 2, "b", 3, "c", 4, "d"]
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            if (is_number(vals[i & 7])) x = (x + vals[i & 7]) | 0
        }
        return blackhole(sink, x)
    },

    reduce_sum: function(n) {
        var sink = 0
        var a = make_packed_array(256)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            x = (x + reduce(a, function(acc, v) { return acc + v }, 0)) | 0
        }
        return blackhole(sink, x)
    },

    filter_evens: function(n) {
        var sink = 0
        var a = make_packed_array(256)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            x = (x + length(filter(a, function(v) { return v % 2 == 0 }))) | 0
        }
        return blackhole(sink, x)
    },

    arrfor_sum: function(n) {
        var sink = 0
        var a = make_packed_array(256)
        var x = 0
        var i = 0
        var sum = 0
        for (i = 0; i < n; i++) {
            sum = 0
            arrfor(a, function(v) { sum += v })
            x = (x + sum) | 0
        }
        return blackhole(sink, x)
    }
}
494
benches/micro_ops.cm
Normal file
@@ -0,0 +1,494 @@
// micro_ops.cm — microbenchmarks for core operations

function blackhole(sink, x) {
    return (sink + (x | 0)) | 0
}

function make_obj_xy(x, y) {
    return {x: x, y: y}
}

function make_obj_yx(x, y) {
    // Different insertion order to force a different shape
    return {y: y, x: x}
}

function make_shapes(n) {
    var out = []
    var i = 0
    var o = null
    for (i = 0; i < n; i++) {
        o = {a: i}
        o[`p${i}`] = i
        push(out, o)
    }
    return out
}

function make_packed_array(n) {
    var a = []
    var i = 0
    for (i = 0; i < n; i++) push(a, i)
    return a
}

function make_holey_array(n) {
    var a = []
    var i = 0
    for (i = 0; i < n; i += 2) a[i] = i
    return a
}

return {
    // 0) Baseline loop cost
    loop_empty: function(n) {
        var sink = 0
        var i = 0
        for (i = 0; i < n; i++) {}
        return blackhole(sink, n)
    },

    // 1) Numeric pipelines
    i32_add: function(n) {
        var sink = 0
        var x = 1
        var i = 0
        for (i = 0; i < n; i++) x = (x + 3) | 0
        return blackhole(sink, x)
    },

    f64_add: function(n) {
        var sink = 0
        var x = 1.0
        var i = 0
        for (i = 0; i < n; i++) x = x + 3.14159
        return blackhole(sink, x | 0)
    },

    mixed_add: function(n) {
        var sink = 0
        var x = 1
        var i = 0
        for (i = 0; i < n; i++) x = x + 0.25
        return blackhole(sink, x | 0)
    },

    bit_ops: function(n) {
        var sink = 0
        var x = 0x12345678
        var i = 0
        for (i = 0; i < n; i++) x = ((x << 5) ^ (x >>> 3)) | 0
        return blackhole(sink, x)
    },

    overflow_path: function(n) {
        var sink = 0
        var x = 0x70000000
        var i = 0
        for (i = 0; i < n; i++) x = (x + 0x10000000) | 0
        return blackhole(sink, x)
    },

    // 2) Branching
    branch_predictable: function(n) {
        var sink = 0
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            if ((i & 7) != 0) x++
            else x += 2
        }
        return blackhole(sink, x)
    },

    branch_alternating: function(n) {
        var sink = 0
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            if ((i & 1) == 0) x++
            else x += 2
        }
        return blackhole(sink, x)
    },

    // 3) Calls
    call_direct: function(n) {
        var sink = 0
        var f = function(a) { return (a + 1) | 0 }
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = f(x)
        return blackhole(sink, x)
    },

    call_indirect: function(n) {
        var sink = 0
        var f = function(a) { return (a + 1) | 0 }
        var g = f
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = g(x)
        return blackhole(sink, x)
    },

    call_closure: function(n) {
        var sink = 0
        var make_adder = function(k) {
            return function(a) { return (a + k) | 0 }
        }
        var add3 = make_adder(3)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = add3(x)
        return blackhole(sink, x)
    },

    call_multi_arity: function(n) {
        var sink = 0
        var f0 = function() { return 1 }
        var f1 = function(a) { return a + 1 }
        var f2 = function(a, b) { return a + b }
        var f3 = function(a, b, c) { return a + b + c }
        var f4 = function(a, b, c, d) { return a + b + c + d }
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            x = (x + f0() + f1(i) + f2(i, 1) + f3(i, 1, 2) + f4(i, 1, 2, 3)) | 0
        }
        return blackhole(sink, x)
    },

    // 4) Object props (ICs / shapes)
    prop_read_mono: function(n) {
        var sink = 0
        var o = make_obj_xy(1, 2)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = (x + o.x) | 0
        return blackhole(sink, x)
    },

    prop_read_poly_2: function(n) {
        var sink = 0
        var a = make_obj_xy(1, 2)
        var b = make_obj_yx(1, 2)
        var x = 0
        var i = 0
        var o = null
        for (i = 0; i < n; i++) {
            o = (i & 1) == 0 ? a : b
            x = (x + o.x) | 0
        }
        return blackhole(sink, x)
    },

    prop_read_poly_4: function(n) {
        var sink = 0
        var shapes = [
            {x: 1, y: 2},
            {y: 2, x: 1},
            {x: 1, z: 3, y: 2},
            {w: 0, x: 1, y: 2}
        ]
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            x = (x + shapes[i & 3].x) | 0
        }
        return blackhole(sink, x)
    },

    prop_read_mega: function(n) {
        var sink = 0
        var objs = make_shapes(32)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            x = (x + objs[i & 31].a) | 0
        }
        return blackhole(sink, x)
    },

    prop_write_mono: function(n) {
        var sink = 0
        var o = make_obj_xy(1, 2)
        var i = 0
        for (i = 0; i < n; i++) o.x = (o.x + 1) | 0
        return blackhole(sink, o.x)
    },

    // 5) Arrays
    array_read_packed: function(n) {
        var sink = 0
        var a = make_packed_array(1024)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) x = (x + a[i & 1023]) | 0
        return blackhole(sink, x)
    },

    array_write_packed: function(n) {
        var sink = 0
        var a = make_packed_array(1024)
        var i = 0
        for (i = 0; i < n; i++) a[i & 1023] = i
        return blackhole(sink, a[17] | 0)
    },

    array_read_holey: function(n) {
        var sink = 0
        var a = make_holey_array(2048)
        var x = 0
        var i = 0
        var v = null
        for (i = 0; i < n; i++) {
            v = a[(i & 2047)]
            if (v) x = (x + v) | 0
        }
        return blackhole(sink, x)
    },

    array_push_steady: function(n) {
        var sink = 0
        var x = 0
        var j = 0
        var i = 0
        var a = null
        for (j = 0; j < n; j++) {
            a = []
            for (i = 0; i < 256; i++) push(a, i)
            x = (x + length(a)) | 0
        }
        return blackhole(sink, x)
    },

    array_push_pop: function(n) {
        var sink = 0
        var a = []
        var x = 0
        var i = 0
        var v = 0
        for (i = 0; i < n; i++) {
            push(a, i)
            if (length(a) > 64) {
                v = a[]
                x = (x + v) | 0
            }
        }
        return blackhole(sink, x)
    },

    array_indexed_sum: function(n) {
        var sink = 0
        var a = make_packed_array(1024)
        var x = 0
        var j = 0
        var i = 0
        for (j = 0; j < n; j++) {
            x = 0
            for (i = 0; i < 1024; i++) {
                x = (x + a[i]) | 0
            }
        }
        return blackhole(sink, x)
    },

    // 6) Strings
    string_concat_small: function(n) {
        var sink = 0
        var x = 0
        var j = 0
        var i = 0
        var s = null
        for (j = 0; j < n; j++) {
            s = ""
            for (i = 0; i < 16; i++) s = s + "x"
            x = (x + length(s)) | 0
        }
        return blackhole(sink, x)
    },

    string_concat_medium: function(n) {
        var sink = 0
        var x = 0
        var j = 0
        var i = 0
        var s = null
        for (j = 0; j < n; j++) {
            s = ""
            for (i = 0; i < 100; i++) s = s + "abcdefghij"
            x = (x + length(s)) | 0
        }
        return blackhole(sink, x)
    },

    string_interpolation: function(n) {
        var sink = 0
        var x = 0
        var i = 0
        var s = null
        for (i = 0; i < n; i++) {
            s = `item_${i}_value_${i * 2}`
            x = (x + length(s)) | 0
        }
        return blackhole(sink, x)
    },

    string_slice: function(n) {
        var sink = 0
        var base = "the quick brown fox jumps over the lazy dog"
        var x = 0
        var i = 0
        var s = null
        for (i = 0; i < n; i++) {
            s = text(base, i % 10, i % 10 + 10)
            x = (x + length(s)) | 0
        }
        return blackhole(sink, x)
    },

    // 7) Allocation / GC pressure
    alloc_tiny_objects: function(n) {
        var sink = 0
        var x = 0
        var i = 0
        var o = null
        for (i = 0; i < n; i++) {
            o = {a: i, b: i + 1, c: i + 2}
            x = (x + o.b) | 0
        }
        return blackhole(sink, x)
    },

    alloc_linked_list: function(n) {
        var sink = 0
        var head = null
        var i = 0
        var x = 0
        var p = null
        for (i = 0; i < n; i++) head = {v: i, next: head}
        x = 0
        p = head
        while (p) {
            x = (x + p.v) | 0
            p = p.next
        }
        return blackhole(sink, x)
    },

    alloc_arrays: function(n) {
        var sink = 0
        var x = 0
        var i = 0
        var a = null
        for (i = 0; i < n; i++) {
            a = [i, i + 1, i + 2, i + 3]
            x = (x + a[2]) | 0
        }
        return blackhole(sink, x)
    },

    alloc_short_lived: function(n) {
        var sink = 0
        var x = 0
        var i = 0
        var o = null
        // Allocate objects that immediately become garbage
        for (i = 0; i < n; i++) {
            o = {val: i, data: {inner: i + 1}}
            x = (x + o.data.inner) | 0
        }
        return blackhole(sink, x)
    },

    alloc_long_lived_pressure: function(n) {
        var sink = 0
        var store = []
        var x = 0
        var i = 0
        var o = null
        // Keep first 1024 objects alive, churn the rest
        for (i = 0; i < n; i++) {
            o = {val: i, data: i * 2}
            if (i < 1024) {
                push(store, o)
            }
            x = (x + o.data) | 0
        }
        return blackhole(sink, x)
    },

    // 8) Meme (prototype clone)
    meme_clone_read: function(n) {
        var sink = 0
        var base = {x: 1, y: 2}
        var x = 0
        var i = 0
        var o = null
        for (i = 0; i < n; i++) {
            o = meme(base)
            x = (x + o.x) | 0
        }
        return blackhole(sink, x)
    },

    // 9) Guard / type check paths
    guard_hot_number: function(n) {
        // Monomorphic number path — guards should hoist
        var sink = 0
        var x = 1
        var i = 0
        for (i = 0; i < n; i++) x = x + 1
        return blackhole(sink, x | 0)
    },

    guard_mixed_types: function(n) {
        // Alternating number/text — guards must stay
        var sink = 0
        var vals = [1, "a", 2, "b", 3, "c", 4, "d"]
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            if (is_number(vals[i & 7])) x = (x + vals[i & 7]) | 0
        }
        return blackhole(sink, x)
    },

    // 10) Reduce / higher-order
    reduce_sum: function(n) {
        var sink = 0
        var a = make_packed_array(256)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            x = (x + reduce(a, function(acc, v) { return acc + v }, 0)) | 0
        }
        return blackhole(sink, x)
    },

    filter_evens: function(n) {
        var sink = 0
        var a = make_packed_array(256)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            x = (x + length(filter(a, function(v) { return v % 2 == 0 }))) | 0
        }
        return blackhole(sink, x)
    },

    arrfor_sum: function(n) {
        var sink = 0
        var a = make_packed_array(256)
        var x = 0
        var i = 0
        var sum = 0
        for (i = 0; i < n; i++) {
            sum = 0
            arrfor(a, function(v) { sum += v })
            x = (x + sum) | 0
        }
        return blackhole(sink, x)
    }
}
249
benches/module_load.cm
Normal file
@@ -0,0 +1,249 @@
// module_load.cm — Module loading simulation (macro benchmark)
// Simulates parsing many small modules, linking, and running.
// Tests the "build scenario" pattern.

var json = use('json')

// Simulate a small module: parse token stream + build AST + evaluate
function tokenize(src) {
    var tokens = []
    var i = 0
    var ch = null
    var chars = array(src)
    var buf = ""

    for (i = 0; i < length(chars); i++) {
        ch = chars[i]
        if (ch == " " || ch == "\n" || ch == "\t") {
            if (length(buf) > 0) {
                tokens[] = buf
                buf = ""
            }
        } else if (ch == "(" || ch == ")" || ch == "+" || ch == "-"
                || ch == "*" || ch == "=" || ch == ";" || ch == ",") {
            if (length(buf) > 0) {
                tokens[] = buf
                buf = ""
            }
            tokens[] = ch
        } else {
            buf = buf + ch
        }
    }
    if (length(buf) > 0) tokens[] = buf
    return tokens
}

// Build a simple AST from tokens
function parse_tokens(tokens) {
    var ast = []
    var i = 0
    var tok = null
    var node = null
    for (i = 0; i < length(tokens); i++) {
        tok = tokens[i]
        if (tok == "var" || tok == "def") {
            node = {type: "decl", kind: tok, name: null, value: null}
            i++
            if (i < length(tokens)) node.name = tokens[i]
            i++ // skip =
            i++
            if (i < length(tokens)) node.value = tokens[i]
            ast[] = node
        } else if (tok == "return") {
            node = {type: "return", value: null}
            i++
            if (i < length(tokens)) node.value = tokens[i]
            ast[] = node
        } else if (tok == "function") {
            node = {type: "func", name: null, body: []}
            i++
            if (i < length(tokens)) node.name = tokens[i]
            // Skip to matching )
            while (i < length(tokens) && tokens[i] != ")") i++
            ast[] = node
        } else {
            ast[] = {type: "expr", value: tok}
        }
    }
    return ast
}

// Evaluate: simple symbol table + resolution
function evaluate(ast, env) {
    var result = null
    var i = 0
    var node = null
    for (i = 0; i < length(ast); i++) {
        node = ast[i]
        if (node.type == "decl") {
            env[node.name] = node.value
        } else if (node.type == "return") {
            result = node.value
            if (env[result]) result = env[result]
        } else if (node.type == "func") {
            env[node.name] = node
        }
    }
    return result
}

// Generate fake module source code
function generate_module(id, dep_count) {
    var src = ""
    var i = 0
    src = src + "var _id = " + text(id) + ";\n"
    for (i = 0; i < dep_count; i++) {
        src = src + "var dep" + text(i) + " = use(mod_" + text(i) + ");\n"
    }
    src = src + "var x = " + text(id * 17) + ";\n"
    src = src + "var y = " + text(id * 31) + ";\n"
    src = src + "function compute(a, b) { return a + b; }\n"
    src = src + "var result = compute(x, y);\n"
    src = src + "return result;\n"
    return src
}

// Simulate loading N modules with dependency chains
function simulate_build(n_modules, deps_per_module) {
    var modules = []
    var loaded = {}
    var i = 0
    var j = 0
    var src = null
    var tokens = null
    var ast = null
    var env = null
    var result = null
    var total_tokens = 0
    var total_nodes = 0

    // Generate all module sources
    for (i = 0; i < n_modules; i++) {
        src = generate_module(i, deps_per_module)
        modules[] = src
    }

    // "Load" each module: tokenize → parse → evaluate
    for (i = 0; i < n_modules; i++) {
        tokens = tokenize(modules[i])
        total_tokens += length(tokens)

        ast = parse_tokens(tokens)
        total_nodes += length(ast)

        env = {}
        // Resolve dependencies
        for (j = 0; j < deps_per_module; j++) {
            if (j < i) {
                env["dep" + text(j)] = loaded["mod_" + text(j)]
            }
        }

        result = evaluate(ast, env)
        loaded["mod_" + text(i)] = result
    }

    return {
        modules: n_modules,
        total_tokens: total_tokens,
        total_nodes: total_nodes,
        last_result: result
    }
}

// Dependency graph analysis (topological sort simulation)
function topo_sort(n_modules, deps_per_module) {
    // Build adjacency list
    var adj = {}
    var in_degree = {}
    var i = 0
    var j = 0
    var name = null
    var dep = null

    for (i = 0; i < n_modules; i++) {
        name = "mod_" + text(i)
        adj[name] = []
        in_degree[name] = 0
    }

    for (i = 0; i < n_modules; i++) {
        name = "mod_" + text(i)
        for (j = 0; j < deps_per_module; j++) {
            if (j < i) {
                dep = "mod_" + text(j)
                adj[dep][] = name
                in_degree[name] = in_degree[name] + 1
            }
        }
    }

    // Kahn's algorithm
    var queue = []
    var keys = array(in_degree)
    for (i = 0; i < length(keys); i++) {
        if (in_degree[keys[i]] == 0) queue[] = keys[i]
    }

    var order = []
    var current = null
    var neighbors = null
    var qi = 0
    while (qi < length(queue)) {
        current = queue[qi]
        qi++
        order[] = current
        neighbors = adj[current]
        if (neighbors) {
            for (i = 0; i < length(neighbors); i++) {
                in_degree[neighbors[i]] = in_degree[neighbors[i]] - 1
                if (in_degree[neighbors[i]] == 0) queue[] = neighbors[i]
            }
        }
    }

    return order
}

return {
    // Small build: 50 modules, 3 deps each
    build_50: function(n) {
        var i = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = simulate_build(50, 3)
        }
        return result
    },

    // Medium build: 200 modules, 5 deps each
    build_200: function(n) {
        var i = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = simulate_build(200, 5)
        }
        return result
    },

    // Large build: 500 modules, 5 deps each
    build_500: function(n) {
        var i = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = simulate_build(500, 5)
        }
        return result
    },

    // Topo sort of 500 module dependency graph
    topo_sort_500: function(n) {
        var i = 0
        var order = null
        for (i = 0; i < n; i++) {
            order = topo_sort(500, 5)
        }
        return order
    }
}
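The `topo_sort` benchmark above is a direct Kahn's-algorithm pass over the synthetic dependency graph (module `i` depends on `mod_j` for every `j < min(i, deps_per_module)`). The same structure can be sketched in Python as a cross-check; this is an illustration of the algorithm, not code from the patch:

```python
from collections import deque

def topo_sort(n_modules, deps_per_module):
    # Build the same synthetic graph as the benchmark:
    # an edge dep -> dependent for every j < min(i, deps_per_module)
    names = [f"mod_{i}" for i in range(n_modules)]
    adj = {name: [] for name in names}
    in_degree = {name: 0 for name in names}
    for i, name in enumerate(names):
        for j in range(deps_per_module):
            if j < i:
                adj[f"mod_{j}"].append(name)
                in_degree[name] += 1

    # Kahn's algorithm: repeatedly emit nodes with no remaining prerequisites
    queue = deque(n for n in names if in_degree[n] == 0)
    order = []
    while queue:
        current = queue.popleft()
        order.append(current)
        for nb in adj[current]:
            in_degree[nb] -= 1
            if in_degree[nb] == 0:
                queue.append(nb)
    return order
```

The `.cm` version gets the same effect without a deque by walking the `queue` array with a read cursor (`qi`) while appending to its tail, which avoids front-removal entirely.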
160
benches/nbody.cm
Normal file
@@ -0,0 +1,160 @@
// nbody.cm — N-body gravitational simulation kernel
// Pure numeric + allocation workload. Classic VM benchmark.

var math = use('math/radians')

def PI = 3.141592653589793
def SOLAR_MASS = 4 * PI * PI
def DAYS_PER_YEAR = 365.24

function make_system() {
    // Sun + 4 Jovian planets
    var sun = {x: 0, y: 0, z: 0, vx: 0, vy: 0, vz: 0, mass: SOLAR_MASS}

    var jupiter = {
        x: 4.84143144246472090,
        y: -1.16032004402742839,
        z: -0.103622044471123109,
        vx: 0.00166007664274403694 * DAYS_PER_YEAR,
        vy: 0.00769901118419740425 * DAYS_PER_YEAR,
        vz: -0.0000690460016972063023 * DAYS_PER_YEAR,
        mass: 0.000954791938424326609 * SOLAR_MASS
    }

    var saturn = {
        x: 8.34336671824457987,
        y: 4.12479856412430479,
        z: -0.403523417114321381,
        vx: -0.00276742510726862411 * DAYS_PER_YEAR,
        vy: 0.00499852801234917238 * DAYS_PER_YEAR,
        vz: 0.0000230417297573763929 * DAYS_PER_YEAR,
        mass: 0.000285885980666130812 * SOLAR_MASS
    }

    var uranus = {
        x: 12.8943695621391310,
        y: -15.1111514016986312,
        z: -0.223307578892655734,
        vx: 0.00296460137564761618 * DAYS_PER_YEAR,
        vy: 0.00237847173959480950 * DAYS_PER_YEAR,
        vz: -0.0000296589568540237556 * DAYS_PER_YEAR,
        mass: 0.0000436624404335156298 * SOLAR_MASS
    }

    var neptune = {
        x: 15.3796971148509165,
        y: -25.9193146099879641,
        z: 0.179258772950371181,
        vx: 0.00268067772490389322 * DAYS_PER_YEAR,
        vy: 0.00162824170038242295 * DAYS_PER_YEAR,
        vz: -0.0000951592254519715870 * DAYS_PER_YEAR,
        mass: 0.0000515138902046611451 * SOLAR_MASS
    }

    var bodies = [sun, jupiter, saturn, uranus, neptune]

    // Offset momentum
    var px = 0
    var py = 0
    var pz = 0
    var i = 0
    for (i = 0; i < length(bodies); i++) {
        px += bodies[i].vx * bodies[i].mass
        py += bodies[i].vy * bodies[i].mass
        pz += bodies[i].vz * bodies[i].mass
    }
    sun.vx = -px / SOLAR_MASS
    sun.vy = -py / SOLAR_MASS
    sun.vz = -pz / SOLAR_MASS

    return bodies
}

function advance(bodies, dt) {
    var n = length(bodies)
    var i = 0
    var j = 0
    var bi = null
    var bj = null
    var dx = 0
    var dy = 0
    var dz = 0
    var dist_sq = 0
    var dist = 0
    var mag = 0

    for (i = 0; i < n; i++) {
        bi = bodies[i]
        for (j = i + 1; j < n; j++) {
            bj = bodies[j]
            dx = bi.x - bj.x
            dy = bi.y - bj.y
            dz = bi.z - bj.z
            dist_sq = dx * dx + dy * dy + dz * dz
            dist = math.sqrt(dist_sq)
            mag = dt / (dist_sq * dist)

            bi.vx -= dx * bj.mass * mag
            bi.vy -= dy * bj.mass * mag
            bi.vz -= dz * bj.mass * mag
            bj.vx += dx * bi.mass * mag
            bj.vy += dy * bi.mass * mag
            bj.vz += dz * bi.mass * mag
        }
    }

    for (i = 0; i < n; i++) {
        bi = bodies[i]
        bi.x += dt * bi.vx
        bi.y += dt * bi.vy
        bi.z += dt * bi.vz
    }
}

function energy(bodies) {
    var e = 0
    var n = length(bodies)
    var i = 0
    var j = 0
    var bi = null
    var bj = null
    var dx = 0
    var dy = 0
    var dz = 0
    for (i = 0; i < n; i++) {
        bi = bodies[i]
        e += 0.5 * bi.mass * (bi.vx * bi.vx + bi.vy * bi.vy + bi.vz * bi.vz)
        for (j = i + 1; j < n; j++) {
            bj = bodies[j]
            dx = bi.x - bj.x
            dy = bi.y - bj.y
            dz = bi.z - bj.z
            e -= (bi.mass * bj.mass) / math.sqrt(dx * dx + dy * dy + dz * dz)
        }
    }
    return e
}

return {
    nbody_1k: function(n) {
        var i = 0
        var j = 0
        var bodies = null
        for (i = 0; i < n; i++) {
            bodies = make_system()
            for (j = 0; j < 1000; j++) advance(bodies, 0.01)
            energy(bodies)
        }
    },

    nbody_10k: function(n) {
        var i = 0
        var j = 0
        var bodies = null
        for (i = 0; i < n; i++) {
            bodies = make_system()
            for (j = 0; j < 10000; j++) advance(bodies, 0.01)
            energy(bodies)
        }
    }
}
154  benches/ray_tracer.cm  Normal file
@@ -0,0 +1,154 @@
// ray_tracer.cm — Simple ray tracer kernel
// Control flow + numeric + allocation. Classic VM benchmark.

var math = use('math/radians')

function vec(x, y, z) {
    return {x: x, y: y, z: z}
}

function vadd(a, b) {
    return {x: a.x + b.x, y: a.y + b.y, z: a.z + b.z}
}

function vsub(a, b) {
    return {x: a.x - b.x, y: a.y - b.y, z: a.z - b.z}
}

function vmul(v, s) {
    return {x: v.x * s, y: v.y * s, z: v.z * s}
}

function vdot(a, b) {
    return a.x * b.x + a.y * b.y + a.z * b.z
}

function vnorm(v) {
    var len = math.sqrt(vdot(v, v))
    if (len == 0) return vec(0, 0, 0)
    return vmul(v, 1 / len)
}

function make_sphere(center, radius, color) {
    return {
        center: center,
        radius: radius,
        color: color
    }
}

function intersect_sphere(origin, dir, sphere) {
    var oc = vsub(origin, sphere.center)
    var b = vdot(oc, dir)
    var c = vdot(oc, oc) - sphere.radius * sphere.radius
    var disc = b * b - c
    if (disc < 0) return -1
    var sq = math.sqrt(disc)
    var t1 = -b - sq
    var t2 = -b + sq
    if (t1 > 0.001) return t1
    if (t2 > 0.001) return t2
    return -1
}

function make_scene() {
    var spheres = [
        make_sphere(vec(0, -1, 5), 1, vec(1, 0, 0)),
        make_sphere(vec(2, 0, 6), 1, vec(0, 1, 0)),
        make_sphere(vec(-2, 0, 4), 1, vec(0, 0, 1)),
        make_sphere(vec(0, 1, 4.5), 0.5, vec(1, 1, 0)),
        make_sphere(vec(1, -0.5, 3), 0.3, vec(1, 0, 1)),
        make_sphere(vec(0, -101, 5), 100, vec(0.5, 0.5, 0.5))
    ]
    var light = vnorm(vec(1, 1, -1))
    return {spheres: spheres, light: light}
}

function trace(origin, dir, scene) {
    var closest_t = 999999
    var closest_sphere = null
    var i = 0
    var t = 0
    for (i = 0; i < length(scene.spheres); i++) {
        t = intersect_sphere(origin, dir, scene.spheres[i])
        if (t > 0 && t < closest_t) {
            closest_t = t
            closest_sphere = scene.spheres[i]
        }
    }

    if (!closest_sphere) return vec(0.2, 0.3, 0.5) // sky color

    var hit = vadd(origin, vmul(dir, closest_t))
    var normal = vnorm(vsub(hit, closest_sphere.center))
    var diffuse = vdot(normal, scene.light)
    if (diffuse < 0) diffuse = 0

    // Shadow check
    var shadow_origin = vadd(hit, vmul(normal, 0.001))
    var in_shadow = false
    for (i = 0; i < length(scene.spheres); i++) {
        if (scene.spheres[i] != closest_sphere) {
            t = intersect_sphere(shadow_origin, scene.light, scene.spheres[i])
            if (t > 0) {
                in_shadow = true
                break
            }
        }
    }

    var ambient = 0.15
    var intensity = in_shadow ? ambient : ambient + diffuse * 0.85
    return vmul(closest_sphere.color, intensity)
}

function render(width, height, scene) {
    var aspect = width / height
    var fov = 1.0
    var total_r = 0
    var total_g = 0
    var total_b = 0
    var y = 0
    var x = 0
    var u = 0
    var v = 0
    var dir = null
    var color = null
    var origin = vec(0, 0, 0)

    for (y = 0; y < height; y++) {
        for (x = 0; x < width; x++) {
            u = (2 * (x + 0.5) / width - 1) * aspect * fov
            v = (1 - 2 * (y + 0.5) / height) * fov
            dir = vnorm(vec(u, v, 1))
            color = trace(origin, dir, scene)
            total_r += color.x
            total_g += color.y
            total_b += color.z
        }
    }

    return {r: total_r, g: total_g, b: total_b}
}

var scene = make_scene()

return {
    raytrace_32x32: function(n) {
        var i = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = render(32, 32, scene)
        }
        return result
    },

    raytrace_64x64: function(n) {
        var i = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = render(64, 64, scene)
        }
        return result
    }
}
251  benches/richards.cm  Normal file
@@ -0,0 +1,251 @@
// richards.cm — Richards benchmark (scheduler simulation)
// Object-ish workload: dynamic dispatch, state machines, queuing.

def IDLE = 0
def WORKER = 1
def HANDLER_A = 2
def HANDLER_B = 3
def DEVICE_A = 4
def DEVICE_B = 5
def NUM_TASKS = 6

def TASK_RUNNING = 0
def TASK_WAITING = 1
def TASK_HELD = 2
def TASK_SUSPENDED = 3

function make_packet(link, id, kind) {
    return {link: link, id: id, kind: kind, datum: 0, data: array(4, 0)}
}

function scheduler() {
    var tasks = array(NUM_TASKS, null)
    var current = null
    var queue_count = 0
    var hold_count = 0
    var v1 = 0
    var v2 = 0
    var w_id = HANDLER_A
    var w_datum = 0
    var h_a_queue = null
    var h_a_count = 0
    var h_b_queue = null
    var h_b_count = 0
    var dev_a_pkt = null
    var dev_b_pkt = null

    var find_next = function() {
        var best = null
        var i = 0
        for (i = 0; i < NUM_TASKS; i++) {
            if (tasks[i] && tasks[i].state == TASK_RUNNING) {
                if (!best || tasks[i].priority > best.priority) {
                    best = tasks[i]
                }
            }
        }
        return best
    }

    var hold_self = function() {
        hold_count++
        if (current) current.state = TASK_HELD
        return find_next()
    }

    var release = function(id) {
        var t = tasks[id]
        if (!t) return find_next()
        if (t.state == TASK_HELD) t.state = TASK_RUNNING
        if (t.priority > (current ? current.priority : -1)) return t
        return current
    }

    var queue_packet = function(pkt) {
        var t = tasks[pkt.id]
        var p = null
        if (!t) return find_next()
        queue_count++
        pkt.link = null
        pkt.id = current ? current.id : 0
        if (!t.queue) {
            t.queue = pkt
            t.state = TASK_RUNNING
            if (t.priority > (current ? current.priority : -1)) return t
        } else {
            p = t.queue
            while (p.link) p = p.link
            p.link = pkt
        }
        return current
    }

    // Idle task
    tasks[IDLE] = {id: IDLE, priority: 0, queue: null, state: TASK_RUNNING,
        hold_count: 0, queue_count: 0,
        fn: function(pkt) {
            v1--
            if (v1 == 0) return hold_self()
            if ((v2 & 1) == 0) {
                v2 = v2 >> 1
                return release(DEVICE_A)
            }
            v2 = (v2 >> 1) ^ 0xD008
            return release(DEVICE_B)
        }
    }

    // Worker task
    tasks[WORKER] = {id: WORKER, priority: 1000, queue: null, state: TASK_SUSPENDED,
        hold_count: 0, queue_count: 0,
        fn: function(pkt) {
            var i = 0
            if (!pkt) return hold_self()
            w_id = (w_id == HANDLER_A) ? HANDLER_B : HANDLER_A
            pkt.id = w_id
            pkt.datum = 0
            for (i = 0; i < 4; i++) {
                w_datum++
                if (w_datum > 26) w_datum = 1
                pkt.data[i] = 65 + w_datum
            }
            return queue_packet(pkt)
        }
    }

    // Handler A
    tasks[HANDLER_A] = {id: HANDLER_A, priority: 2000, queue: null, state: TASK_SUSPENDED,
        hold_count: 0, queue_count: 0,
        fn: function(pkt) {
            var p = null
            if (pkt) { h_a_queue = pkt; h_a_count++ }
            if (h_a_queue) {
                p = h_a_queue
                h_a_queue = p.link
                if (h_a_count < 3) return queue_packet(p)
                return release(DEVICE_A)
            }
            return hold_self()
        }
    }

    // Handler B
    tasks[HANDLER_B] = {id: HANDLER_B, priority: 3000, queue: null, state: TASK_SUSPENDED,
        hold_count: 0, queue_count: 0,
        fn: function(pkt) {
            var p = null
            if (pkt) { h_b_queue = pkt; h_b_count++ }
            if (h_b_queue) {
                p = h_b_queue
                h_b_queue = p.link
                if (h_b_count < 3) return queue_packet(p)
                return release(DEVICE_B)
            }
            return hold_self()
        }
    }

    // Device A
    tasks[DEVICE_A] = {id: DEVICE_A, priority: 4000, queue: null, state: TASK_SUSPENDED,
        hold_count: 0, queue_count: 0,
        fn: function(pkt) {
            var p = null
            if (pkt) { dev_a_pkt = pkt; return hold_self() }
            if (dev_a_pkt) {
                p = dev_a_pkt
                dev_a_pkt = null
                return queue_packet(p)
            }
            return hold_self()
        }
    }

    // Device B
    tasks[DEVICE_B] = {id: DEVICE_B, priority: 5000, queue: null, state: TASK_SUSPENDED,
        hold_count: 0, queue_count: 0,
        fn: function(pkt) {
            var p = null
            if (pkt) { dev_b_pkt = pkt; return hold_self() }
            if (dev_b_pkt) {
                p = dev_b_pkt
                dev_b_pkt = null
                return queue_packet(p)
            }
            return hold_self()
        }
    }

    var run = function(iterations) {
        var i = 0
        var pkt1 = null
        var pkt2 = null
        var steps = 0
        var pkt = null
        var next = null

        v1 = iterations
        v2 = 0xBEEF
        queue_count = 0
        hold_count = 0
        w_id = HANDLER_A
        w_datum = 0
        h_a_queue = null
        h_a_count = 0
        h_b_queue = null
        h_b_count = 0
        dev_a_pkt = null
        dev_b_pkt = null

        for (i = 0; i < NUM_TASKS; i++) {
            if (tasks[i]) {
                tasks[i].state = (i == IDLE) ? TASK_RUNNING : TASK_SUSPENDED
                tasks[i].queue = null
            }
        }

        pkt1 = make_packet(null, WORKER, 1)
        pkt2 = make_packet(pkt1, WORKER, 1)
        tasks[WORKER].queue = pkt2
        tasks[WORKER].state = TASK_RUNNING

        current = find_next()
        while (current && steps < iterations * 10) {
            pkt = current.queue
            if (pkt) {
                current.queue = pkt.link
                current.queue_count++
            }
            next = current.fn(pkt)
            if (next) current = next
            else current = find_next()
            steps++
        }
        return {queue_count: queue_count, hold_count: hold_count, steps: steps}
    }

    return {run: run}
}

return {
    richards_100: function(n) {
        var i = 0
        var s = null
        var result = null
        for (i = 0; i < n; i++) {
            s = scheduler()
            result = s.run(100)
        }
        return result
    },

    richards_1k: function(n) {
        var i = 0
        var s = null
        var result = null
        for (i = 0; i < n; i++) {
            s = scheduler()
            result = s.run(1000)
        }
        return result
    }
}
180  benches/sorting.cm  Normal file
@@ -0,0 +1,180 @@
// sorting.cm — Sorting and searching kernel
// Array manipulation, comparison-heavy, allocation patterns.

function make_random_array(n, seed) {
    var a = []
    var x = seed
    var i = 0
    for (i = 0; i < n; i++) {
        x = ((x * 1103515245 + 12345) & 0x7FFFFFFF) | 0
        a[] = x % 10000
    }
    return a
}

function make_descending(n) {
    var a = []
    var i = 0
    for (i = n - 1; i >= 0; i--) a[] = i
    return a
}

// Manual quicksort (tests recursion + array mutation)
function qsort(arr, lo, hi) {
    if (lo >= hi) return null
    var i = lo
    var j = hi
    var pivot = arr[floor((lo + hi) / 2)]
    var tmp = 0
    while (i <= j) {
        while (arr[i] < pivot) i++
        while (arr[j] > pivot) j--
        if (i <= j) {
            tmp = arr[i]
            arr[i] = arr[j]
            arr[j] = tmp
            i++
            j--
        }
    }
    if (lo < j) qsort(arr, lo, j)
    if (i < hi) qsort(arr, i, hi)
    return null
}

// Merge sort (tests allocation + array creation)
function msort(arr) {
    var n = length(arr)
    if (n <= 1) return arr
    var mid = floor(n / 2)
    var left = msort(array(arr, 0, mid))
    var right = msort(array(arr, mid, n))
    return merge(left, right)
}

function merge(a, b) {
    var result = []
    var i = 0
    var j = 0
    while (i < length(a) && j < length(b)) {
        if (a[i] <= b[j]) {
            result[] = a[i]
            i++
        } else {
            result[] = b[j]
            j++
        }
    }
    while (i < length(a)) {
        result[] = a[i]
        i++
    }
    while (j < length(b)) {
        result[] = b[j]
        j++
    }
    return result
}

// Binary search
function bsearch(arr, target) {
    var lo = 0
    var hi = length(arr) - 1
    var mid = 0
    while (lo <= hi) {
        mid = floor((lo + hi) / 2)
        if (arr[mid] == target) return mid
        if (arr[mid] < target) lo = mid + 1
        else hi = mid - 1
    }
    return -1
}

// Sort objects by field
function sort_records(n) {
    var records = []
    var x = 42
    var i = 0
    for (i = 0; i < n; i++) {
        x = ((x * 1103515245 + 12345) & 0x7FFFFFFF) | 0
        records[] = {id: i, score: x % 10000, name: `item_${i}`}
    }
    return sort(records, "score")
}

return {
    // Quicksort 1K random integers
    qsort_1k: function(n) {
        var i = 0
        var a = null
        for (i = 0; i < n; i++) {
            a = make_random_array(1000, i)
            qsort(a, 0, length(a) - 1)
        }
        return a
    },

    // Quicksort 10K random integers
    qsort_10k: function(n) {
        var i = 0
        var a = null
        for (i = 0; i < n; i++) {
            a = make_random_array(10000, i)
            qsort(a, 0, length(a) - 1)
        }
        return a
    },

    // Merge sort 1K (allocation heavy)
    msort_1k: function(n) {
        var i = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = msort(make_random_array(1000, i))
        }
        return result
    },

    // Built-in sort 1K
    builtin_sort_1k: function(n) {
        var i = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = sort(make_random_array(1000, i))
        }
        return result
    },

    // Sort worst case (descending → ascending)
    sort_worst_case: function(n) {
        var i = 0
        var a = null
        for (i = 0; i < n; i++) {
            a = make_descending(1000)
            qsort(a, 0, length(a) - 1)
        }
        return a
    },

    // Binary search in sorted array
    bsearch_1k: function(n) {
        var sorted = make_random_array(1000, 42)
        sorted = sort(sorted)
        var found = 0
        var i = 0
        for (i = 0; i < n; i++) {
            if (bsearch(sorted, sorted[i % 1000]) >= 0) found++
        }
        return found
    },

    // Sort records by field
    sort_records_500: function(n) {
        var i = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = sort_records(500)
        }
        return result
    }
}
82  benches/spectral_norm.cm  Normal file
@@ -0,0 +1,82 @@
// spectral_norm.cm — Spectral norm kernel
// Pure numeric, dense array access, mathematical computation.

var math = use('math/radians')

function eval_a(i, j) {
    return 1.0 / ((i + j) * (i + j + 1) / 2 + i + 1)
}

function eval_a_times_u(n, u, au) {
    var i = 0
    var j = 0
    var sum = 0
    for (i = 0; i < n; i++) {
        sum = 0
        for (j = 0; j < n; j++) {
            sum += eval_a(i, j) * u[j]
        }
        au[i] = sum
    }
}

function eval_at_times_u(n, u, atu) {
    var i = 0
    var j = 0
    var sum = 0
    for (i = 0; i < n; i++) {
        sum = 0
        for (j = 0; j < n; j++) {
            sum += eval_a(j, i) * u[j]
        }
        atu[i] = sum
    }
}

function eval_ata_times_u(n, u, atau) {
    var v = array(n, 0)
    eval_a_times_u(n, u, v)
    eval_at_times_u(n, v, atau)
}

function spectral_norm(n) {
    var u = array(n, 1)
    var v = array(n, 0)
    var i = 0
    var vbv = 0
    var vv = 0

    for (i = 0; i < 10; i++) {
        eval_ata_times_u(n, u, v)
        eval_ata_times_u(n, v, u)
    }

    vbv = 0
    vv = 0
    for (i = 0; i < n; i++) {
        vbv += u[i] * v[i]
        vv += v[i] * v[i]
    }

    return math.sqrt(vbv / vv)
}

return {
    spectral_100: function(n) {
        var i = 0
        var result = 0
        for (i = 0; i < n; i++) {
            result = spectral_norm(100)
        }
        return result
    },

    spectral_200: function(n) {
        var i = 0
        var result = 0
        for (i = 0; i < n; i++) {
            result = spectral_norm(200)
        }
        return result
    }
}
188  benches/string_processing.cm  Normal file
@@ -0,0 +1,188 @@
// string_processing.cm — String-heavy kernel
// Concat, split, search, replace, interning path stress.

function make_lorem(paragraphs) {
    var base = "Lorem ipsum dolor sit amet consectetur adipiscing elit sed do eiusmod tempor incididunt ut labore et dolore magna aliqua Ut enim ad minim veniam quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat"
    var result = ""
    var i = 0
    for (i = 0; i < paragraphs; i++) {
        if (i > 0) result = result + " "
        result = result + base
    }
    return result
}

// Build a lookup table from text
function build_index(txt) {
    var words = array(txt, " ")
    var index = {}
    var i = 0
    var w = null
    for (i = 0; i < length(words); i++) {
        w = words[i]
        if (!index[w]) {
            index[w] = []
        }
        index[w][] = i
    }
    return index
}

// Levenshtein-like distance (simplified)
function edit_distance(a, b) {
    var la = length(a)
    var lb = length(b)
    if (la == 0) return lb
    if (lb == 0) return la

    // Use flat array for 2 rows of DP matrix
    var prev = array(lb + 1, 0)
    var curr = array(lb + 1, 0)
    var i = 0
    var j = 0
    var cost = 0
    var del = 0
    var ins = 0
    var sub = 0
    var tmp = null
    var ca = array(a)
    var cb = array(b)

    for (j = 0; j <= lb; j++) prev[j] = j
    for (i = 1; i <= la; i++) {
        curr[0] = i
        for (j = 1; j <= lb; j++) {
            cost = ca[i - 1] == cb[j - 1] ? 0 : 1
            del = prev[j] + 1
            ins = curr[j - 1] + 1
            sub = prev[j - 1] + cost
            curr[j] = del
            if (ins < curr[j]) curr[j] = ins
            if (sub < curr[j]) curr[j] = sub
        }
        tmp = prev
        prev = curr
        curr = tmp
    }
    return prev[lb]
}

var lorem_5 = make_lorem(5)
var lorem_20 = make_lorem(20)

return {
    // Split text into words and count
    string_split_count: function(n) {
        var i = 0
        var words = null
        var count = 0
        for (i = 0; i < n; i++) {
            words = array(lorem_5, " ")
            count += length(words)
        }
        return count
    },

    // Build word index (split + hash + array ops)
    string_index_build: function(n) {
        var i = 0
        var idx = null
        for (i = 0; i < n; i++) {
            idx = build_index(lorem_5)
        }
        return idx
    },

    // Search for substrings
    string_search: function(n) {
        var targets = ["dolor", "minim", "quis", "magna", "ipsum"]
        var i = 0
        var j = 0
        var count = 0
        for (i = 0; i < n; i++) {
            for (j = 0; j < length(targets); j++) {
                if (search(lorem_20, targets[j])) count++
            }
        }
        return count
    },

    // Replace operations
    string_replace: function(n) {
        var i = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = replace(lorem_5, "dolor", "DOLOR")
            result = replace(result, "ipsum", "IPSUM")
            result = replace(result, "amet", "AMET")
        }
        return result
    },

    // String concatenation builder
    string_builder: function(n) {
        var i = 0
        var j = 0
        var s = null
        var total = 0
        for (i = 0; i < n; i++) {
            s = ""
            for (j = 0; j < 50; j++) {
                s = s + "key=" + text(j) + "&value=" + text(j * 17) + "&"
            }
            total += length(s)
        }
        return total
    },

    // Edit distance (DP + array + string ops)
    edit_distance: function(n) {
        var words = ["kitten", "sitting", "saturday", "sunday", "intention", "execution"]
        var i = 0
        var j = 0
        var total = 0
        for (i = 0; i < n; i++) {
            for (j = 0; j < length(words) - 1; j++) {
                total += edit_distance(words[j], words[j + 1])
            }
        }
        return total
    },

    // Upper/lower/trim chain
    string_transforms: function(n) {
        var src = " Hello World "
        var i = 0
        var x = 0
        var result = null
        for (i = 0; i < n; i++) {
            result = trim(src)
            result = upper(result)
            result = lower(result)
            x += length(result)
        }
        return x
    },

    // Starts_with / ends_with (interning path)
    string_prefix_suffix: function(n) {
        var strs = [
            "application/json",
            "text/html",
            "image/png",
            "application/xml",
            "text/plain"
        ]
        var i = 0
        var j = 0
        var count = 0
        for (i = 0; i < n; i++) {
            for (j = 0; j < length(strs); j++) {
                if (starts_with(strs[j], "application/")) count++
                if (ends_with(strs[j], "/json")) count++
                if (starts_with(strs[j], "text/")) count++
            }
        }
        return count
    }
}
137  benches/tree_ops.cm  Normal file
@@ -0,0 +1,137 @@
// tree_ops.cm — Tree data structure operations kernel
// Pointer chasing, recursion, allocation patterns.

// Binary tree: create, walk, transform, check
function make_tree(depth) {
    if (depth <= 0) return {val: 1, left: null, right: null}
    return {
        val: depth,
        left: make_tree(depth - 1),
        right: make_tree(depth - 1)
    }
}

function tree_check(node) {
    if (!node) return 0
    if (!node.left) return node.val
    return node.val + tree_check(node.left) - tree_check(node.right)
}

function tree_sum(node) {
    if (!node) return 0
    return node.val + tree_sum(node.left) + tree_sum(node.right)
}

function tree_depth(node) {
    if (!node) return 0
    var l = tree_depth(node.left)
    var r = tree_depth(node.right)
    return 1 + (l > r ? l : r)
}

function tree_count(node) {
    if (!node) return 0
    return 1 + tree_count(node.left) + tree_count(node.right)
}

// Transform tree: map values
function tree_map(node, fn) {
    if (!node) return null
    return {
        val: fn(node.val),
        left: tree_map(node.left, fn),
        right: tree_map(node.right, fn)
    }
}

// Flatten tree to array (in-order)
function tree_flatten(node, result) {
    if (!node) return null
    tree_flatten(node.left, result)
    result[] = node.val
    tree_flatten(node.right, result)
    return null
}

// Build sorted tree from array (balanced)
function build_balanced(arr, lo, hi) {
    if (lo > hi) return null
    var mid = floor((lo + hi) / 2)
    return {
        val: arr[mid],
        left: build_balanced(arr, lo, mid - 1),
        right: build_balanced(arr, mid + 1, hi)
    }
}

// Find a value in BST
function bst_find(node, val) {
    if (!node) return false
    if (val == node.val) return true
    if (val < node.val) return bst_find(node.left, val)
    return bst_find(node.right, val)
}

return {
    // Binary tree create + check (allocation heavy)
    tree_create_check: function(n) {
        var i = 0
        var t = null
        var x = 0
        for (i = 0; i < n; i++) {
            t = make_tree(10)
            x += tree_check(t)
        }
        return x
    },

    // Deep tree traversals
    tree_traversal: function(n) {
        var t = make_tree(12)
        var x = 0
        var i = 0
        for (i = 0; i < n; i++) {
            x += tree_sum(t) + tree_depth(t) + tree_count(t)
        }
        return x
    },

    // Tree map (create new tree from old)
    tree_transform: function(n) {
        var t = make_tree(10)
        var i = 0
        var mapped = null
        for (i = 0; i < n; i++) {
            mapped = tree_map(t, function(v) { return v * 2 + 1 })
        }
        return mapped
    },

    // Flatten + rebuild (array <-> tree conversion)
    tree_flatten_rebuild: function(n) {
        var t = make_tree(10)
        var i = 0
        var flat = null
        var rebuilt = null
        for (i = 0; i < n; i++) {
            flat = []
            tree_flatten(t, flat)
            rebuilt = build_balanced(flat, 0, length(flat) - 1)
        }
        return rebuilt
    },

    // BST search (pointer chasing)
    bst_search: function(n) {
        // Build a balanced BST of 1024 elements
        var data = []
        var i = 0
        for (i = 0; i < 1024; i++) data[] = i
        var bst = build_balanced(data, 0, 1023)
        var found = 0
        for (i = 0; i < n; i++) {
            if (bst_find(bst, i % 1024)) found++
        }
        return found
    }
}
46  benchmarks/binarytree.ce  Normal file
@@ -0,0 +1,46 @@
function mainThread() {
    var maxDepth = max(6, Number(arg[0] || 16));
    var stretchDepth = maxDepth + 1;
    var check = itemCheck(bottomUpTree(stretchDepth));
    var longLivedTree = null
    var depth = null
    var iterations = null
    log.console(`stretch tree of depth ${stretchDepth}\t check: ${check}`);

    longLivedTree = bottomUpTree(maxDepth);

    for (depth = 4; depth <= maxDepth; depth += 2) {
        iterations = 1 << (maxDepth - depth + 4);
        work(iterations, depth);
    }

    log.console(`long lived tree of depth ${maxDepth}\t check: ${itemCheck(longLivedTree)}`);
}

function work(iterations, depth) {
    var check = 0;
    var i = 0
    for (i = 0; i < iterations; i++)
        check += itemCheck(bottomUpTree(depth));
    log.console(`${iterations}\t trees of depth ${depth}\t check: ${check}`);
}

function TreeNode(left, right) {
    return {left, right};
}

function itemCheck(node) {
    if (node.left == null)
        return 1;
    return 1 + itemCheck(node.left) + itemCheck(node.right);
}

function bottomUpTree(depth) {
    return depth > 0
        ? TreeNode(bottomUpTree(depth - 1), bottomUpTree(depth - 1))
        : TreeNode(null, null);
}

mainThread()

$stop()
28  benchmarks/eratosthenes.ce  Normal file
@@ -0,0 +1,28 @@
var blob = use('blob')
var math = use('math/radians')

var i = 0
var j = 0

function eratosthenes(n) {
    var sieve = blob(n, true)
    var sqrtN = whole(math.sqrt(n));

    for (i = 2; i <= sqrtN; i++)
        if (sieve.read_logical(i))
            for (j = i * i; j <= n; j += i)
                sieve.write_bit(j, false);

    return sieve;
}

var sieve = eratosthenes(10000000);
stone(sieve)

var c = 0
for (i = 0; i < length(sieve); i++)
    if (sieve.read_logical(i)) c++

log.console(c)

$stop()
65  benchmarks/fannkuch.ce  Normal file
@@ -0,0 +1,65 @@
function fannkuch(n) {
    var perm1 = [n]
    var i = 0
    var k = null
    var r = null
    var t = null
    var p0 = null
    var j = null
    var more = null
    for (i = 0; i < n; i++) perm1[i] = i
    var perm = [n]
    var count = [n]
    var f = 0
    var flips = 0
    var nperm = 0
    var checksum = 0

    r = n
    while (r > 0) {
        i = 0
        while (r != 1) { count[r-1] = r; r -= 1 }
        while (i < n) { perm[i] = perm1[i]; i += 1 }

        f = 0
        k = perm[0]
        while (k != 0) {
            i = 0
            while (2*i < k) {
                t = perm[i]; perm[i] = perm[k-i]; perm[k-i] = t
                i += 1
            }
            k = perm[0]
            f += 1
        }
        if (f > flips) flips = f
        if ((nperm & 0x1) == 0) checksum += f; else checksum -= f

        more = true
        while (more) {
            if (r == n) {
                log.console(checksum)
                return flips
            }
            p0 = perm1[0]
            i = 0
            while (i < r) {
                j = i + 1
                perm1[i] = perm1[j]
                i = j
            }
            perm1[r] = p0

            count[r] -= 1
            if (count[r] > 0) more = false; else r += 1
        }
        nperm += 1
    }
    return flips;
}

var n = arg[0] || 10

log.console(`Pfannkuchen(${n}) = ${fannkuch(n)}`)

$stop()
16  benchmarks/fib.ce  Normal file
@@ -0,0 +1,16 @@
var time = use('time')

function fib(n) {
    if (n < 2) return n
    return fib(n-1) + fib(n-2)
}

var now = time.number()
var arr = [1,2,3,4,5]
arrfor(arr, function(i) {
    log.console(fib(28))
})

log.console(`elapsed: ${time.number()-now}`)

$stop()
377 benchmarks/js_perf.ce Normal file
@@ -0,0 +1,377 @@
var time = use('time')
var math = use('math/radians')

def iterations = {
  simple: 10000000,
  medium: 1000000,
  complex: 100000
};

function measureTime(fn) {
  var start = time.number();
  fn();
  var end = time.number();
  return (end - start);
}

function benchPropertyAccess() {
  var obj = {
    a: 1, b: 2, c: 3, d: 4, e: 5,
    nested: { x: 10, y: 20, z: 30 }
  };

  var readTime = measureTime(function() {
    var sum = 0;
    var i = 0
    for (i = 0; i < iterations.simple; i++) {
      sum += obj.a + obj.b + obj.c + obj.d + obj.e;
      sum += obj.nested.x + obj.nested.y + obj.nested.z;
    }
  });

  var writeTime = measureTime(function() {
    var i = 0
    for (i = 0; i < iterations.simple; i++) {
      obj.a = i;
      obj.b = i + 1;
      obj.c = i + 2;
      obj.nested.x = i * 2;
      obj.nested.y = i * 3;
    }
  });

  return { readTime: readTime, writeTime: writeTime };
}

function benchFunctionCalls() {
  function add(a, b) { return a + b; }
  function multiply(a, b) { return a * b; }
  function complexCalc(a, b, c) { return (a + b) * c / 2; }

  var obj = {
    method: function(x) { return x * 2; },
    nested: {
      deepMethod: function(x, y) { return x + y; }
    }
  };

  var simpleCallTime = measureTime(function() {
    var result = 0;
    var i = 0
    for (i = 0; i < iterations.simple; i++) {
      result = add(i, 1);
      result = multiply(result, 2);
    }
  });

  var methodCallTime = measureTime(function() {
    var result = 0;
    var i = 0
    for (i = 0; i < iterations.simple; i++) {
      result = obj.method(i);
      result = obj.nested.deepMethod(result, i);
    }
  });

  var complexCallTime = measureTime(function() {
    var result = 0;
    var i = 0
    for (i = 0; i < iterations.medium; i++) {
      result = complexCalc(i, i + 1, i + 2);
    }
  });

  return {
    simpleCallTime: simpleCallTime,
    methodCallTime: methodCallTime,
    complexCallTime: complexCallTime
  };
}

function benchArrayOps() {
  var i = 0

  var pushTime = measureTime(function() {
    var arr = [];
    var j = 0
    for (j = 0; j < iterations.medium; j++) {
      arr[] = j;
    }
  });

  var arr = [];
  for (i = 0; i < 10000; i++) arr[] = i;

  var accessTime = measureTime(function() {
    var sum = 0;
    var j = 0
    for (j = 0; j < iterations.medium; j++) {
      sum += arr[j % 10000];
    }
  });

  var iterateTime = measureTime(function() {
    var sum = 0;
    var j = 0
    var k = 0
    for (j = 0; j < 1000; j++) {
      for (k = 0; k < length(arr); k++) {
        sum += arr[k];
      }
    }
  });

  return {
    pushTime: pushTime,
    accessTime: accessTime,
    iterateTime: iterateTime
  };
}

function benchObjectCreation() {
  var literalTime = measureTime(function() {
    var i = 0
    var obj = null
    for (i = 0; i < iterations.medium; i++) {
      obj = { x: i, y: i * 2, z: i * 3 };
    }
  });

  function Point(x, y) {
    return {x,y}
  }

  var defructorTime = measureTime(function() {
    var i = 0
    var p = null
    for (i = 0; i < iterations.medium; i++) {
      p = Point(i, i * 2);
    }
  });

  var protoObj = {
    x: 0,
    y: 0,
    move: function(dx, dy) {
      this.x += dx;
      this.y += dy;
    }
  };

  var prototypeTime = measureTime(function() {
    var i = 0
    var obj = null
    for (i = 0; i < iterations.medium; i++) {
      obj = meme(protoObj);
      obj.x = i;
      obj.y = i * 2;
    }
  });

  return {
    literalTime: literalTime,
    defructorTime: defructorTime,
    prototypeTime: prototypeTime
  };
}

function benchStringOps() {
  var i = 0
  var strings = [];

  var concatTime = measureTime(function() {
    var str = "";
    var j = 0
    for (j = 0; j < iterations.complex; j++) {
      str = "test" + j + "value";
    }
  });

  for (i = 0; i < 1000; i++) {
    strings[] = "string" + i;
  }

  var joinTime = measureTime(function() {
    var j = 0
    var result = null
    for (j = 0; j < iterations.complex; j++) {
      result = text(strings, ",");
    }
  });

  var splitTime = measureTime(function() {
    var str = "a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p";
    var j = 0
    var parts = null
    for (j = 0; j < iterations.medium; j++) {
      parts = array(str, ",");
    }
  });

  return {
    concatTime: concatTime,
    joinTime: joinTime,
    splitTime: splitTime
  };
}

function benchArithmetic() {
  var intMathTime = measureTime(function() {
    var result = 1;
    var i = 0
    for (i = 0; i < iterations.simple; i++) {
      result = ((result + i) * 2 - 1) / 3;
      result = result % 1000 + 1;
    }
  });

  var floatMathTime = measureTime(function() {
    var result = 1.5;
    var i = 0
    for (i = 0; i < iterations.simple; i++) {
      result = math.sine(result) + math.cosine(i * 0.01);
      result = math.sqrt(abs(result)) + 0.1;
    }
  });

  var bitwiseTime = measureTime(function() {
    var result = 0;
    var i = 0
    for (i = 0; i < iterations.simple; i++) {
      result = (result ^ i) & 0xFFFF;
      result = (result << 1) | (result >> 15);
    }
  });

  return {
    intMathTime: intMathTime,
    floatMathTime: floatMathTime,
    bitwiseTime: bitwiseTime
  };
}

function benchClosures() {
  var i = 0

  function makeAdder(x) {
    return function(y) { return x + y; };
  }

  var closureCreateTime = measureTime(function() {
    var funcs = [];
    var j = 0
    for (j = 0; j < iterations.medium; j++) {
      funcs[] = makeAdder(j);
    }
  });

  var adders = [];
  for (i = 0; i < 1000; i++) {
    adders[] = makeAdder(i);
  }

  var closureCallTime = measureTime(function() {
    var sum = 0;
    var j = 0
    for (j = 0; j < iterations.medium; j++) {
      sum += adders[j % 1000](j);
    }
  });

  return {
    closureCreateTime: closureCreateTime,
    closureCallTime: closureCallTime
  };
}

log.console("JavaScript Performance Benchmark");
log.console("======================\n");

log.console("BENCHMARK: Property Access");
var propResults = benchPropertyAccess();
log.console(" Read time: " + propResults.readTime.toFixed(3) + "s => " +
  (iterations.simple / propResults.readTime).toFixed(1) + " reads/sec [" +
  (propResults.readTime / iterations.simple * 1e9).toFixed(1) + " ns/op]");
log.console(" Write time: " + propResults.writeTime.toFixed(3) + "s => " +
  (iterations.simple / propResults.writeTime).toFixed(1) + " writes/sec [" +
  (propResults.writeTime / iterations.simple * 1e9).toFixed(1) + " ns/op]");
log.console("");

log.console("BENCHMARK: Function Calls");
var funcResults = benchFunctionCalls();
log.console(" Simple calls: " + funcResults.simpleCallTime.toFixed(3) + "s => " +
  (iterations.simple / funcResults.simpleCallTime).toFixed(1) + " calls/sec [" +
  (funcResults.simpleCallTime / iterations.simple * 1e9).toFixed(1) + " ns/op]");
log.console(" Method calls: " + funcResults.methodCallTime.toFixed(3) + "s => " +
  (iterations.simple / funcResults.methodCallTime).toFixed(1) + " calls/sec [" +
  (funcResults.methodCallTime / iterations.simple * 1e9).toFixed(1) + " ns/op]");
log.console(" Complex calls: " + funcResults.complexCallTime.toFixed(3) + "s => " +
  (iterations.medium / funcResults.complexCallTime).toFixed(1) + " calls/sec [" +
  (funcResults.complexCallTime / iterations.medium * 1e9).toFixed(1) + " ns/op]");
log.console("");

log.console("BENCHMARK: Array Operations");
var arrayResults = benchArrayOps();
log.console(" Push: " + arrayResults.pushTime.toFixed(3) + "s => " +
  (iterations.medium / arrayResults.pushTime).toFixed(1) + " pushes/sec [" +
  (arrayResults.pushTime / iterations.medium * 1e9).toFixed(1) + " ns/op]");
log.console(" Access: " + arrayResults.accessTime.toFixed(3) + "s => " +
  (iterations.medium / arrayResults.accessTime).toFixed(1) + " accesses/sec [" +
  (arrayResults.accessTime / iterations.medium * 1e9).toFixed(1) + " ns/op]");
log.console(" Iterate: " + arrayResults.iterateTime.toFixed(3) + "s => " +
  (1000 / arrayResults.iterateTime).toFixed(1) + " full iterations/sec");
log.console("");

log.console("BENCHMARK: Object Creation");
var objResults = benchObjectCreation();
log.console(" Literal: " + objResults.literalTime.toFixed(3) + "s => " +
  (iterations.medium / objResults.literalTime).toFixed(1) + " creates/sec [" +
  (objResults.literalTime / iterations.medium * 1e9).toFixed(1) + " ns/op]");
log.console(" Constructor: " + objResults.defructorTime.toFixed(3) + "s => " +
  (iterations.medium / objResults.defructorTime).toFixed(1) + " creates/sec [" +
  (objResults.defructorTime / iterations.medium * 1e9).toFixed(1) + " ns/op]");
log.console(" Prototype: " + objResults.prototypeTime.toFixed(3) + "s => " +
  (iterations.medium / objResults.prototypeTime).toFixed(1) + " creates/sec [" +
  (objResults.prototypeTime / iterations.medium * 1e9).toFixed(1) + " ns/op]");
log.console("");

log.console("BENCHMARK: String Operations");
var strResults = benchStringOps();
log.console(" Concat: " + strResults.concatTime.toFixed(3) + "s => " +
  (iterations.complex / strResults.concatTime).toFixed(1) + " concats/sec [" +
  (strResults.concatTime / iterations.complex * 1e9).toFixed(1) + " ns/op]");
log.console(" Join: " + strResults.joinTime.toFixed(3) + "s => " +
  (iterations.complex / strResults.joinTime).toFixed(1) + " joins/sec [" +
  (strResults.joinTime / iterations.complex * 1e9).toFixed(1) + " ns/op]");
log.console(" Split: " + strResults.splitTime.toFixed(3) + "s => " +
  (iterations.medium / strResults.splitTime).toFixed(1) + " splits/sec [" +
  (strResults.splitTime / iterations.medium * 1e9).toFixed(1) + " ns/op]");
log.console("");

log.console("BENCHMARK: Arithmetic Operations");
var mathResults = benchArithmetic();
log.console(" Integer math: " + mathResults.intMathTime.toFixed(3) + "s => " +
  (iterations.simple / mathResults.intMathTime).toFixed(1) + " ops/sec [" +
  (mathResults.intMathTime / iterations.simple * 1e9).toFixed(1) + " ns/op]");
log.console(" Float math: " + mathResults.floatMathTime.toFixed(3) + "s => " +
  (iterations.simple / mathResults.floatMathTime).toFixed(1) + " ops/sec [" +
  (mathResults.floatMathTime / iterations.simple * 1e9).toFixed(1) + " ns/op]");
log.console(" Bitwise: " + mathResults.bitwiseTime.toFixed(3) + "s => " +
  (iterations.simple / mathResults.bitwiseTime).toFixed(1) + " ops/sec [" +
  (mathResults.bitwiseTime / iterations.simple * 1e9).toFixed(1) + " ns/op]");
log.console("");

log.console("BENCHMARK: Closures");
var closureResults = benchClosures();
log.console(" Create: " + closureResults.closureCreateTime.toFixed(3) + "s => " +
  (iterations.medium / closureResults.closureCreateTime).toFixed(1) + " creates/sec [" +
  (closureResults.closureCreateTime / iterations.medium * 1e9).toFixed(1) + " ns/op]");
log.console(" Call: " + closureResults.closureCallTime.toFixed(3) + "s => " +
  (iterations.medium / closureResults.closureCallTime).toFixed(1) + " calls/sec [" +
  (closureResults.closureCallTime / iterations.medium * 1e9).toFixed(1) + " ns/op]");
log.console("");

log.console("---------------------------------------------------------");
log.console("Benchmark complete.\n");

$stop()
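The measureTime helper in js_perf.ce brackets a single call between two clock reads. The same harness in Python would use a monotonic, high-resolution clock so that wall-clock adjustments cannot skew the interval; the names below are illustrative.

```python
import time

def measure_time(fn):
    # perf_counter is monotonic and high-resolution, suitable for intervals.
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

elapsed = measure_time(lambda: sum(range(100000)))
```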
46 benchmarks/mandelbrot.ce Normal file
@@ -0,0 +1,46 @@
var blob = use('blob')

var iter = 50
var limit = 2.0
var zr = null
var zi = null
var cr = null
var ci = null
var tr = null
var ti = null
var y = 0
var x = 0
var i = 0
var row = null

var h = Number(arg[0]) || 500
var w = h

log.console(`P4\n${w} ${h}`);

for (y = 0; y < h; ++y) {
  row = blob(w);

  for (x = 0; x < w; ++x) {
    zr = 0; zi = 0; tr = 0; ti = 0;
    cr = 2 * x / w - 1.5;
    ci = 2 * y / h - 1;
    for (i = 0; i < iter && (tr + ti <= limit * limit); ++i) {
      zi = 2 * zr * zi + ci;
      zr = tr - ti + cr;
      tr = zr * zr;
      ti = zi * zi;
    }

    if (tr + ti <= limit * limit)
      row.write_bit(1);
    else
      row.write_bit(0);
  }

  stone(row)

  log.console(text(row, 'b'));
}

$stop()
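The inner loop of mandelbrot.ce is the standard escape-time test z → z² + c, keeping tr = zr² and ti = zi² so the magnitude check |z|² ≤ limit² reuses the squares already computed. A Python sketch of that membership test (the helper name and defaults are illustrative):

```python
def in_set(cr, ci, iter_limit=50, limit=2.0):
    # Iterate z = z^2 + c; a point belongs to the set if |z| never
    # exceeds `limit` within iter_limit iterations.
    zr = zi = tr = ti = 0.0
    for _ in range(iter_limit):
        if tr + ti > limit * limit:
            return False
        zi = 2 * zr * zi + ci
        zr = tr - ti + cr
        tr = zr * zr
        ti = zi * zi
    return tr + ti <= limit * limit
```

The benchmark maps pixel (x, y) to c = (2x/w - 1.5, 2y/h - 1), i.e. the usual window re(c) in [-1.5, 0.5), im(c) in [-1, 1).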
15 benchmarks/montecarlo.ce Normal file
@@ -0,0 +1,15 @@
var math = use('math/radians')
var N = 1000000;
var num = 0;
var i = 0
var x = null
var y = null
for (i = 0; i < N; i++) {
  x = 2 * $random();
  y = $random();
  if (y < math.sine(x * x))
    num++;
}
log.console(2 * num / N);

$stop()
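The loop above is a hit-or-miss Monte Carlo estimate: x is uniform on [0, 2), y uniform on [0, 1), and 2·num/N scales the hit fraction by the 2×1 box area. Since hits require y < sin(x²) with y nonnegative, the estimator converges to the area under max(0, sin(x²)) on [0, 2]; intervals where sin(x²) < 0 contribute nothing. A Python sketch with a fixed seed for reproducibility (function name and seed are illustrative):

```python
import math
import random

def estimate(n=100000, seed=0):
    # Hit fraction inside the 2x1 box, scaled by the box area 2.
    rng = random.Random(seed)
    num = 0
    for _ in range(n):
        x = 2 * rng.random()
        y = rng.random()
        if y < math.sin(x * x):
            num += 1
    return 2 * num / n
```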
182 benchmarks/nbody.ce Normal file
@@ -0,0 +1,182 @@
var math = use('math/radians')
var SOLAR_MASS = 4 * pi * pi;
var DAYS_PER_YEAR = 365.24;

function Body(p) {
  return {x: p.x, y: p.y, z: p.z, vx: p.vx, vy: p.vy, vz: p.vz, mass: p.mass};
}

function Jupiter() {
  return Body({
    x: 4.84143144246472090e+00,
    y: -1.16032004402742839e+00,
    z: -1.03622044471123109e-01,
    vx: 1.66007664274403694e-03 * DAYS_PER_YEAR,
    vy: 7.69901118419740425e-03 * DAYS_PER_YEAR,
    vz: -6.90460016972063023e-05 * DAYS_PER_YEAR,
    mass: 9.54791938424326609e-04 * SOLAR_MASS
  });
}

function Saturn() {
  return Body({
    x: 8.34336671824457987e+00,
    y: 4.12479856412430479e+00,
    z: -4.03523417114321381e-01,
    vx: -2.76742510726862411e-03 * DAYS_PER_YEAR,
    vy: 4.99852801234917238e-03 * DAYS_PER_YEAR,
    vz: 2.30417297573763929e-05 * DAYS_PER_YEAR,
    mass: 2.85885980666130812e-04 * SOLAR_MASS
  });
}

function Uranus() {
  return Body({
    x: 1.28943695621391310e+01,
    y: -1.51111514016986312e+01,
    z: -2.23307578892655734e-01,
    vx: 2.96460137564761618e-03 * DAYS_PER_YEAR,
    vy: 2.37847173959480950e-03 * DAYS_PER_YEAR,
    vz: -2.96589568540237556e-05 * DAYS_PER_YEAR,
    mass: 4.36624404335156298e-05 * SOLAR_MASS
  });
}

function Neptune() {
  return Body({
    x: 1.53796971148509165e+01,
    y: -2.59193146099879641e+01,
    z: 1.79258772950371181e-01,
    vx: 2.68067772490389322e-03 * DAYS_PER_YEAR,
    vy: 1.62824170038242295e-03 * DAYS_PER_YEAR,
    vz: -9.51592254519715870e-05 * DAYS_PER_YEAR,
    mass: 5.15138902046611451e-05 * SOLAR_MASS
  });
}

function Sun() {
  return Body({x: 0.0, y: 0.0, z: 0.0, vx: 0.0, vy: 0.0, vz: 0.0, mass: SOLAR_MASS});
}

var bodies = Array(Sun(), Jupiter(), Saturn(), Uranus(), Neptune());

function offsetMomentum() {
  var px = 0;
  var py = 0;
  var pz = 0;
  var size = length(bodies);
  var i = 0
  var body = null
  var mass = null
  for (i = 0; i < size; i++) {
    body = bodies[i];
    mass = body.mass;
    px += body.vx * mass;
    py += body.vy * mass;
    pz += body.vz * mass;
  }

  body = bodies[0];
  body.vx = -px / SOLAR_MASS;
  body.vy = -py / SOLAR_MASS;
  body.vz = -pz / SOLAR_MASS;
}

function advance(dt) {
  var size = length(bodies);
  var i = 0
  var j = 0
  var bodyi = null
  var bodyj = null
  var vxi = null
  var vyi = null
  var vzi = null
  var dx = null
  var dy = null
  var dz = null
  var d2 = null
  var mag = null
  var massj = null
  var massi = null
  var body = null

  for (i = 0; i < size; i++) {
    bodyi = bodies[i];
    vxi = bodyi.vx;
    vyi = bodyi.vy;
    vzi = bodyi.vz;
    for (j = i + 1; j < size; j++) {
      bodyj = bodies[j];
      dx = bodyi.x - bodyj.x;
      dy = bodyi.y - bodyj.y;
      dz = bodyi.z - bodyj.z;

      d2 = dx * dx + dy * dy + dz * dz;
      mag = dt / (d2 * math.sqrt(d2));

      massj = bodyj.mass;
      vxi -= dx * massj * mag;
      vyi -= dy * massj * mag;
      vzi -= dz * massj * mag;

      massi = bodyi.mass;
      bodyj.vx += dx * massi * mag;
      bodyj.vy += dy * massi * mag;
      bodyj.vz += dz * massi * mag;
    }
    bodyi.vx = vxi;
    bodyi.vy = vyi;
    bodyi.vz = vzi;
  }

  for (i = 0; i < size; i++) {
    body = bodies[i];
    body.x += dt * body.vx;
    body.y += dt * body.vy;
    body.z += dt * body.vz;
  }
}

function energy() {
  var e = 0;
  var size = length(bodies);
  var i = 0
  var j = 0
  var bodyi = null
  var bodyj = null
  var dx = null
  var dy = null
  var dz = null
  var distance = null

  for (i = 0; i < size; i++) {
    bodyi = bodies[i];

    e += 0.5 * bodyi.mass * ( bodyi.vx * bodyi.vx +
      bodyi.vy * bodyi.vy + bodyi.vz * bodyi.vz );

    for (j = i + 1; j < size; j++) {
      bodyj = bodies[j];
      dx = bodyi.x - bodyj.x;
      dy = bodyi.y - bodyj.y;
      dz = bodyi.z - bodyj.z;

      distance = math.sqrt(dx * dx + dy * dy + dz * dz);
      e -= (bodyi.mass * bodyj.mass) / distance;
    }
  }
  return e;
}

var n = arg[0] || 100000
var i = 0

offsetMomentum();

log.console(`n = ${n}`)
log.console(energy().toFixed(9))
for (i = 0; i < n; i++)
  advance(0.01);
log.console(energy().toFixed(9))

$stop()
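advance() in nbody.ce is a kick-then-drift (semi-implicit Euler) update, which is symplectic, so the printed total energy should stay nearly constant over the run. A trimmed two-body Python sketch of the same update can check that invariant; the Sun and Jupiter constants are copied from the file, while the drift tolerance in the check is an assumption.

```python
import math

SOLAR_MASS = 4 * math.pi ** 2
DAYS_PER_YEAR = 365.24

def body(x, y, z, vx, vy, vz, mass):
    return {"x": x, "y": y, "z": z, "vx": vx, "vy": vy, "vz": vz, "mass": mass}

bodies = [
    body(0, 0, 0, 0, 0, 0, SOLAR_MASS),  # Sun
    body(4.84143144246472090e+00, -1.16032004402742839e+00, -1.03622044471123109e-01,
         1.66007664274403694e-03 * DAYS_PER_YEAR,
         7.69901118419740425e-03 * DAYS_PER_YEAR,
         -6.90460016972063023e-05 * DAYS_PER_YEAR,
         9.54791938424326609e-04 * SOLAR_MASS),  # Jupiter
]

def offset_momentum():
    # Give the Sun the velocity that zeroes total momentum.
    for k in ("vx", "vy", "vz"):
        bodies[0][k] = -sum(b[k] * b["mass"] for b in bodies) / SOLAR_MASS

def advance(dt):
    # Kick: pairwise gravitational velocity updates.
    for i, bi in enumerate(bodies):
        for bj in bodies[i + 1:]:
            dx, dy, dz = bi["x"] - bj["x"], bi["y"] - bj["y"], bi["z"] - bj["z"]
            d2 = dx * dx + dy * dy + dz * dz
            mag = dt / (d2 * math.sqrt(d2))
            bi["vx"] -= dx * bj["mass"] * mag
            bi["vy"] -= dy * bj["mass"] * mag
            bi["vz"] -= dz * bj["mass"] * mag
            bj["vx"] += dx * bi["mass"] * mag
            bj["vy"] += dy * bi["mass"] * mag
            bj["vz"] += dz * bi["mass"] * mag
    # Drift: positions advance with the already-updated velocities.
    for b in bodies:
        b["x"] += dt * b["vx"]
        b["y"] += dt * b["vy"]
        b["z"] += dt * b["vz"]

def energy():
    # Kinetic minus pairwise potential energy.
    e = 0.0
    for i, bi in enumerate(bodies):
        e += 0.5 * bi["mass"] * (bi["vx"] ** 2 + bi["vy"] ** 2 + bi["vz"] ** 2)
        for bj in bodies[i + 1:]:
            dx, dy, dz = bi["x"] - bj["x"], bi["y"] - bj["y"], bi["z"] - bj["z"]
            e -= bi["mass"] * bj["mass"] / math.sqrt(dx * dx + dy * dy + dz * dz)
    return e

offset_momentum()
e0 = energy()
for _ in range(1000):
    advance(0.01)
drift = abs(energy() - e0)
```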
@@ -1,76 +1,75 @@
-var nota = use('nota')
-var os = use('os')
-var io = use('io')
+var nota = use('internal/nota')
+var os = use('internal/os')
+var io = use('fd')
 var json = use('json')
 
 var ll = io.slurp('benchmarks/nota.json')
 
 var newarr = []
 var accstr = ""
-for (var i = 0; i < 10000; i++) {
+var i = 0
+var start = null
+var jll = null
+var jsonStr = null
+var nll = null
+var oll = null
+for (i = 0; i < 10000; i++) {
   accstr += i;
-  newarr.push(i.toString())
+  newarr[] = text(i)
 }
 // Arrays to store timing results
 var jsonDecodeTimes = [];
 var jsonEncodeTimes = [];
 var notaEncodeTimes = [];
 var notaDecodeTimes = [];
 var notaSizes = [];
 
 // Run 100 tests
-for (let i = 0; i < 100; i++) {
-  // JSON Decode test
-  let start = os.now();
-  var jll = json.decode(ll);
-  jsonDecodeTimes.push((os.now() - start) * 1000);
-
-  // JSON Encode test
-  start = os.now();
-  let jsonStr = JSON.stringify(jll);
-  jsonEncodeTimes.push((os.now() - start) * 1000);
-
-  // NOTA Encode test
-  start = os.now();
-  var nll = nota.encode(jll);
-  notaEncodeTimes.push((os.now() - start) * 1000);
-
-  // NOTA Decode test
-  start = os.now();
-  var oll = nota.decode(nll);
-  notaDecodeTimes.push((os.now() - start) * 1000);
+for (i = 0; i < 100; i++) {
+  start = os.now();
+  jll = json.decode(ll);
+  jsonDecodeTimes[] = (os.now() - start) * 1000;
+
+  start = os.now();
+  jsonStr = JSON.stringify(jll);
+  jsonEncodeTimes[] = (os.now() - start) * 1000;
+
+  start = os.now();
+  nll = nota.encode(jll);
+  notaEncodeTimes[] = (os.now() - start) * 1000;
+
+  start = os.now();
+  oll = nota.decode(nll);
+  notaDecodeTimes[] = (os.now() - start) * 1000;
 }
 
 // Calculate statistics
 function getStats(arr) {
-  const avg = arr.reduce((a, b) => a + b) / arr.length;
-  const min = Math.min(...arr);
-  const max = Math.max(...arr);
-  return { avg, min, max };
+  return {
+    avg: reduce(arr, (a,b) => a+b, 0) / length(arr),
+    min: reduce(arr, min),
+    max: reduce(arr, max)
+  };
 }
 
 // Pretty print results
-log.console("\n=== Performance Test Results (100 iterations) ===");
+log.console("\n== Performance Test Results (100 iterations) ==");
 log.console("\nJSON Decoding (ms):");
-const jsonDecStats = getStats(jsonDecodeTimes);
+def jsonDecStats = getStats(jsonDecodeTimes);
 log.console(`Average: ${jsonDecStats.avg.toFixed(2)} ms`);
 log.console(`Min: ${jsonDecStats.min.toFixed(2)} ms`);
 log.console(`Max: ${jsonDecStats.max.toFixed(2)} ms`);
 
 log.console("\nJSON Encoding (ms):");
-const jsonEncStats = getStats(jsonEncodeTimes);
+def jsonEncStats = getStats(jsonEncodeTimes);
 log.console(`Average: ${jsonEncStats.avg.toFixed(2)} ms`);
 log.console(`Min: ${jsonEncStats.min.toFixed(2)} ms`);
 log.console(`Max: ${jsonEncStats.max.toFixed(2)} ms`);
 
 log.console("\nNOTA Encoding (ms):");
-const notaEncStats = getStats(notaEncodeTimes);
+def notaEncStats = getStats(notaEncodeTimes);
 log.console(`Average: ${notaEncStats.avg.toFixed(2)} ms`);
 log.console(`Min: ${notaEncStats.min.toFixed(2)} ms`);
 log.console(`Max: ${notaEncStats.max.toFixed(2)} ms`);
 
 log.console("\nNOTA Decoding (ms):");
-const notaDecStats = getStats(notaDecodeTimes);
+def notaDecStats = getStats(notaDecodeTimes);
 log.console(`Average: ${notaDecStats.avg.toFixed(2)} ms`);
 log.console(`Min: ${notaDecStats.min.toFixed(2)} ms`);
 log.console(`Max: ${notaDecStats.max.toFixed(2)} ms`);
 
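The rewritten getStats reduces each timing array to average, minimum, and maximum via the host language's reduce/length builtins. The same reduction in Python, for reference (names are illustrative):

```python
from functools import reduce

def get_stats(arr):
    # Average via an explicit reduce, mirroring the .ce version;
    # min/max use the builtins directly.
    return {
        "avg": reduce(lambda a, b: a + b, arr, 0) / len(arr),
        "min": min(arr),
        "max": max(arr),
    }
```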
2132 benchmarks/nota.json
File diff suppressed because it is too large. Load Diff
64 benchmarks/spectral-norm.ce Normal file
@@ -0,0 +1,64 @@
def math = use('math/radians');

function A(i,j) {
  return 1/((i+j)*(i+j+1)/2+i+1);
}

function Au(u,v) {
  var i = 0
  var j = 0
  var t = null
  for (i = 0; i < length(u); ++i) {
    t = 0;
    for (j = 0; j < length(u); ++j)
      t += A(i,j) * u[j];

    v[i] = t;
  }
}

function Atu(u,v) {
  var i = 0
  var j = 0
  var t = null
  for (i = 0; i < length(u); ++i) {
    t = 0;
    for (j = 0; j < length(u); ++j)
      t += A(j,i) * u[j];

    v[i] = t;
  }
}

function AtAu(u,v,w) {
  Au(u,w);
  Atu(w,v);
}

function spectralnorm(n) {
  var i = 0
  var u = []
  var v = []
  var w = []
  var vv = 0
  var vBv = 0
  for (i = 0; i < n; ++i) {
    u[i] = 1; v[i] = 0; w[i] = 0;
  }

  for (i = 0; i < 10; ++i) {
    AtAu(u,v,w);
    AtAu(v,u,w);
  }

  for (i = 0; i < n; ++i) {
    vBv += u[i]*v[i];
    vv += v[i]*v[i];
  }

  return math.sqrt(vBv/vv);
}

log.console(spectralnorm(arg[0]).toFixed(9));

$stop()
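spectral-norm.ce runs ten rounds of power iteration with the operator AᵗA and reports √(uᵀv / vᵀv), the largest singular value of the infinite matrix A(i,j) = 1/((i+j)(i+j+1)/2 + i + 1) truncated to n×n. A direct Python sketch of the same computation; for n = 100 the Benchmarks Game reference output is 1.274219991.

```python
import math

def A(i, j):
    # (i+j)(i+j+1) is always even, so the division by 2 is exact.
    return 1.0 / ((i + j) * (i + j + 1) / 2 + i + 1)

def mul_Av(u, transpose=False):
    # v = A u, or v = At u when transpose is set.
    n = len(u)
    return [sum((A(j, i) if transpose else A(i, j)) * u[j] for j in range(n))
            for i in range(n)]

def spectralnorm(n):
    u = [1.0] * n
    v = [0.0] * n
    for _ in range(10):
        v = mul_Av(mul_Av(u), transpose=True)  # v = At(A u)
        u = mul_Av(mul_Av(v), transpose=True)  # u = At(A v)
    vBv = sum(ui * vi for ui, vi in zip(u, v))
    vv = sum(vi * vi for vi in v)
    return math.sqrt(vBv / vv)
```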
@@ -1,42 +1,23 @@
-//
-// wota_benchmark.js
-//
-// Usage in QuickJS:
-//   qjs wota_benchmark.js
-//
-// Prerequisite:
-var wota = use('wota');
-var os = use('os');
-// or otherwise ensure `wota` and `os` are available.
-// Make sure wota_benchmark.js is loaded after wota.js or combined with it.
-//
+var wota = use('internal/wota');
+var os = use('internal/os');
+
+var i = 0
 
 // Helper to run a function repeatedly and measure total time in seconds.
 // Returns elapsed time in seconds.
 function measureTime(fn, iterations) {
-  let t1 = os.now();
-  for (let i = 0; i < iterations; i++) {
+  var t1 = os.now();
+  for (i = 0; i < iterations; i++) {
     fn();
   }
-  let t2 = os.now();
+  var t2 = os.now();
   return t2 - t1;
 }
 
 // We'll define a function that does `encode -> decode` for a given value:
 function roundTripWota(value) {
-  let encoded = wota.encode(value);
-  let decoded = wota.decode(encoded);
-  // Not doing a deep compare here, just measuring performance.
-  // (We trust the test suite to verify correctness.)
+  var encoded = wota.encode(value);
+  var decoded = wota.decode(encoded);
 }
 
 // A small suite of data we want to benchmark. Each entry includes:
 //   name: label for printing
 //   data: the test value(s) to encode/decode
 //   iterations: how many times to loop
 //
 // You can tweak these as you like for heavier or lighter tests.
-const benchmarks = [
+def benchmarks = [
   {
     name: "Small Integers",
     data: [0, 42, -1, 2023],
@@ -62,45 +43,28 @@ const benchmarks = [
   },
   {
     name: "Large Array (1k numbers)",
-    // A thousand random numbers
-    data: [ Array.from({length:1000}, (_, i) => i * 0.5) ],
+    data: [ array(1000, i => i *0.5) ],
     iterations: 1000
   },
   {
     name: "Large Binary Blob (256KB)",
     // A 256KB ArrayBuffer
     data: [ new Uint8Array(256 * 1024).buffer ],
     iterations: 200
   }
 ];
 
 // Print a header
 log.console("Wota Encode/Decode Benchmark");
-log.console("============================\n");
+log.console("===================\n");
 
-// We'll run each benchmark scenario in turn.
-for (let bench of benchmarks) {
-  // We'll measure how long it takes to do 'iterations' *for each test value*
-  // in bench.data. The total loop count is `bench.iterations * bench.data.length`.
-  // Then we compute an overall encode+decode throughput (ops/s).
-  let totalIterations = bench.iterations * bench.data.length;
+arrfor(benchmarks, function(bench) {
+  var totalIterations = bench.iterations * length(bench.data);
 
-  // We'll define a function that does a roundTrip for *each* data item in bench.data
-  // to measure in one loop iteration. Then we multiply by bench.iterations.
   function runAllData() {
-    for (let val of bench.data) {
-      roundTripWota(val);
-    }
+    arrfor(bench.data, roundTripWota)
   }
 
-  let elapsedSec = measureTime(runAllData, bench.iterations);
-  let opsPerSec = (totalIterations / elapsedSec).toFixed(1);
+  var elapsedSec = measureTime(runAllData, bench.iterations);
+  var opsPerSec = (totalIterations / elapsedSec).toFixed(1);
 
   log.console(`${bench.name}:`);
-  log.console(`  Iterations: ${bench.iterations} × ${bench.data.length} data items = ${totalIterations}`);
+  log.console(`  Iterations: ${bench.iterations} × ${length(bench.data)} data items = ${totalIterations}`);
   log.console(`  Elapsed: ${elapsedSec.toFixed(3)} s`);
   log.console(`  Throughput: ${opsPerSec} encode+decode ops/sec\n`);
-}
+})
 
 // All done
 log.console("Benchmark completed.\n");
 
@@ -1,80 +1,67 @@
-//
-// benchmark_wota_nota_json.js
-//
-// Usage in QuickJS:
-//   qjs benchmark_wota_nota_json.js
-//
-// Ensure wota, nota, json, and os are all available, e.g.:
-var wota = use('wota');
-var nota = use('nota');
-var json = use('json');
-var os = use('os');
-//
+var wota = use('internal/wota');
+var nota = use('internal/nota');
+var json = use('json');
+var jswota = use('jswota')
+var os = use('internal/os');
 
 ////////////////////////////////////////////////////////////////////////////////
 // 1. Setup "libraries" array to easily switch among Wota, Nota, and JSON
 ////////////////////////////////////////////////////////////////////////////////
+if (length(arg) != 2) {
+  log.console('Usage: cell benchmark_wota_nota_json.ce <LibraryName> <ScenarioName>');
+  $stop()
+}
+
+var lib_name = arg[0];
+var scenario_name = arg[1];
 
-const libraries = [
+def libraries = [
   {
-    name: "Wota",
+    name: "wota",
     encode: wota.encode,
     decode: wota.decode,
     // Wota produces an ArrayBuffer. We'll count `buffer.byteLength` as size.
     getSize(encoded) {
-      return encoded.byteLength;
+      return length(encoded);
     }
   },
   {
-    name: "Nota",
+    name: "nota",
     encode: nota.encode,
     decode: nota.decode,
     // Nota also produces an ArrayBuffer:
     getSize(encoded) {
-      return encoded.byteLength;
+      return length(encoded);
     }
   },
   {
-    name: "JSON",
+    name: "json",
     encode: json.encode,
     decode: json.decode,
     // JSON produces a JS string. We'll measure its UTF-16 code unit length
     // as a rough "size". Alternatively, you could convert to UTF-8 for
     // a more accurate byte size. Here we just use `string.length`.
     getSize(encodedStr) {
-      return encodedStr.length;
+      return length(encodedStr);
     }
   }
 ];
 
 ////////////////////////////////////////////////////////////////////////////////
 // 2. Test data sets (similar to wota benchmarks).
 //    Each scenario has { name, data, iterations }
 ////////////////////////////////////////////////////////////////////////////////
 
-const benchmarks = [
+def benchmarks = [
   {
-    name: "Empty object",
+    name: "empty",
     data: [{}, {}, {}, {}],
     iterations: 10000
   },
   {
-    name: "Small Integers",
+    name: "integers",
     data: [0, 42, -1, 2023],
     iterations: 100000
   },
   {
-    name: "Floating point",
+    name: "floats",
     data: [0.1, 1e-50, 3.14159265359],
     iterations: 100000
   },
   {
-    name: "Strings (short, emoji)",
-    data: ["Hello, Wota!", "short", "Emoji: \u{1f600}\u{1f64f}"],
+    name: "strings",
+    data: ["Hello, wota!", "short", "Emoji: \u{1f600}\u{1f64f}"],
     iterations: 100000
   },
   {
-    name: "Small Objects",
+    name: "objects",
     data: [
       { a:1, b:2.2, c:"3", d:false },
       { x:42, y:null, z:"test" }
@@ -82,108 +69,86 @@ const benchmarks = [
     iterations: 50000
   },
   {
-    name: "Nested Arrays",
+    name: "nested",
     data: [ [ [ [1,2], [3,4] ] ], [[[]]], [1, [2, [3, [4]]]] ],
     iterations: 50000
   },
   {
-    name: "Large Array (1k integers)",
-    data: [ Array.from({length:1000}, (_, i) => i) ],
+    name: "large_array",
+    data: [ array(1000, i => i) ],
     iterations: 1000
   },
   {
     name: "Large Binary Blob (256KB)",
     data: [ new Uint8Array(256 * 1024).buffer ],
     iterations: 200
   }
 ];
 
 ////////////////////////////////////////////////////////////////////////////////
 // 3. Utility: measureTime(fn) => how long fn() takes in seconds.
 ////////////////////////////////////////////////////////////////////////////////
 
 function measureTime(fn) {
-  let start = os.now();
+  var start = os.now();
   fn();
-  let end = os.now();
-  return (end - start); // in seconds
+  var end = os.now();
+  return (end - start);
 }
 
 ////////////////////////////////////////////////////////////////////////////////
 // 4. For each library, we run each benchmark scenario and measure:
 //    - Encoding time (seconds)
 //    - Decoding time (seconds)
 //    - Total encoded size (bytes or code units for JSON)
 //
 ////////////////////////////////////////////////////////////////////////////////
 
 function runBenchmarkForLibrary(lib, bench) {
   // We'll encode and decode each item in `bench.data`.
   // We do 'bench.iterations' times. Then sum up total time.
+  var encodedList = [];
+  var totalSize = 0;
+  var i = 0
+  var j = 0
+  var e = null
 
   // Pre-store the encoded results for all items so we can measure decode time
   // in a separate pass. Also measure total size once.
-  let encodedList = [];
-  let totalSize = 0;
 
   // 1) Measure ENCODING
-  let encodeTime = measureTime(() => {
-    for (let i = 0; i < bench.iterations; i++) {
-      // For each data item, encode it
-      for (let j = 0; j < bench.data.length; j++) {
-        let e = lib.encode(bench.data[j]);
-        // store only in the very first iteration, so we can decode them later
-        // but do not store them every iteration or we blow up memory.
|
||||
if (i === 0) {
|
||||
encodedList.push(e);
|
||||
var encodeTime = measureTime(() => {
|
||||
for (i = 0; i < bench.iterations; i++) {
|
||||
for (j = 0; j < length(bench.data); j++) {
|
||||
e = lib.encode(bench.data[j]);
|
||||
if (i == 0) {
|
||||
encodedList[] = e;
|
||||
totalSize += lib.getSize(e);
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// 2) Measure DECODING
|
||||
let decodeTime = measureTime(() => {
|
||||
for (let i = 0; i < bench.iterations; i++) {
|
||||
// decode everything we stored during the first iteration
|
||||
for (let e of encodedList) {
|
||||
let decoded = lib.decode(e);
|
||||
// not verifying correctness here, just measuring speed
|
||||
}
|
||||
var decodeTime = measureTime(() => {
|
||||
for (i = 0; i < bench.iterations; i++) {
|
||||
arrfor(encodedList, lib.decode)
|
||||
}
|
||||
});
|
||||
|
||||
return { encodeTime, decodeTime, totalSize };
|
||||
}
|
||||
|
||||
////////////////////////////////////////////////////////////////////////////////
|
||||
// 5. Main driver: run across all benchmarks, for each library.
|
||||
////////////////////////////////////////////////////////////////////////////////
|
||||
var lib = libraries[find(libraries, l => l.name == lib_name)];
|
||||
var bench = benchmarks[find(benchmarks, b => b.name == scenario_name)];
|
||||
|
||||
log.console("Benchmark: Wota vs Nota vs JSON");
|
||||
log.console("================================\n");
|
||||
|
||||
for (let bench of benchmarks) {
|
||||
log.console(`SCENARIO: ${bench.name}`);
|
||||
log.console(` Data length: ${bench.data.length} | Iterations: ${bench.iterations}\n`);
|
||||
|
||||
for (let lib of libraries) {
|
||||
let { encodeTime, decodeTime, totalSize } = runBenchmarkForLibrary(lib, bench);
|
||||
|
||||
// We'll compute total operations = bench.iterations * bench.data.length
|
||||
let totalOps = bench.iterations * bench.data.length;
|
||||
let encOpsPerSec = (totalOps / encodeTime).toFixed(1);
|
||||
let decOpsPerSec = (totalOps / decodeTime).toFixed(1);
|
||||
|
||||
log.console(` ${lib.name}:`);
|
||||
log.console(` Encode time: ${encodeTime.toFixed(3)}s => ${encOpsPerSec} encodes/sec [${(encodeTime/bench.iterations)*1000000000} ns/try]`);
|
||||
log.console(` Decode time: ${decodeTime.toFixed(3)}s => ${decOpsPerSec} decodes/sec [${(decodeTime/bench.iterations)*1000000000}/try]`);
|
||||
log.console(` Total size: ${totalSize} bytes (or code units for JSON)`);
|
||||
log.console("");
|
||||
}
|
||||
log.console("---------------------------------------------------------\n");
|
||||
if (!lib) {
|
||||
log.console('Unknown library:', lib_name);
|
||||
log.console('Available libraries:', text(array(libraries, l => l.name), ', '));
|
||||
$stop()
|
||||
}
|
||||
|
||||
log.console("Benchmark complete.\n");
|
||||
if (!bench) {
|
||||
log.console('Unknown scenario:', scenario_name);
|
||||
log.console('Available scenarios:', text(array(benchmarks, b => b.name), ', '));
|
||||
$stop()
|
||||
}
|
||||
|
||||
os.exit()
|
||||
var bench_result = runBenchmarkForLibrary(lib, bench);
|
||||
var encodeTime = bench_result.encodeTime;
|
||||
var decodeTime = bench_result.decodeTime;
|
||||
var totalSize = bench_result.totalSize;
|
||||
|
||||
var totalOps = bench.iterations * length(bench.data);
|
||||
var result = {
|
||||
lib: lib_name,
|
||||
scenario: scenario_name,
|
||||
encodeTime: encodeTime,
|
||||
decodeTime: decodeTime,
|
||||
totalSize: totalSize,
|
||||
totalOps: totalOps,
|
||||
encodeOpsPerSec: totalOps / encodeTime,
|
||||
decodeOpsPerSec: totalOps / decodeTime,
|
||||
encodeNsPerOp: (encodeTime / totalOps) * 1e9,
|
||||
decodeNsPerOp: (decodeTime / totalOps) * 1e9
|
||||
};
|
||||
|
||||
log.console(result);
|
||||
|
||||
$stop()
|
||||
|
||||
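The benchmark above uses a deliberate two-pass pattern: encoded outputs are stored only during the first iteration (so memory stays bounded), and decoding is timed in a separate pass over that stored list. A minimal Python model of the same pattern, using the stdlib `json` module as a stand-in codec (the `run_benchmark` name and dict-of-callables shape are illustrative, not the cell script's API):

```python
import json
import time

def run_benchmark(lib, data, iterations):
    """Encode every item `iterations` times, keeping only the first
    iteration's outputs so decode speed can be timed separately."""
    encoded = []
    total_size = 0

    t0 = time.perf_counter()
    for i in range(iterations):
        for item in data:
            e = lib["encode"](item)
            if i == 0:                      # store once, not every iteration
                encoded.append(e)
                total_size += lib["get_size"](e)
    encode_time = time.perf_counter() - t0

    t0 = time.perf_counter()
    for _ in range(iterations):
        for e in encoded:
            lib["decode"](e)                # speed only, no verification
    decode_time = time.perf_counter() - t0

    return {"encode_time": encode_time,
            "decode_time": decode_time,
            "total_size": total_size}

# JSON "size" here is string length (code units), as in the script above.
json_lib = {"encode": json.dumps, "decode": json.loads, "get_size": len}
result = run_benchmark(json_lib, [0, 42, -1, 2023], 100)
```

Storing only the first iteration's outputs keeps the decode pass honest (same inputs every round) without multiplying memory by the iteration count.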
boot.ce (new file, 192 lines)
@@ -0,0 +1,192 @@
// cell boot [--native] <program> - Pre-compile all module dependencies in parallel
//
// Discovers all transitive module dependencies for a program,
// checks which are not yet cached, and compiles uncached ones
// in parallel using worker actors composed via parallel() requestors.
//
// Also used as a child actor by engine.cm for auto-boot.

var shop = use('internal/shop')
var fd = use('fd')
var pkg_tools = use('package')
var build = use('build')

var is_native = false
var target_prog = null
var target_pkg = null
var i = 0

// Child actor mode: receive message from engine.cm
var _child_mode = false
var run_boot = null

$receiver(function(msg) {
  _child_mode = true
  is_native = msg.native || false
  target_prog = msg.program
  target_pkg = msg.package
  run_boot()
})

// CLI mode: parse arguments
if (args && length(args) > 0) {
  for (i = 0; i < length(args); i = i + 1) {
    if (args[i] == '--native') {
      is_native = true
    } else if (args[i] == '--help' || args[i] == '-h') {
      log.console("Usage: cell boot [--native] <program>")
      log.console("")
      log.console("Pre-compile all module dependencies for a program.")
      log.console("Uncached modules are compiled in parallel.")
      $stop()
    } else if (!starts_with(args[i], '-')) {
      target_prog = args[i]
    }
  }
  if (!target_prog) {
    log.error("boot: no program specified")
    $stop()
  }
}

// Discover all transitive module dependencies for a file
function discover_deps(file_path) {
  return shop.trace_deps(file_path)
}

// Filter out already-cached modules
function filter_uncached(deps) {
  var uncached = []
  var j = 0
  var s = null

  j = 0
  while (j < length(deps.scripts)) {
    s = deps.scripts[j]
    if (is_native) {
      if (!shop.is_native_cached(s.path, s.package)) {
        uncached[] = {type: 'native_script', path: s.path, package: s.package}
      }
    } else {
      if (!shop.is_cached(s.path)) {
        uncached[] = {type: 'script', path: s.path, package: s.package}
      }
    }
    j = j + 1
  }

  // Expand C packages into individual files for parallel compilation
  var target = build.detect_host_target()
  var pkg = null
  var c_files = null
  var k = 0
  j = 0
  while (j < length(deps.c_packages)) {
    pkg = deps.c_packages[j]
    if (pkg != 'core') {
      c_files = pkg_tools.get_c_files(pkg, target, true)
      k = 0
      while (k < length(c_files)) {
        uncached[] = {type: 'c_file', package: pkg, file: c_files[k]}
        k = k + 1
      }
    }
    j = j + 1
  }

  return uncached
}

function item_name(item) {
  if (item.path) return item.path
  if (item.file) return item.package + '/' + item.file
  return item.package
}

// Create a requestor that spawns a compile_worker actor for one item
function make_compile_requestor(item) {
  var worker = null
  var name = item_name(item)
  return function(callback, value) {
    log.console('boot: spawning worker for ' + name)
    $start(function(event) {
      if (event.type == 'greet') {
        worker = event.actor
        send(event.actor, {
          type: item.type,
          path: item.path,
          package: item.package,
          file: item.file
        })
      }
      if (event.type == 'stop') {
        callback(name)
      }
      if (event.type == 'disrupt') {
        log.error('boot: worker failed for ' + name)
        callback(null, {message: 'compile failed: ' + name})
      }
    }, 'compile_worker')
    return function cancel(reason) {
      if (worker) $stop(worker)
    }
  }
}

run_boot = function() {
  var prog_path = null
  var prog_info = null
  var deps = null
  var uncached = null
  var requestors = null
  var p = null

  // Resolve the program path
  if (target_prog) {
    p = target_prog
    if (ends_with(p, '.ce')) p = text(p, 0, -3)
    prog_info = shop.resolve_program ? shop.resolve_program(p, target_pkg) : null
    if (prog_info) {
      prog_path = prog_info.path
      if (!target_pkg && prog_info.pkg) target_pkg = prog_info.pkg
    } else {
      prog_path = p + '.ce'
      if (!fd.is_file(prog_path)) {
        prog_path = null
      }
    }
  }

  if (!prog_path || !fd.is_file(prog_path)) {
    log.error('boot: could not find program: ' + text(target_prog || ''))
    $stop()
    return
  }

  // Discover all transitive deps
  deps = discover_deps(prog_path)
  uncached = filter_uncached(deps)

  if (length(uncached) == 0) {
    log.console('boot: all modules cached')
    $stop()
    return
  }

  // Compile uncached modules in parallel using worker actors
  log.console('boot: ' + text(length(uncached)) + ' modules to compile')
  requestors = array(uncached, make_compile_requestor)
  parallel(requestors)(function(results, reason) {
    if (reason) {
      log.error('boot: ' + (reason.message || text(reason)))
    } else {
      log.console('boot: compiled ' + text(length(results)) + ' modules')
    }
    $stop()
  }, null)
}

// CLI mode: start immediately
if (!_child_mode && target_prog) {
  run_boot()
}
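The core of `boot.ce` is a filter-then-fan-out shape: discover dependencies, drop the ones already cached, and compile the remainder concurrently, collecting one result per item. A small Python sketch of that shape, using `concurrent.futures` in place of cell's actor-based `parallel()` requestors (the names `boot`, `is_cached`, and `compile_one` are hypothetical stand-ins, not the real shop/build API):

```python
from concurrent.futures import ThreadPoolExecutor

def boot(deps, is_cached, compile_one):
    """Filter out cached modules, then compile the rest in parallel.
    Returns one result per uncached item, in input order."""
    uncached = [d for d in deps if not is_cached(d)]
    if not uncached:
        return []                      # nothing to do: all modules cached
    with ThreadPoolExecutor() as pool:
        return list(pool.map(compile_one, uncached))

# Toy usage: "a" is cached; "b" and "c" get "compiled" (upper-cased).
cache = {"a"}
done = boot(["a", "b", "c"], cache.__contains__, str.upper)
# done == ["B", "C"]
```

As in the cell version, failure handling belongs in the per-item worker: `pool.map` propagates the first exception, which mirrors `parallel()` reporting a `reason` to its final callback.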
boot/bootstrap.cm.mcode (new file, 5417 lines): diff suppressed, lines too long
boot/fold.cm.mcode (new file, 5963 lines): diff suppressed, too large
boot/mcode.cm.mcode (new file, 14655 lines): diff suppressed, lines too long
boot/parse.cm.mcode (new file, 12058 lines): diff suppressed, lines too long
boot/qbe.cm.mcode (new file, 2764 lines): diff suppressed, too large
boot/qbe_emit.cm.mcode (new file, 34991 lines): diff suppressed, lines too long
boot/streamline.cm.mcode (new file, 16839 lines): diff suppressed, lines too long
boot/tokenize.cm.mcode (new file, 4254 lines): diff suppressed, lines too long

boot_miscompile_bad.cm (new file, 74 lines)
@@ -0,0 +1,74 @@
// boot_miscompile_bad.cm — Documents a boot compiler miscompilation bug.
//
// BUG SUMMARY:
// The boot compiler's optimizer (likely compress_slots, eliminate_moves,
// or infer_param_types) miscompiles a specific pattern when it appears
// inside streamline.cm. The pattern: an array-loaded value used as a
// dynamic index for another array store, inside a guarded block:
//
//   sv = instr[j]
//   if (is_number(sv) && sv >= 0 && sv < nr_slots) {
//     last_ref[sv] = i   // <-- miscompiled: sv reads wrong slot
//   }
//
// The bug is CONTEXT-DEPENDENT on streamline.cm's exact function/closure
// structure. A standalone module with the same pattern does NOT trigger it.
// The boot optimizer's cross-function analysis (infer_param_types, type
// propagation, etc.) makes different decisions in the full streamline.cm
// context, leading to the miscompilation.
//
// SYMPTOMS:
// - 'log' is not defined (comparison error path fires on non-comparable values)
// - array index must be a number (store_dynamic with corrupted index)
// - Error line has NO reference to 'log' — the reference comes from the
//   error-reporting code path of the < operator
// - Non-deterministic: different error messages on different runs
// - NOT a GC bug: persists with --heap 4GB
// - NOT slot overflow: function has only 85 raw slots
//
// TO REPRODUCE:
// In streamline.cm, replace the build_slot_liveness function body with
// this version (raw operand scanning instead of get_slot_refs):
//
// var build_slot_liveness = function(instructions, nr_slots) {
//   var last_ref = array(nr_slots, -1)
//   var n = length(instructions)
//   var i = 0
//   var j = 0
//   var limit = 0
//   var sv = 0
//   var instr = null
//
//   while (i < n) {
//     instr = instructions[i]
//     if (is_array(instr)) {
//       j = 1
//       limit = length(instr) - 2
//       while (j < limit) {
//         sv = instr[j]
//         if (is_number(sv) && sv >= 0 && sv < nr_slots) {
//           last_ref[sv] = i
//         }
//         j = j + 1
//       }
//     }
//     i = i + 1
//   }
//   return last_ref
// }
//
// Then: rm -rf .cell/build && ./cell --dev vm_suite
//
// WORKAROUND:
// Use get_slot_refs(instr) to iterate only over known slot-reference
// positions. This produces different IR that the boot optimizer handles
// correctly, and is also more semantically correct.
//
// FIXING:
// To find the root cause, compare the boot-compiled bytecodes of
// build_slot_liveness (in the full streamline.cm context) vs the
// source-compiled bytecodes. Use disasm.ce with --optimized to see
// what the source compiler produces. The boot-compiled bytecodes
// would need a C-level MachCode dump to inspect.

return null
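For reference, the intended semantics of the miscompiled function are easy to state: record, for each slot, the index of the last instruction that mentions it, scanning operand positions `1 .. length-3` of each array-shaped instruction. A Python model of that intent (a specification to compare the buggy output against, not the cell source):

```python
def build_slot_liveness(instructions, nr_slots):
    """Last instruction index that references each slot; -1 = never seen.
    Mirrors the cell loop: j runs from 1 while j < length(instr) - 2,
    i.e. Python slice instr[1 : len(instr) - 2]."""
    last_ref = [-1] * nr_slots
    for i, instr in enumerate(instructions):
        if isinstance(instr, list):
            for sv in instr[1:len(instr) - 2]:
                # Guard: only in-range numeric operands are slot refs.
                if isinstance(sv, int) and 0 <= sv < nr_slots:
                    last_ref[sv] = i
    return last_ref

liveness = build_slot_liveness(
    [["op", 0, 2, "x", "y"],   # operands scanned: 0, 2
     ["op", 1, "x", "y"]],     # operand scanned: 1
    3)
# liveness == [0, 1, 0]
```

In the bug, the cell equivalent of `last_ref[sv] = i` stores through a corrupted `sv`; this model gives the value the guarded store should have used.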
build.ce (new file, 130 lines)
@@ -0,0 +1,130 @@
// cell build [<locator>] - Build dynamic libraries locally for the current machine
//
// Usage:
//   cell build              Build dynamic libraries for all packages in shop
//   cell build .            Build dynamic library for current directory package
//   cell build <locator>    Build dynamic library for specific package
//   cell build -t <target>  Cross-compile dynamic libraries for target platform
//   cell build -b <type>    Build type: release (default), debug, or minsize
//   cell build --verbose    Print resolved flags, commands, and cache status

var build = use('build')
var shop = use('internal/shop')
var pkg_tools = use('package')
var fd = use('fd')

var target = null
var target_package = null
var buildtype = 'release'
var verbose = false
var force_rebuild = false
var dry_run = false
var i = 0
var targets = null
var t = 0
var lib = null
var results = null
var success = 0
var failed = 0

var run = function() {
  for (i = 0; i < length(args); i++) {
    if (args[i] == '-t' || args[i] == '--target') {
      if (i + 1 < length(args)) {
        target = args[++i]
      } else {
        log.error('-t requires a target')
        return
      }
    } else if (args[i] == '-p' || args[i] == '--package') {
      // Legacy support for -p flag
      if (i + 1 < length(args)) {
        target_package = args[++i]
      } else {
        log.error('-p requires a package name')
        return
      }
    } else if (args[i] == '-b' || args[i] == '--buildtype') {
      if (i + 1 < length(args)) {
        buildtype = args[++i]
        if (buildtype != 'release' && buildtype != 'debug' && buildtype != 'minsize') {
          log.error('Invalid buildtype: ' + buildtype + '. Must be release, debug, or minsize')
          return
        }
      } else {
        log.error('-b requires a buildtype (release, debug, minsize)')
        return
      }
    } else if (args[i] == '--force') {
      force_rebuild = true
    } else if (args[i] == '--verbose' || args[i] == '-v') {
      verbose = true
    } else if (args[i] == '--dry-run') {
      dry_run = true
    } else if (args[i] == '--list-targets') {
      log.console('Available targets:')
      targets = build.list_targets()
      for (t = 0; t < length(targets); t++) {
        log.console('  ' + targets[t])
      }
      return
    } else if (!starts_with(args[i], '-') && !target_package) {
      // Positional argument - treat as package locator
      target_package = args[i]
    }
  }

  if (target_package)
    target_package = shop.resolve_locator(target_package)

  // Detect target if not specified
  if (!target) {
    target = build.detect_host_target()
    if (target) log.console('Target: ' + target)
  }

  if (target && !build.has_target(target)) {
    log.error('Invalid target: ' + target)
    log.console('Available targets: ' + text(build.list_targets(), ', '))
    return
  }

  var packages = shop.list_packages()
  arrfor(packages, function(package) {
    if (package == 'core') return
    shop.sync(package, {no_build: true})
  })

  var _build = null
  if (target_package) {
    // Build single package
    log.console('Building ' + target_package + '...')
    _build = function() {
      lib = build.build_dynamic(target_package, target, buildtype, {verbose: verbose, force: force_rebuild})
      if (lib) {
        log.console(`Built ${text(length(lib))} module(s)`)
      }
    } disruption {
      log.error('Build failed')
      $stop()
    }
    _build()
  } else {
    // Build all packages
    log.console('Building all packages...')
    results = build.build_all_dynamic(target, buildtype, {verbose: verbose, force: force_rebuild})

    success = 0
    failed = 0
    for (i = 0; i < length(results); i++) {
      if (results[i].modules) {
        success = success + length(results[i].modules)
      }
    }

    log.console(`Build complete: ${success} libraries built${failed > 0 ? `, ${failed} failed` : ''}`)
  }
}
run()

$stop()
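The argument loop in `build.ce` follows a common hand-rolled CLI convention: value-taking flags (`-t`, `-b`, `-p`) consume the next argument, boolean flags just set a state variable, and the first bare word becomes the package locator. A compact Python model of that convention (function and key names are illustrative, not the cell CLI's internals):

```python
def parse_build_args(args):
    """Minimal model of the flag loop above: flags that take a value
    consume the following argument; the first non-flag word is kept
    as the package locator."""
    opts = {"target": None, "buildtype": "release",
            "package": None, "verbose": False}
    i = 0
    while i < len(args):
        a = args[i]
        if a in ("-t", "--target"):
            i += 1                      # consume the flag's value
            opts["target"] = args[i]
        elif a in ("-b", "--buildtype"):
            i += 1
            opts["buildtype"] = args[i]
        elif a in ("-v", "--verbose"):
            opts["verbose"] = True
        elif not a.startswith("-") and opts["package"] is None:
            opts["package"] = a         # first positional wins
        i += 1
    return opts

opts = parse_build_args(["-t", "linux-x64", "-b", "debug", "mypkg"])
```

The same "value flags advance the index" idiom is what makes `args[++i]` work in the cell loop.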
cake/playdate.cm (new file, 1 line)
@@ -0,0 +1 @@
// cake file for making a playdate package

cell.toml (new file, 2 lines)
@@ -0,0 +1,2 @@
[compilation.playdate]
CFLAGS = "-DMINIZ_NO_TIME -DTARGET_EXTENSION -DTARGET_PLAYDATE -I$LOCAL/PlaydateSDK/C_API"

cellfs.cm (new file, 589 lines)
@@ -0,0 +1,589 @@
var cellfs = {}

var fd = use('fd')
var miniz = use('miniz')
var qop = use('internal/qop')
var wildstar = use('internal/wildstar')
var blib = use('blob')

var mounts = []

var write_mount = null

function normalize_path(path) {
  if (!path) return ""
  return replace(path, /^\/+|\/+$/, "")
}

function mount_exists(mount, path) {
  var result = false
  var full_path = null
  var st = null
  var _check = null
  if (mount.type == 'zip') {
    _check = function() {
      mount.handle.mod(path)
      result = true
    } disruption {}
    _check()
  } else if (mount.type == 'qop') {
    _check = function() {
      result = mount.handle.stat(path) != null
    } disruption {}
    _check()
  } else if (mount.type == 'fs') {
    full_path = fd.join_paths(mount.source, path)
    _check = function() {
      st = fd.stat(full_path)
      result = st.isFile || st.isDirectory
    } disruption {}
    _check()
  }
  return result
}

function is_directory(path) {
  var res = resolve(path)
  var mount = res.mount
  var result = false
  var full_path = null
  var st = null
  var _check = null
  if (mount.type == 'zip') {
    _check = function() {
      result = mount.handle.is_directory(path)
    } disruption {}
    _check()
  } else if (mount.type == 'qop') {
    _check = function() {
      result = mount.handle.is_directory(path)
    } disruption {}
    _check()
  } else {
    full_path = fd.join_paths(mount.source, path)
    _check = function() {
      st = fd.stat(full_path)
      result = st.isDirectory
    } disruption {}
    _check()
  }
  return result
}

function resolve(path, must_exist) {
  var idx = null
  var mount_name = ""
  var rel_path = ""
  var mount = null
  var found_mount = null
  var npath = normalize_path(path)

  if (starts_with(npath, "@")) {
    idx = search(npath, "/")

    if (idx == null) {
      mount_name = text(npath, 1)
      rel_path = ""
    } else {
      mount_name = text(npath, 1, idx)
      rel_path = text(npath, idx + 1)
    }

    arrfor(mounts, function(m) {
      if (m.name == mount_name) {
        mount = m
        return true
      }
    }, false, true)

    if (!mount) {
      log.error("Unknown mount point: @" + mount_name); disrupt
    }

    return { mount: mount, path: rel_path }
  }

  arrfor(mounts, function(m) {
    if (mount_exists(m, npath)) {
      found_mount = { mount: m, path: npath }
      return true
    }
  }, false, true)

  if (found_mount) {
    return found_mount
  }

  if (must_exist) {
    log.error("File not found in any mount: " + npath); disrupt
  }
}
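The `resolve` logic above has two lookup modes: a path beginning with `@name/` targets a specific mount by name, while a bare path falls back to the first mount that actually contains it (mount order is the search order). A Python sketch of that resolution rule, with mounts reduced to name-plus-file-set dicts (the `resolve` signature and mount shape here are simplified models, not the cellfs API):

```python
def resolve(path, mounts):
    """'@name/rest' picks the named mount; otherwise the first mount
    containing the path wins. Each mount: {'name': str, 'files': set}.
    Returns (mount, relative_path), or None if nothing matches."""
    npath = path.strip("/")
    if npath.startswith("@"):
        name, _, rel = npath[1:].partition("/")
        for m in mounts:
            if m["name"] == name:
                return m, rel
        raise LookupError("unknown mount point: @" + name)
    for m in mounts:                       # mount order = search order
        if npath in m["files"]:
            return m, npath
    return None

mounts = [{"name": "a", "files": {"x.txt"}},
          {"name": "b", "files": {"x.txt", "y.txt"}}]
```

Note the precedence consequence: a bare `x.txt` always comes from mount `a` even though `b` also has it, while `@b/x.txt` bypasses the search entirely.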
function mount(source, name) {
  var st = null
  var blob = null
  var qop_archive = null
  var zip = null
  var _try_qop = null
  var http = null

  var mount_info = {
    source: source,
    name: name || null,
    type: 'fs',
    handle: null,
    zip_blob: null
  }

  if (starts_with(source, 'http://') || starts_with(source, 'https://')) {
    http = use('http')
    mount_info.type = 'http'
    mount_info.handle = {
      base_url: source,
      get: function(path, callback) {
        var url = source + '/' + path
        $clock(function(_t) {
          var resp = http.request('GET', url, null, null)
          if (resp && resp.status == 200) {
            callback(resp.body)
          } else {
            callback(null, "HTTP " + text(resp ? resp.status : 0) + ": " + url)
          }
        })
      }
    }
    mounts[] = mount_info
    return
  }

  st = fd.stat(source)

  if (st.isDirectory) {
    mount_info.type = 'fs'
  } else if (st.isFile) {
    blob = fd.slurp(source)

    qop_archive = null
    _try_qop = function() {
      qop_archive = qop.open(blob)
    } disruption {}
    _try_qop()

    if (qop_archive) {
      mount_info.type = 'qop'
      mount_info.handle = qop_archive
      mount_info.zip_blob = blob
    } else {
      zip = miniz.read(blob)
      if (!is_object(zip) || !is_function(zip.count)) {
        log.error("Invalid archive file (not zip or qop): " + source); disrupt
      }

      mount_info.type = 'zip'
      mount_info.handle = zip
      mount_info.zip_blob = blob
    }
  } else {
    log.error("Unsupported mount source type: " + source); disrupt
  }

  mounts[] = mount_info
}

function unmount(name_or_source) {
  mounts = filter(mounts, function(m) {
    return m.name != name_or_source && m.source != name_or_source
  })
}

function slurp(path) {
  var res = resolve(path, true)
  var data = null
  var full_path = null
  if (!res) { log.error("File not found: " + path); disrupt }

  if (res.mount.type == 'zip') {
    return res.mount.handle.slurp(res.path)
  } else if (res.mount.type == 'qop') {
    data = res.mount.handle.read(res.path)
    if (!data) { log.error("File not found in qop: " + path); disrupt }
    return data
  } else {
    full_path = fd.join_paths(res.mount.source, res.path)
    return fd.slurp(full_path)
  }
}

function slurpwrite(path, data) {
  var full_path = null
  if (write_mount) {
    full_path = fd.join_paths(write_mount.source, path)
  } else {
    full_path = fd.join_paths(".", path)
  }
  fd.slurpwrite(full_path, data)
}

function exists(path) {
  var res = resolve(path, false)
  if (starts_with(path, "@")) {
    return mount_exists(res.mount, res.path)
  }
  return res != null
}

function stat(path) {
  var res = resolve(path, true)
  var mod = null
  var s = null
  var full_path = null
  if (!res) { log.error("File not found: " + path); disrupt }

  if (res.mount.type == 'zip') {
    mod = res.mount.handle.mod(res.path)
    return {
      filesize: 0,
      modtime: mod * 1000,
      isDirectory: false
    }
  } else if (res.mount.type == 'qop') {
    s = res.mount.handle.stat(res.path)
    if (!s) { log.error("File not found in qop: " + path); disrupt }
    return {
      filesize: s.size,
      modtime: s.modtime,
      isDirectory: s.isDirectory
    }
  } else {
    full_path = fd.join_paths(res.mount.source, res.path)
    s = fd.stat(full_path)
    return {
      filesize: s.size,
      modtime: s.mtime,
      isDirectory: s.isDirectory
    }
  }
}

function searchpath() {
  return array(mounts)
}

function mount_package(name) {
  if (name == null) {
    mount('.', null)
    return
  }

  var shop = use('internal/shop')
  var dir = shop.get_package_dir(name)

  if (!dir) {
    log.error("Package not found: " + name); disrupt
  }

  mount(dir, name)
}

function match(str, pattern) {
  return wildstar.match(pattern, str, wildstar.WM_PATHNAME | wildstar.WM_PERIOD | wildstar.WM_WILDSTAR)
}

function rm(path) {
  var res = resolve(path, true)
  var full_path = null
  var st = null
  if (res.mount.type != 'fs') { log.error("Cannot delete from non-fs mount"); disrupt }

  full_path = fd.join_paths(res.mount.source, res.path)
  st = fd.stat(full_path)
  if (st.isDirectory) fd.rmdir(full_path)
  else fd.unlink(full_path)
}

function mkdir(path) {
  var full = null
  if (write_mount) {
    full = fd.join_paths(write_mount.source, path)
  } else {
    full = fd.join_paths(".", path)
  }
  fd.mkdir(full)
}

function set_writepath(mount_name) {
  var found = null
  if (mount_name == null) { write_mount = null; return }
  arrfor(mounts, function(m) {
    if (m.name == mount_name) { found = m; return true }
  }, false, true)
  if (!found || found.type != 'fs') {
    log.error("writepath: must be an fs mount"); disrupt
  }
  write_mount = found
}

function basedir() {
  return fd.getcwd()
}

function prefdir(org, app) {
  return "./"
}

function realdir(path) {
  var res = resolve(path, false)
  if (!res) return null
  return fd.join_paths(res.mount.source, res.path)
}

function enumerate(_path, recurse) {
  var path = _path == null ? "" : _path

  var res = resolve(path, true)
  var results = []
  var full = null
  var st = null
  var all = null
  var prefix = null
  var prefix_len = null
  var seen = null

  function visit(curr_full, rel_prefix) {
    var list = fd.readdir(curr_full)
    if (!list) return

    arrfor(list, function(item) {
      var item_rel = rel_prefix ? rel_prefix + "/" + item : item
      var child_st = null
      results[] = item_rel

      if (recurse) {
        child_st = fd.stat(fd.join_paths(curr_full, item))
        if (child_st.isDirectory) {
          visit(fd.join_paths(curr_full, item), item_rel)
        }
      }
    })
  }

  if (res.mount.type == 'fs') {
    full = fd.join_paths(res.mount.source, res.path)
    st = fd.stat(full)
    if (st && st.isDirectory) {
      visit(full, "")
    }
  } else if (res.mount.type == 'qop') {
    all = res.mount.handle.list()
    prefix = res.path ? res.path + "/" : ""
    prefix_len = length(prefix)

    seen = {}

    arrfor(all, function(p) {
      var rel = null
      var slash = null
      if (starts_with(p, prefix)) {
        rel = text(p, prefix_len)
        if (length(rel) == 0) return

        if (!recurse) {
          slash = search(rel, '/')
          if (slash != null) {
            rel = text(rel, 0, slash)
          }
        }

        if (!seen[rel]) {
          seen[rel] = true
          results[] = rel
        }
      }
    })
  }

  return results
}

function globfs(globs, _dir) {
  var dir = _dir == null ? "" : _dir
  var res = resolve(dir, true)
  var results = []
  var full = null
  var st = null
  var all = null
  var prefix = null
  var prefix_len = null

  function check_neg(path) {
    var result = false
    arrfor(globs, function(g) {
      if (starts_with(g, "!") && wildstar.match(text(g, 1), path, wildstar.WM_WILDSTAR)) {
        result = true
        return true
      }
    }, false, true)
    return result
  }

  function check_pos(path) {
    var result = false
    arrfor(globs, function(g) {
      if (!starts_with(g, "!") && wildstar.match(g, path, wildstar.WM_WILDSTAR)) {
        result = true
        return true
      }
    }, false, true)
    return result
  }

  function visit(curr_full, rel_prefix) {
    if (rel_prefix && check_neg(rel_prefix)) return

    var list = fd.readdir(curr_full)
    if (!list) return

    arrfor(list, function(item) {
      var item_rel = rel_prefix ? rel_prefix + "/" + item : item

      var child_full = fd.join_paths(curr_full, item)
      var child_st = fd.stat(child_full)

      if (child_st.isDirectory) {
        if (!check_neg(item_rel)) {
          visit(child_full, item_rel)
        }
      } else {
        if (!check_neg(item_rel) && check_pos(item_rel)) {
          results[] = item_rel
        }
      }
    })
  }

  if (res.mount.type == 'fs') {
    full = fd.join_paths(res.mount.source, res.path)
    st = fd.stat(full)
    if (st && st.isDirectory) {
      visit(full, "")
    }
  } else if (res.mount.type == 'qop') {
    all = res.mount.handle.list()
    prefix = res.path ? res.path + "/" : ""
    prefix_len = length(prefix)

    arrfor(all, function(p) {
      var rel = null
      if (starts_with(p, prefix)) {
        rel = text(p, prefix_len)
        if (length(rel) == 0) return

        if (!check_neg(rel) && check_pos(rel)) {
          results[] = rel
        }
      }
    })
  }

  return results
}
|
||||
// Requestor factory: returns a requestor for reading a file at path
|
||||
function get(path) {
|
||||
return function get_requestor(callback, value) {
|
||||
var res = resolve(path, false)
|
||||
var full = null
|
||||
var f = null
|
||||
var acc = null
|
||||
var cancelled = false
|
||||
var data = null
|
||||
var _close = null
|
||||
|
||||
if (!res) { callback(null, "not found: " + path); return }
|
||||
|
||||
if (res.mount.type == 'zip') {
|
||||
callback(res.mount.handle.slurp(res.path))
|
||||
return
|
||||
}
|
||||
if (res.mount.type == 'qop') {
|
||||
data = res.mount.handle.read(res.path)
|
||||
if (data) {
|
||||
callback(data)
|
||||
} else {
|
||||
callback(null, "not found in qop: " + path)
|
||||
}
|
||||
return
|
||||
}
|
||||
if (res.mount.type == 'http') {
|
||||
res.mount.handle.get(res.path, callback)
|
||||
return
|
||||
}
|
||||
|
||||
full = fd.join_paths(res.mount.source, res.path)
|
||||
f = fd.open(full, 'r')
|
||||
acc = blob()
|
||||
|
||||
function next(_t) {
|
||||
var chunk = null
|
||||
if (cancelled) return
|
||||
chunk = fd.read(f, 65536)
|
||||
if (length(chunk) == 0) {
|
||||
fd.close(f)
|
||||
stone(acc)
|
||||
callback(acc)
|
||||
return
|
||||
}
|
||||
acc.write_blob(chunk)
|
||||
$clock(next)
|
||||
}
|
||||
|
||||
next()
|
||||
return function cancel() {
|
||||
cancelled = true
|
||||
_close = function() { fd.close(f) } disruption {}
|
||||
_close()
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
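// Example (illustrative; the callback follows the get_requestor shape above:
// a value on success, or null plus a reason text on failure):
//
//     cellfs.get("assets/config.toml")(function(value, reason) {
//         if (value == null) log.error(reason)
//     })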
// Requestor factory: returns a requestor for writing data to path
|
||||
function put(path, data) {
|
||||
return function put_requestor(callback, value) {
|
||||
var _data = data != null ? data : value
|
||||
var full = null
|
||||
var _do = null
|
||||
if (!write_mount) { callback(null, "no write mount set"); return }
|
||||
full = fd.join_paths(write_mount.source, path)
|
||||
_do = function() {
|
||||
fd.slurpwrite(full, _data)
|
||||
callback(true)
|
||||
} disruption {
|
||||
callback(null, "write failed: " + path)
|
||||
}
|
||||
_do()
|
||||
}
|
||||
}
|
||||
|
||||
cellfs.mount = mount
|
||||
cellfs.mount_package = mount_package
|
||||
cellfs.unmount = unmount
|
||||
cellfs.slurp = slurp
|
||||
cellfs.slurpwrite = slurpwrite
|
||||
cellfs.exists = exists
|
||||
cellfs.is_directory = is_directory
|
||||
cellfs.stat = stat
|
||||
cellfs.searchpath = searchpath
|
||||
cellfs.match = match
|
||||
cellfs.enumerate = enumerate
|
||||
cellfs.globfs = globfs
|
||||
cellfs.rm = rm
|
||||
cellfs.mkdir = mkdir
|
||||
cellfs.writepath = set_writepath
|
||||
cellfs.basedir = basedir
|
||||
cellfs.prefdir = prefdir
|
||||
cellfs.realdir = realdir
|
||||
cellfs.get = get
|
||||
cellfs.put = put
|
||||
cellfs.resolve = resolve
|
||||
|
||||
return cellfs
|
||||
456
cfg.ce
Normal file
@@ -0,0 +1,456 @@
// cfg.ce — control flow graph
//
// Usage:
//   cell cfg --fn <N|name> <file>        Text CFG for function
//   cell cfg --dot --fn <N|name> <file>  DOT output for graphviz
//   cell cfg <file>                      Text CFG for all functions
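//
// Example (illustrative; assumes the emitted DOT text can be piped to graphviz):
//   cell cfg --dot --fn build_cfg cfg.ce | dot -Tpng -o cfg.png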

var shop = use("internal/shop")

var pad_right = function(s, w) {
    var r = s
    while (length(r) < w) {
        r = r + " "
    }
    return r
}

var fmt_val = function(v) {
    if (is_null(v)) return "null"
    if (is_number(v)) return text(v)
    if (is_text(v)) return `"${v}"`
    if (is_object(v)) return text(v)
    if (is_logical(v)) return v ? "true" : "false"
    return text(v)
}

var is_jump_op = function(op) {
    return op == "jump" || op == "jump_true" || op == "jump_false" || op == "jump_null" || op == "jump_not_null"
}

var is_conditional_jump = function(op) {
    return op == "jump_true" || op == "jump_false" || op == "jump_null" || op == "jump_not_null"
}

var is_terminator = function(op) {
    return op == "return" || op == "disrupt" || op == "tail_invoke" || op == "goinvoke"
}

var run = function() {
    var filename = null
    var fn_filter = null
    var show_dot = false
    var use_optimized = false
    var i = 0
    var compiled = null
    var main_name = null
    var fi = 0
    var func = null
    var fname = null

    while (i < length(args)) {
        if (args[i] == '--fn') {
            i = i + 1
            fn_filter = args[i]
        } else if (args[i] == '--dot') {
            show_dot = true
        } else if (args[i] == '--optimized') {
            use_optimized = true
        } else if (args[i] == '--help' || args[i] == '-h') {
            log.console("Usage: cell cfg [--fn <N|name>] [--dot] [--optimized] <file>")
            log.console("")
            log.console("  --fn <N|name>  Filter to function by index or name")
            log.console("  --dot          Output DOT format for graphviz")
            log.console("  --optimized    Use optimized IR")
            return null
        } else if (!starts_with(args[i], '-')) {
            filename = args[i]
        }
        i = i + 1
    }

    if (!filename) {
        log.console("Usage: cell cfg [--fn <N|name>] [--dot] [--optimized] <file>")
        return null
    }

    if (use_optimized) {
        compiled = shop.compile_file(filename)
    } else {
        compiled = shop.mcode_file(filename)
    }

    var fn_matches = function(index, name) {
        var match = null
        if (fn_filter == null) return true
        if (index >= 0 && fn_filter == text(index)) return true
        if (name != null) {
            match = search(name, fn_filter)
            if (match != null && match >= 0) return true
        }
        return false
    }

    var build_cfg = function(func) {
        var instrs = func.instructions
        var blocks = []
        var label_to_block = {}
        var pc_to_block = {}
        var label_to_pc = {}
        var block_start_pcs = {}
        var after_terminator = false
        var current_block = null
        var current_label = null
        var pc = 0
        var ii = 0
        var bi = 0
        var instr = null
        var op = null
        var n = 0
        var line_num = null
        var blk = null
        var last_instr_data = null
        var last_op = null
        var target_label = null
        var target_bi = null
        var edge_type = null

        if (instrs == null || length(instrs) == 0) return []

        // Pass 1: identify block start PCs
        block_start_pcs["0"] = true
        pc = 0
        ii = 0
        while (ii < length(instrs)) {
            instr = instrs[ii]
            if (is_array(instr)) {
                op = instr[0]
                if (after_terminator) {
                    block_start_pcs[text(pc)] = true
                    after_terminator = false
                }
                if (is_jump_op(op) || is_terminator(op)) {
                    after_terminator = true
                }
                pc = pc + 1
            }
            ii = ii + 1
        }

        // Pass 2: map labels to PCs and mark as block starts
        pc = 0
        ii = 0
        while (ii < length(instrs)) {
            instr = instrs[ii]
            if (is_text(instr) && !starts_with(instr, "_nop_")) {
                label_to_pc[instr] = pc
                block_start_pcs[text(pc)] = true
            } else if (is_array(instr)) {
                pc = pc + 1
            }
            ii = ii + 1
        }

        // Pass 3: build basic blocks
        pc = 0
        ii = 0
        current_label = null
        while (ii < length(instrs)) {
            instr = instrs[ii]
            if (is_text(instr)) {
                if (!starts_with(instr, "_nop_")) {
                    current_label = instr
                }
                ii = ii + 1
                continue
            }

            if (is_array(instr)) {
                if (block_start_pcs[text(pc)]) {
                    if (current_block != null) {
                        blocks[] = current_block
                    }
                    current_block = {
                        id: length(blocks),
                        label: current_label,
                        start_pc: pc,
                        end_pc: pc,
                        instrs: [],
                        edges: [],
                        first_line: null,
                        last_line: null
                    }
                    current_label = null
                }

                if (current_block != null) {
                    current_block.instrs[] = {pc: pc, instr: instr}
                    current_block.end_pc = pc
                    n = length(instr)
                    line_num = instr[n - 2]
                    if (line_num != null) {
                        if (current_block.first_line == null) {
                            current_block.first_line = line_num
                        }
                        current_block.last_line = line_num
                    }
                }
                pc = pc + 1
            }
            ii = ii + 1
        }
        if (current_block != null) {
            blocks[] = current_block
        }

        // Build block index
        bi = 0
        while (bi < length(blocks)) {
            pc_to_block[text(blocks[bi].start_pc)] = bi
            if (blocks[bi].label != null) {
                label_to_block[blocks[bi].label] = bi
            }
            bi = bi + 1
        }

        // Pass 4: compute edges
        bi = 0
        while (bi < length(blocks)) {
            blk = blocks[bi]
            if (length(blk.instrs) > 0) {
                last_instr_data = blk.instrs[length(blk.instrs) - 1]
                last_op = last_instr_data.instr[0]
                n = length(last_instr_data.instr)

                if (is_jump_op(last_op)) {
                    if (last_op == "jump") {
                        target_label = last_instr_data.instr[1]
                    } else {
                        target_label = last_instr_data.instr[2]
                    }

                    target_bi = label_to_block[target_label]
                    if (target_bi != null) {
                        edge_type = "jump"
                        if (target_bi <= bi) {
                            edge_type = "loop back-edge"
                        }
                        blk.edges[] = {target: target_bi, kind: edge_type}
                    }

                    if (is_conditional_jump(last_op)) {
                        if (bi + 1 < length(blocks)) {
                            blk.edges[] = {target: bi + 1, kind: "fallthrough"}
                        }
                    }
                } else if (is_terminator(last_op)) {
                    blk.edges[] = {target: -1, kind: "EXIT (" + last_op + ")"}
                } else {
                    if (bi + 1 < length(blocks)) {
                        blk.edges[] = {target: bi + 1, kind: "fallthrough"}
                    }
                }
            }
            bi = bi + 1
        }

        return blocks
    }

    var print_cfg_text = function(blocks, name) {
        var bi = 0
        var blk = null
        var header = null
        var ii = 0
        var idata = null
        var instr = null
        var op = null
        var n = 0
        var parts = null
        var j = 0
        var operands = null
        var ei = 0
        var edge = null
        var target_label = null

        log.compile(`\n=== ${name} ===`)

        if (length(blocks) == 0) {
            log.compile("  (empty)")
            return null
        }

        bi = 0
        while (bi < length(blocks)) {
            blk = blocks[bi]
            header = `  B${text(bi)}`
            if (blk.label != null) {
                header = header + ` "${blk.label}"`
            }
            header = header + ` [pc ${text(blk.start_pc)}-${text(blk.end_pc)}`
            if (blk.first_line != null) {
                if (blk.first_line == blk.last_line) {
                    header = header + `, line ${text(blk.first_line)}`
                } else {
                    header = header + `, lines ${text(blk.first_line)}-${text(blk.last_line)}`
                }
            }
            header = header + "]:"

            log.compile(header)

            ii = 0
            while (ii < length(blk.instrs)) {
                idata = blk.instrs[ii]
                instr = idata.instr
                op = instr[0]
                n = length(instr)
                parts = []
                j = 1
                while (j < n - 2) {
                    parts[] = fmt_val(instr[j])
                    j = j + 1
                }
                operands = text(parts, ", ")
                log.compile(`    ${pad_right(text(idata.pc), 6)}${pad_right(op, 15)}${operands}`)
                ii = ii + 1
            }

            ei = 0
            while (ei < length(blk.edges)) {
                edge = blk.edges[ei]
                if (edge.target == -1) {
                    log.compile(`    -> ${edge.kind}`)
                } else {
                    target_label = blocks[edge.target].label
                    if (target_label != null) {
                        log.compile(`    -> B${text(edge.target)} "${target_label}" (${edge.kind})`)
                    } else {
                        log.compile(`    -> B${text(edge.target)} (${edge.kind})`)
                    }
                }
                ei = ei + 1
            }

            log.compile("")
            bi = bi + 1
        }
        return null
    }

    var print_cfg_dot = function(blocks, name) {
        var safe_name = replace(replace(name, '"', '\\"'), ' ', '_')
        var bi = 0
        var blk = null
        var label_text = null
        var ii = 0
        var idata = null
        var instr = null
        var op = null
        var n = 0
        var parts = null
        var j = 0
        var operands = null
        var ei = 0
        var edge = null
        var style = null

        log.compile(`digraph "${safe_name}" {`)
        log.compile("    rankdir=TB;")
        log.compile("    node [shape=record, fontname=monospace, fontsize=10];")

        bi = 0
        while (bi < length(blocks)) {
            blk = blocks[bi]
            label_text = "B" + text(bi)
            if (blk.label != null) {
                label_text = label_text + " (" + blk.label + ")"
            }
            label_text = label_text + "\\npc " + text(blk.start_pc) + "-" + text(blk.end_pc)
            if (blk.first_line != null) {
                label_text = label_text + "\\nline " + text(blk.first_line)
            }
            label_text = label_text + "|"

            ii = 0
            while (ii < length(blk.instrs)) {
                idata = blk.instrs[ii]
                instr = idata.instr
                op = instr[0]
                n = length(instr)
                parts = []
                j = 1
                while (j < n - 2) {
                    parts[] = fmt_val(instr[j])
                    j = j + 1
                }
                operands = text(parts, ", ")
                label_text = label_text + text(idata.pc) + " " + op + " " + replace(operands, '"', '\\"') + "\\l"
                ii = ii + 1
            }

            log.compile("    B" + text(bi) + " [label=\"{" + label_text + "}\"];")
            bi = bi + 1
        }

        // Edges
        bi = 0
        while (bi < length(blocks)) {
            blk = blocks[bi]
            ei = 0
            while (ei < length(blk.edges)) {
                edge = blk.edges[ei]
                if (edge.target >= 0) {
                    style = ""
                    if (edge.kind == "loop back-edge") {
                        style = " [style=bold, color=red, label=\"loop\"]"
                    } else if (edge.kind == "fallthrough") {
                        style = " [style=dashed]"
                    }
                    log.compile(`    B${text(bi)} -> B${text(edge.target)}${style};`)
                }
                ei = ei + 1
            }
            bi = bi + 1
        }

        log.compile("}")
        return null
    }

    var process_function = function(func, name, index) {
        var blocks = build_cfg(func)
        if (show_dot) {
            print_cfg_dot(blocks, name)
        } else {
            print_cfg_text(blocks, name)
        }
        return null
    }

    // Process functions
    main_name = compiled.name != null ? compiled.name : "<main>"

    if (compiled.main != null) {
        if (fn_matches(-1, main_name)) {
            process_function(compiled.main, main_name, -1)
        }
    }

    if (compiled.functions != null) {
        fi = 0
        while (fi < length(compiled.functions)) {
            func = compiled.functions[fi]
            fname = func.name != null ? func.name : "<anonymous>"
            if (fn_matches(fi, fname)) {
                process_function(func, `[${text(fi)}] ${fname}`, fi)
            }
            fi = fi + 1
        }
    }

    return null
}

run()
$stop()
197
clean.ce
Normal file
@@ -0,0 +1,197 @@
// cell clean [<scope>] - Remove cached material to force refetch/rebuild
//
// Usage:
//   cell clean            Clean build outputs for current directory package
//   cell clean .          Clean build outputs for current directory package
//   cell clean <locator>  Clean build outputs for specific package
//   cell clean shop       Clean entire shop
//   cell clean world      Clean all world packages
//
// Options:
//   --build    Remove build outputs only (default)
//   --fetch    Remove fetched sources only
//   --all      Remove both build outputs and fetched sources
//   --deep     Apply to full dependency closure
//   --dry-run  Show what would be deleted

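// Example (illustrative): preview everything a deep clean of the current
// package and its dependency closure would delete, without removing anything:
//   cell clean . --all --deep --dry-run
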
var shop = use('internal/shop')
var pkg = use('package')
var fd = use('fd')

var scope = null
var clean_build = false
var clean_fetch = false
var deep = false
var dry_run = false
var i = 0
var deps = null

var run = function() {
    for (i = 0; i < length(args); i++) {
        if (args[i] == '--build') {
            clean_build = true
        } else if (args[i] == '--fetch') {
            clean_fetch = true
        } else if (args[i] == '--all') {
            clean_build = true
            clean_fetch = true
        } else if (args[i] == '--deep') {
            deep = true
        } else if (args[i] == '--dry-run') {
            dry_run = true
        } else if (args[i] == '--help' || args[i] == '-h') {
            log.console("Usage: cell clean [<scope>] [options]")
            log.console("")
            log.console("Remove cached material to force refetch/rebuild.")
            log.console("")
            log.console("Scopes:")
            log.console("  <locator>  Clean specific package")
            log.console("  shop       Clean entire shop")
            log.console("  world      Clean all world packages")
            log.console("")
            log.console("Options:")
            log.console("  --build    Remove build outputs only (default)")
            log.console("  --fetch    Remove fetched sources only")
            log.console("  --all      Remove both build outputs and fetched sources")
            log.console("  --deep     Apply to full dependency closure")
            log.console("  --dry-run  Show what would be deleted")
            return
        } else if (!starts_with(args[i], '-')) {
            scope = args[i]
        }
    }

    // Default to --build if nothing specified
    if (!clean_build && !clean_fetch) {
        clean_build = true
    }

    // Default scope to current directory
    if (!scope) {
        scope = '.'
    }

    // Resolve local paths for single package scope
    var is_shop_scope = (scope == 'shop')
    var is_world_scope = (scope == 'world')

    if (!is_shop_scope && !is_world_scope) {
        scope = shop.resolve_locator(scope)
    }

    var files_to_delete = []
    var dirs_to_delete = []

    // Gather packages to clean
    var packages_to_clean = []
    var _gather = null

    if (is_shop_scope) {
        packages_to_clean = shop.list_packages()
    } else if (is_world_scope) {
        // For now, world is the same as shop
        packages_to_clean = shop.list_packages()
    } else {
        // Single package
        packages_to_clean[] = scope

        if (deep) {
            _gather = function() {
                deps = pkg.gather_dependencies(scope)
                arrfor(deps, function(dep) {
                    packages_to_clean[] = dep
                })
            } disruption {
                // Skip if can't read dependencies
            }
            _gather()
        }
    }

    // Gather files to clean
    var lib_dir = shop.get_lib_dir()
    var build_dir = shop.get_build_dir()
    var packages_dir = replace(shop.get_package_dir(''), /\/$/, '') // Get base packages dir

    if (clean_build) {
        // Nuke entire build cache (content-addressed, per-package clean impractical)
        if (fd.is_dir(build_dir)) {
            dirs_to_delete[] = build_dir
        }
        // Clean orphaned lib/ directory if it exists (legacy)
        if (fd.is_dir(lib_dir)) {
            dirs_to_delete[] = lib_dir
        }
    }

    if (clean_fetch) {
        if (is_shop_scope) {
            // Clean entire packages directory (dangerous!)
            if (fd.is_dir(packages_dir)) {
                dirs_to_delete[] = packages_dir
            }
        } else {
            // Clean specific package directories
            arrfor(packages_to_clean, function(p) {
                if (p == 'core') return

                var pkg_dir = shop.get_package_dir(p)
                if (fd.is_dir(pkg_dir) || fd.is_link(pkg_dir)) {
                    dirs_to_delete[] = pkg_dir
                }
            })
        }
    }

    // Execute or report
    var deleted_count = 0
    if (dry_run) {
        log.console("Would delete:")
        if (length(files_to_delete) == 0 && length(dirs_to_delete) == 0) {
            log.console("  (nothing to clean)")
        } else {
            arrfor(files_to_delete, function(f) {
                log.console("  [file] " + f)
            })
            arrfor(dirs_to_delete, function(d) {
                log.console("  [dir]  " + d)
            })
        }
    } else {
        arrfor(files_to_delete, function(f) {
            var _del = function() {
                fd.unlink(f)
                log.console("Deleted: " + f)
                deleted_count++
            } disruption {
                log.error("Failed to delete " + f)
            }
            _del()
        })

        arrfor(dirs_to_delete, function(d) {
            var _del = function() {
                if (fd.is_link(d)) {
                    fd.unlink(d)
                } else {
                    fd.rmdir(d, 1) // recursive
                }
                log.console("Deleted: " + d)
                deleted_count++
            } disruption {
                log.error("Failed to delete " + d)
            }
            _del()
        })

        if (deleted_count == 0) {
            log.console("Nothing to clean.")
        } else {
            log.console("")
            log.console("Clean complete: " + text(deleted_count) + " item(s) deleted.")
        }
    }
}
run()

$stop()
71
clone.ce
Normal file
@@ -0,0 +1,71 @@
// cell clone <origin> <path>
// Clones a cell package <origin> to the local <path>, and links it.

var shop = use('internal/shop')
var link = use('link')
var fd = use('fd')
var http = use('http')

var run = function() {
    if (length(args) < 2) {
        log.console("Usage: cell clone <origin> <path>")
        log.console("Clones a cell package to a local path and links it.")
        return
    }

    var origin = args[0]
    var target_path = args[1]

    // Resolve target path to absolute
    target_path = shop.resolve_locator(target_path)

    // Check if target already exists
    if (fd.is_dir(target_path)) {
        log.console("Error: " + target_path + " already exists")
        return
    }

    log.console("Cloning " + origin + " to " + target_path + "...")

    // Get the latest commit
    var info = shop.resolve_package_info(origin)
    if (!info || info == 'local') {
        log.console("Error: " + origin + " is not a remote package")
        return
    }

    // Update to get the commit hash
    var update_result = shop.update(origin)
    if (!update_result) {
        log.console("Error: Could not fetch " + origin)
        return
    }

    // Fetch and extract to the target path
    var lock = shop.load_lock()
    var entry = lock[origin]
    if (!entry || !entry.commit) {
        log.console("Error: No commit found for " + origin)
        return
    }

    var download_url = shop.get_download_url(origin, entry.commit)
    log.console("Downloading from " + download_url)

    var _clone = function() {
        var zip_blob = http.fetch(download_url)
        shop.install_zip(zip_blob, target_path)

        log.console("Extracted to " + target_path)

        // Link the origin to the cloned path
        link.add(origin, target_path, shop)
        log.console("Linked " + origin + " -> " + target_path)
    } disruption {
        log.console("Error during clone")
    }
    _clone()
}
run()

$stop()
130
compare_aot.ce
Normal file
@@ -0,0 +1,130 @@
// compare_aot.ce — compile a .ce/.cm file via both paths and compare results
//
// Usage:
//   cell --dev compare_aot.ce <file.ce>

var build = use('build')
var fd_mod = use('fd')
var os = use('internal/os')
var json = use('json')
var time = use('time')

var show = function(v) {
    if (v == null) return "null"
    return json.encode(v)
}

if (length(args) < 1) {
    log.compile('usage: cell --dev compare_aot.ce <file>')
    return
}

var file = args[0]
if (!fd_mod.is_file(file)) {
    if (!ends_with(file, '.ce') && fd_mod.is_file(file + '.ce'))
        file = file + '.ce'
    else if (!ends_with(file, '.cm') && fd_mod.is_file(file + '.cm'))
        file = file + '.cm'
    else {
        log.error('file not found: ' + file)
        return
    }
}

var abs = fd_mod.realpath(file)

// Shared compilation front-end — uses raw modules for per-stage timing
var tokenize = use('tokenize')
var parse_mod = use('parse')
var fold = use('fold')
var mcode_mod = use('mcode')
var streamline_mod = use('streamline')

var t0 = time.number()
var src = text(fd_mod.slurp(abs))
var t1 = time.number()
var tok = tokenize(src, abs)
var t2 = time.number()
var ast = parse_mod(tok.tokens, src, abs, tokenize)
var t3 = time.number()
var folded = fold(ast)
var t4 = time.number()
var compiled = mcode_mod(folded)
var t5 = time.number()
var optimized = streamline_mod(compiled)
var t6 = time.number()

log.compile('--- front-end timing ---')
log.compile('  read:       ' + text(t1 - t0) + 's')
log.compile('  tokenize:   ' + text(t2 - t1) + 's')
log.compile('  parse:      ' + text(t3 - t2) + 's')
log.compile('  fold:       ' + text(t4 - t3) + 's')
log.compile('  mcode:      ' + text(t5 - t4) + 's')
log.compile('  streamline: ' + text(t6 - t5) + 's')
log.compile('  total:      ' + text(t6 - t0) + 's')

// Shared env for both paths — only non-intrinsic runtime functions.
// Intrinsics (starts_with, ends_with, logical, some, every, etc.) live on
// the stoned global and are found via GETINTRINSIC/cell_rt_get_intrinsic.
var env = stone({
    log: log,
    fallback: fallback,
    parallel: parallel,
    race: race,
    sequence: sequence,
    use
})

// --- Interpreted (mach VM) ---
var result_interp = null
var interp_ok = false
var run_interp = function() {
    log.compile('--- interpreted ---')
    var mcode_json = json.encode(optimized)
    var mach_blob = mach_compile_mcode_bin(abs, mcode_json)
    result_interp = mach_load(mach_blob, env)
    interp_ok = true
    log.compile('result: ' + show(result_interp))
} disruption {
    interp_ok = true
    log.compile('(disruption escaped from interpreted run)')
}
run_interp()

// --- Native (AOT via QBE) ---
var result_native = null
var native_ok = false
var run_native = function() {
    log.compile('\n--- native ---')
    var dylib_path = build.compile_native_ir(optimized, abs, null)
    log.compile('dylib: ' + dylib_path)
    var handle = os.dylib_open(dylib_path)
    if (!handle) {
        log.error('failed to open dylib')
        return
    }
    result_native = os.native_module_load(handle, env)
    native_ok = true
    log.compile('result: ' + show(result_native))
} disruption {
    native_ok = true
    log.compile('(disruption escaped from native run)')
}
run_native()

// --- Comparison ---
log.compile('\n--- comparison ---')
var s_interp = show(result_interp)
var s_native = show(result_native)
if (interp_ok && native_ok) {
    if (s_interp == s_native) {
        log.compile('MATCH')
    } else {
        log.error('MISMATCH')
        log.error('  interp: ' + s_interp)
        log.error('  native: ' + s_native)
    }
} else {
    if (!interp_ok) log.error('interpreted run failed')
    if (!native_ok) log.error('native run failed')
}
27
compile.ce
Normal file
@@ -0,0 +1,27 @@
// compile.ce — compile a .cm or .ce file to native .dylib via QBE
//
// Usage:
//   cell compile <file.cm|file.ce>
//
// Installs the dylib to .cell/lib/<pkg>/<stem>.dylib

var shop = use('internal/shop')
var build = use('build')
var fd = use('fd')

if (length(args) < 1) {
    log.compile('usage: cell compile <file.cm|file.ce>')
    return
}

var file = args[0]
if (!fd.is_file(file)) {
    log.error('file not found: ' + file)
    return
}

var abs = fd.realpath(file)
var file_info = shop.file_info(abs)
var pkg = file_info.package

build.compile_native(abs, null, null, pkg)
40
compile_worker.ce
Normal file
@@ -0,0 +1,40 @@
// compile_worker - Worker actor that compiles a single module and replies
//
// Receives a message with:
//   {type: 'script', path, package}         — bytecode compile
//   {type: 'native_script', path, package}  — native compile
//   {type: 'c_package', package}            — C package build
//   {type: 'c_file', package, file}         — single C module build
//
// Replies with {ok: true/false, path} and stops.

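// Example message (illustrative; the path and package names are hypothetical):
//   {type: 'native_script', path: '/src/foo/main.ce', package: 'foo'}
// The worker logs progress, replies to the sender with
//   {ok: true, path: '/src/foo/main.ce'}
// (or {ok: false, error: 'compile failed'} on disruption), then stops.
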
var shop = use('internal/shop')
|
||||
var build = use('build')
|
||||
|
||||
$receiver(function(msg) {
|
||||
var name = msg.path || (msg.file ? msg.package + '/' + msg.file : msg.package)
|
||||
var _work = function() {
|
||||
if (msg.type == 'script') {
|
||||
log.console('compile_worker: compiling ' + name)
|
||||
shop.precompile(msg.path, msg.package)
|
||||
} else if (msg.type == 'native_script') {
|
||||
log.console('compile_worker: native compiling ' + name)
|
||||
build.compile_native(msg.path, null, null, msg.package)
|
||||
} else if (msg.type == 'c_package') {
|
||||
log.console('compile_worker: building package ' + name)
|
||||
build.build_dynamic(msg.package, null, null, null)
|
||||
} else if (msg.type == 'c_file') {
|
||||
log.console('compile_worker: building ' + name)
|
||||
build.compile_c_module(msg.package, msg.file)
|
||||
}
|
||||
log.console('compile_worker: done ' + name)
|
||||
send(msg, {ok: true, path: name})
|
||||
} disruption {
|
||||
log.error('compile_worker: failed ' + name)
|
||||
send(msg, {ok: false, error: 'compile failed'})
|
||||
}
|
||||
_work()
|
||||
$stop()
|
||||
})
|
||||
|
||||
var _t = $delay($stop, 120)
|
||||
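// Hypothetical caller sketch: the 'spawn' helper and the callback reply form
// below are illustrative assumptions, not APIs confirmed by this change.
//   var w = spawn('compile_worker')
//   send(w, {type: 'script', path: 'tests/suite.cm', package: 'mypkg'},
//     function(reply) { if (!reply.ok) log.error('compile failed') })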
222
config.ce
Normal file
@@ -0,0 +1,222 @@
// cell config - Manage system and actor configurations

var toml = use('toml')
var pkg = use('package')

function print_help() {
  log.console("Usage: cell config <command> [options]")
  log.console("")
  log.console("Commands:")
  log.console("  get <key>                      Get a configuration value")
  log.console("  set <key> <value>              Set a configuration value")
  log.console("  list                           List all configurations")
  log.console("  actor <name> get <key>         Get actor-specific config")
  log.console("  actor <name> set <key> <val>   Set actor-specific config")
  log.console("  actor <name> list              List actor configurations")
  log.console("")
  log.console("Examples:")
  log.console("  cell config get system.ar_timer")
  log.console("  cell config set system.net_service 0.2")
  log.console("  cell config actor prosperon/_sdl_video set resolution 1920x1080")
  log.console("  cell config actor extramath/spline set precision high")
  log.console("")
  log.console("System keys:")
  log.console("  system.ar_timer      - Seconds before idle actor reclamation")
  log.console("  system.actor_memory  - MB of memory an actor can use (0=unbounded)")
  log.console("  system.net_service   - Seconds per network service pull")
  log.console("  system.reply_timeout - Seconds to hold callback for replies (0=unbounded)")
  log.console("  system.actor_max     - Max number of simultaneous actors")
  log.console("  system.stack_max     - MB of memory each actor's stack can grow to")
}

// Parse a dot-notation key into path segments
function parse_key(key) {
  return array(key, '.')
}

// Get a value from nested object using path
function get_nested(obj, path) {
  var current = obj
  arrfor(path, function(segment) {
    // A non-object value mid-path means the key does not exist: settle on
    // null instead of leaking the intermediate leaf value
    if (is_null(current) || !is_object(current)) {
      current = null
      return null
    }
    current = current[segment]
  })
  return current
}

// Set a value in nested object using path
function set_nested(obj, path, value) {
  var current = obj
  var i = 0
  var segment = null
  for (i = 0; i < length(path) - 1; i++) {
    segment = path[i]
    if (is_null(current[segment]) || !is_object(current[segment])) {
      current[segment] = {}
    }
    current = current[segment]
  }
  current[path[length(path) - 1]] = value
}

// Parse value string into appropriate type
function parse_value(str) {
  var num_str = null
  var n = null
  // Boolean
  if (str == 'true') return true
  if (str == 'false') return false

  // Number
  num_str = replace(str, /_/g, '')
  n = number(num_str)
  if (n != null) return n

  // String
  return str
}
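
// Illustrative round-trip of the helpers above (values follow this file's
// own logic; the snippet is a sketch, not executed code):
//   var cfg = {}
//   set_nested(cfg, parse_key('system.ar_timer'), 30)  // cfg -> {system: {ar_timer: 30}}
//   get_nested(cfg, parse_key('system.ar_timer'))      // -> 30
//   get_nested(cfg, parse_key('system.missing'))       // -> null
//   parse_value('1_000')                               // -> 1000 (underscores stripped)
//   parse_value('yes')                                 // -> "yes" (falls through to string)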

// Format value for display
function format_value(val) {
  if (is_text(val)) return '"' + val + '"'
  return text(val)
}

// Print configuration tree recursively
function print_config(obj, pfx) {
  var p = pfx || ''
  arrfor(array(obj), function(key) {
    var val = obj[key]
    var full_key = p ? p + '.' + key : key

    if (is_object(val))
      print_config(val, full_key)
    else if (!is_null(val))
      log.console(full_key + ' = ' + format_value(val))
  })
}

// Main command handling
if (length(args) == 0) {
  print_help()
  $stop()
}

var config = pkg.load_config()
if (!config) {
  log.error("Failed to load cell.toml")
  $stop()
}

var command = args[0]
var key = null
var path = null
var value = null
var value_str = null
var valid_system_keys = null
var actor_name = null
var actor_cmd = null

if (command == 'help' || command == '-h' || command == '--help') {
  print_help()
} else if (command == 'list') {
  log.console("# Cell Configuration")
  log.console("")
  print_config(config)
} else if (command == 'get') {
  if (length(args) < 2) {
    log.error("Usage: cell config get <key>")
    $stop()
  }
  key = args[1]
  path = parse_key(key)
  value = get_nested(config, path)

  if (value == null) {
    log.error("Key not found: " + key)
  } else if (is_object(value)) {
    print_config(value, key)
  } else {
    log.console(key + ' = ' + format_value(value))
  }
} else if (command == 'set') {
  if (length(args) < 3) {
    log.error("Usage: cell config set <key> <value>")
    $stop()
  }
  key = args[1]
  value_str = args[2]
  path = parse_key(key)
  value = parse_value(value_str)

  if (path[0] == 'system') {
    valid_system_keys = [
      'ar_timer', 'actor_memory', 'net_service',
      'reply_timeout', 'actor_max', 'stack_max'
    ]
    if (find(valid_system_keys, path[1]) == null) {
      log.error("Invalid system key. Valid keys: " + text(valid_system_keys, ', '))
      $stop()
    }
  }

  set_nested(config, path, value)
  pkg.save_config(config)
  log.console("Set " + key + " = " + format_value(value))
} else if (command == 'actor') {
  if (length(args) < 3) {
    log.error("Usage: cell config actor <name> <command> [options]")
    $stop()
  }

  actor_name = args[1]
  actor_cmd = args[2]

  config.actors = config.actors || {}
  config.actors[actor_name] = config.actors[actor_name] || {}

  if (actor_cmd == 'list') {
    if (length(array(config.actors[actor_name])) == 0) {
      log.console("No configuration for actor: " + actor_name)
    } else {
      log.console("# Configuration for actor: " + actor_name)
      log.console("")
      print_config(config.actors[actor_name], 'actors.' + actor_name)
    }
  } else if (actor_cmd == 'get') {
    if (length(args) < 4) {
      log.error("Usage: cell config actor <name> get <key>")
      $stop()
    }
    key = args[3]
    path = parse_key(key)
    value = get_nested(config.actors[actor_name], path)

    if (value == null) {
      log.error("Key not found for actor " + actor_name + ": " + key)
    } else {
      log.console('actors.' + actor_name + '.' + key + ' = ' + format_value(value))
    }
  } else if (actor_cmd == 'set') {
    if (length(args) < 5) {
      log.error("Usage: cell config actor <name> set <key> <value>")
      $stop()
    }
    key = args[3]
    value_str = args[4]
    path = parse_key(key)
    value = parse_value(value_str)

    set_nested(config.actors[actor_name], path, value)
    pkg.save_config(config)
    log.console("Set actors." + actor_name + "." + key + " = " + format_value(value))
  } else {
    log.error("Unknown actor command: " + actor_cmd)
    log.console("Valid commands: list, get, set")
  }
} else {
  log.error("Unknown command: " + command)
  print_help()
}

$stop()
29
debug/debug.c
Normal file
@@ -0,0 +1,29 @@
#include "cell.h"

// TODO: Reimplement stack depth for register VM
JSC_CCALL(debug_stack_depth, return number2js(js, 0))

// TODO: Reimplement debug introspection for register VM
JSC_CCALL(debug_build_backtrace, return JS_NewArray(js))
JSC_CCALL(debug_closure_vars, return JS_NewObject(js))
JSC_CCALL(debug_set_closure_var, return JS_NULL;)
JSC_CCALL(debug_local_vars, return JS_NewObject(js))
JSC_CCALL(debug_fn_info, return JS_NewObject(js))
JSC_CCALL(debug_backtrace_fns, return JS_NewArray(js))

static const JSCFunctionListEntry js_debug_funcs[] = {
  MIST_FUNC_DEF(debug, stack_depth, 0),
  MIST_FUNC_DEF(debug, build_backtrace, 0),
  MIST_FUNC_DEF(debug, closure_vars, 1),
  MIST_FUNC_DEF(debug, set_closure_var, 3),
  MIST_FUNC_DEF(debug, local_vars, 1),
  MIST_FUNC_DEF(debug, fn_info, 1),
  MIST_FUNC_DEF(debug, backtrace_fns, 0),
};

JSValue js_core_debug_use(JSContext *js) {
  JS_FRAME(js);
  JS_ROOT(mod, JS_NewObject(js));
  JS_SetPropertyFunctionList(js, mod.val, js_debug_funcs, countof(js_debug_funcs));
  JS_RETURN(mod.val);
}
29
debug/js.c
Normal file
@@ -0,0 +1,29 @@
#include "cell.h"
#include "pit_internal.h"

JSC_CCALL(os_mem_limit, JS_SetMemoryLimit(JS_GetRuntime(js), js2number(js, argv[0])))
JSC_CCALL(os_max_stacksize, JS_SetMaxStackSize(js, js2number(js, argv[0])))

// TODO: Reimplement memory usage reporting for new allocator
JSC_CCALL(os_calc_mem,
  ret = JS_NewObject(js);
)

// TODO: Reimplement for register VM
JSC_CCALL(js_disassemble, return JS_NewArray(js);)
JSC_CCALL(js_fn_info, return JS_NewObject(js);)

static const JSCFunctionListEntry js_js_funcs[] = {
  MIST_FUNC_DEF(os, calc_mem, 0),
  MIST_FUNC_DEF(os, mem_limit, 1),
  MIST_FUNC_DEF(os, max_stacksize, 1),
  MIST_FUNC_DEF(js, disassemble, 1),
  MIST_FUNC_DEF(js, fn_info, 1),
};

JSValue js_core_js_use(JSContext *js) {
  JS_FRAME(js);
  JS_ROOT(mod, JS_NewObject(js));
  JS_SetPropertyFunctionList(js, mod.val, js_js_funcs, countof(js_js_funcs));
  JS_RETURN(mod.val);
}
223
diff.ce
Normal file
@@ -0,0 +1,223 @@
// diff.ce — differential testing: run tests optimized vs unoptimized, compare results
//
// Usage:
//   cell diff            - diff all test files in current package
//   cell diff suite      - diff a specific test file (tests/suite.cm)
//   cell diff tests/foo  - diff a specific test file by path
var shop = use('internal/shop')
var pkg = use('package')
var fd = use('fd')
var time = use('time')
var testlib = use('internal/testlib')

var _args = args == null ? [] : args

var analyze = use('internal/os').analyze
var run_ast_fn = use('internal/os').run_ast_fn
var run_ast_noopt_fn = use('internal/os').run_ast_noopt_fn

if (!run_ast_noopt_fn) {
  log.console("error: run_ast_noopt_fn not available (rebuild bootstrap)")
  $stop()
  return
}

// Parse arguments: diff [test_path]
var target_test = null
if (length(_args) > 0) {
  target_test = _args[0]
}

var is_valid_package = testlib.is_valid_package

if (!is_valid_package('.')) {
  log.console('No cell.toml found in current directory')
  $stop()
  return
}

// Collect test files
function collect_tests(specific_test) {
  var files = pkg.list_files(null)
  var test_files = []
  var i = 0
  var f = null
  var test_name = null
  var match_name = null
  var match_base = null
  for (i = 0; i < length(files); i++) {
    f = files[i]
    if (starts_with(f, "tests/") && ends_with(f, ".cm")) {
      if (specific_test) {
        test_name = text(f, 0, -3)
        match_name = specific_test
        if (!starts_with(match_name, 'tests/')) match_name = 'tests/' + match_name
        match_base = ends_with(match_name, '.cm') ? text(match_name, 0, -3) : match_name
        if (test_name != match_base) continue
      }
      test_files[] = f
    }
  }
  return test_files
}

var values_equal = testlib.values_equal
var describe = testlib.describe

// Run a single test file through both paths
function diff_test_file(file_path) {
  var mod_path = text(file_path, 0, -3)
  var src_path = fd.realpath('.') + '/' + file_path
  var src = null
  var ast = null
  var mod_opt = null
  var mod_noopt = null
  var results = {file: file_path, tests: [], passed: 0, failed: 0, errors: []}
  var use_pkg = fd.realpath('.')
  var opt_error = null
  var noopt_error = null
  var keys = null
  var i = 0
  var k = null
  var opt_result = null
  var noopt_result = null
  var opt_err = null
  var noopt_err = null
  var _run_one_opt = null
  var _run_one_noopt = null

  // Build env for module loading
  var make_env = function() {
    return stone({
      use: function(path) {
        return shop.use(path, use_pkg)
      }
    })
  }

  // Read and parse
  var _read = function() {
    src = text(fd.slurp(src_path))
    ast = analyze(src, src_path)
  } disruption {
    results.errors[] = `failed to parse ${file_path}`
    return results
  }
  _read()
  if (length(results.errors) > 0) return results

  // Run optimized
  var _run_opt = function() {
    mod_opt = run_ast_fn(mod_path, ast, make_env())
  } disruption {
    opt_error = "disrupted"
  }
  _run_opt()

  // Run unoptimized
  var _run_noopt = function() {
    mod_noopt = run_ast_noopt_fn(mod_path, ast, make_env())
  } disruption {
    noopt_error = "disrupted"
  }
  _run_noopt()

  // Compare module-level behavior
  if (opt_error != noopt_error) {
    results.errors[] = `module load mismatch: opt=${opt_error != null ? opt_error : "ok"} noopt=${noopt_error != null ? noopt_error : "ok"}`
    results.failed = results.failed + 1
    return results
  }
  if (opt_error != null) {
    // Both disrupted during load — that's consistent
    results.passed = results.passed + 1
    results.tests[] = {name: "<module>", status: "passed"}
    return results
  }

  // If module returns a record of functions, test each one
  if (is_object(mod_opt) && is_object(mod_noopt)) {
    keys = array(mod_opt)
    while (i < length(keys)) {
      k = keys[i]
      if (is_function(mod_opt[k]) && is_function(mod_noopt[k])) {
        opt_result = null
        noopt_result = null
        opt_err = null
        noopt_err = null

        _run_one_opt = function() {
          opt_result = mod_opt[k]()
        } disruption {
          opt_err = "disrupted"
        }
        _run_one_opt()

        _run_one_noopt = function() {
          noopt_result = mod_noopt[k]()
        } disruption {
          noopt_err = "disrupted"
        }
        _run_one_noopt()

        if (opt_err != noopt_err) {
          results.tests[] = {name: k, status: "failed"}
          results.errors[] = `${k}: disruption mismatch opt=${opt_err != null ? opt_err : "ok"} noopt=${noopt_err != null ? noopt_err : "ok"}`
          results.failed = results.failed + 1
        } else if (!values_equal(opt_result, noopt_result)) {
          results.tests[] = {name: k, status: "failed"}
          results.errors[] = `${k}: result mismatch opt=${describe(opt_result)} noopt=${describe(noopt_result)}`
          results.failed = results.failed + 1
        } else {
          results.tests[] = {name: k, status: "passed"}
          results.passed = results.passed + 1
        }
      }
      i = i + 1
    }
  } else {
    // Compare direct return values
    if (!values_equal(mod_opt, mod_noopt)) {
      results.tests[] = {name: "<return>", status: "failed"}
      results.errors[] = `return value mismatch: opt=${describe(mod_opt)} noopt=${describe(mod_noopt)}`
      results.failed = results.failed + 1
    } else {
      results.tests[] = {name: "<return>", status: "passed"}
      results.passed = results.passed + 1
    }
  }

  return results
}

// Main
var test_files = collect_tests(target_test)
log.console(`Differential testing: ${text(length(test_files))} file(s)`)

var total_passed = 0
var total_failed = 0
var i = 0
var result = null
var j = 0

while (i < length(test_files)) {
  result = diff_test_file(test_files[i])
  log.console(`  ${result.file}: ${text(result.passed)} passed, ${text(result.failed)} failed`)
  j = 0
  while (j < length(result.errors)) {
    log.console(`    MISMATCH: ${result.errors[j]}`)
    j = j + 1
  }
  total_passed = total_passed + result.passed
  total_failed = total_failed + result.failed
  i = i + 1
}

log.console(`----------------------------------------`)
log.console(`Diff: ${text(total_passed)} passed, ${text(total_failed)} failed, ${text(total_passed + total_failed)} total`)

if (total_failed > 0) {
  log.console(`DIFFERENTIAL FAILURES DETECTED`)
}

$stop()
310
diff_ir.ce
Normal file
@@ -0,0 +1,310 @@
// diff_ir.ce — mcode vs streamline diff
//
// Usage:
//   cell diff_ir <file>                Diff all functions
//   cell diff_ir --fn <N|name> <file>  Diff only one function
//   cell diff_ir --summary <file>      Counts only

var fd = use("fd")
var shop = use("internal/shop")

var pad_right = function(s, w) {
  var r = s
  while (length(r) < w) {
    r = r + " "
  }
  return r
}

var fmt_val = function(v) {
  if (is_null(v)) return "null"
  if (is_number(v)) return text(v)
  if (is_text(v)) return `"${v}"`
  if (is_object(v)) return text(v)
  if (is_logical(v)) return v ? "true" : "false"
  return text(v)
}

var run = function() {
  var fn_filter = null
  var show_summary = false
  var filename = null
  var i = 0
  var mcode_ir = null
  var opt_ir = null
  var source_text = null
  var source_lines = null
  var main_name = null
  var fi = 0
  var func = null
  var opt_func = null
  var fname = null

  while (i < length(args)) {
    if (args[i] == '--fn') {
      i = i + 1
      fn_filter = args[i]
    } else if (args[i] == '--summary') {
      show_summary = true
    } else if (args[i] == '--help' || args[i] == '-h') {
      log.console("Usage: cell diff_ir [--fn <N|name>] [--summary] <file>")
      log.console("")
      log.console("  --fn <N|name>  Filter to function by index or name")
      log.console("  --summary      Show counts only")
      return null
    } else if (!starts_with(args[i], '-')) {
      filename = args[i]
    }
    i = i + 1
  }

  if (!filename) {
    log.console("Usage: cell diff_ir [--fn <N|name>] [--summary] <file>")
    return null
  }

  mcode_ir = shop.mcode_file(filename)
  opt_ir = shop.compile_file(filename)

  source_text = text(fd.slurp(filename))
  source_lines = array(source_text, "\n")

  var get_source_line = function(line_num) {
    if (line_num < 1 || line_num > length(source_lines)) return null
    return source_lines[line_num - 1]
  }

  var fn_matches = function(index, name) {
    var match = null
    if (fn_filter == null) return true
    if (index >= 0 && fn_filter == text(index)) return true
    if (name != null) {
      match = search(name, fn_filter)
      if (match != null && match >= 0) return true
    }
    return false
  }

  var fmt_instr = function(instr) {
    var op = instr[0]
    var n = length(instr)
    var parts = []
    var j = 1
    var operands = null
    var line_str = null
    while (j < n - 2) {
      parts[] = fmt_val(instr[j])
      j = j + 1
    }
    operands = text(parts, ", ")
    line_str = instr[n - 2] != null ? `:${text(instr[n - 2])}` : ""
    return pad_right(`${pad_right(op, 15)}${operands}`, 45) + line_str
  }

  var classify = function(before, after) {
    var bn = 0
    var an = 0
    var k = 0
    if (is_text(after) && starts_with(after, "_nop_")) return "eliminated"
    if (is_array(before) && is_array(after)) {
      if (before[0] != after[0]) return "rewritten"
      bn = length(before)
      an = length(after)
      if (bn != an) return "rewritten"
      k = 1
      while (k < bn - 2) {
        if (before[k] != after[k]) return "rewritten"
        k = k + 1
      }
      return "identical"
    }
    return "identical"
  }
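
  // Illustrative behavior of classify, on hand-written instruction arrays
  // (per fmt_instr, the last two elements are the line number plus one
  // trailing slot); a sketch, not executed code:
  //   classify(["add", 1, 2, 10, 0], "_nop_1")             -> "eliminated"
  //   classify(["add", 1, 2, 10, 0], ["mul", 1, 2, 10, 0]) -> "rewritten"
  //   classify(["add", 1, 2, 10, 0], ["add", 1, 2, 10, 0]) -> "identical"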

  var total_eliminated = 0
  var total_rewritten = 0
  var total_funcs = 0

  var diff_function = function(mcode_func, opt_func, name, index) {
    var nr_args = mcode_func.nr_args != null ? mcode_func.nr_args : 0
    var nr_slots = mcode_func.nr_slots != null ? mcode_func.nr_slots : 0
    var m_instrs = mcode_func.instructions
    var o_instrs = opt_func.instructions
    var eliminated = 0
    var rewritten = 0
    var mi = 0
    var oi = 0
    var pc = 0
    var m_instr = null
    var o_instr = null
    var kind = null
    var last_line = null
    var instr_line = null
    var n = 0
    var src = null
    var annotation = null

    if (m_instrs == null) m_instrs = []
    if (o_instrs == null) o_instrs = []

    // First pass: count changes
    mi = 0
    oi = 0
    while (mi < length(m_instrs) && oi < length(o_instrs)) {
      m_instr = m_instrs[mi]
      o_instr = o_instrs[oi]

      if (is_text(m_instr)) {
        mi = mi + 1
        oi = oi + 1
        continue
      }

      if (is_text(o_instr) && starts_with(o_instr, "_nop_")) {
        if (is_array(m_instr)) {
          eliminated = eliminated + 1
        }
        mi = mi + 1
        oi = oi + 1
        continue
      }

      if (is_array(m_instr) && is_array(o_instr)) {
        kind = classify(m_instr, o_instr)
        if (kind == "rewritten") {
          rewritten = rewritten + 1
        }
      }
      mi = mi + 1
      oi = oi + 1
    }

    total_eliminated = total_eliminated + eliminated
    total_rewritten = total_rewritten + rewritten
    total_funcs = total_funcs + 1

    if (show_summary) {
      if (eliminated == 0 && rewritten == 0) {
        log.compile(`  ${pad_right(name + ":", 40)} 0 eliminated, 0 rewritten (unchanged)`)
      } else {
        log.compile(`  ${pad_right(name + ":", 40)} ${text(eliminated)} eliminated, ${text(rewritten)} rewritten`)
      }
      return null
    }

    if (eliminated == 0 && rewritten == 0) return null

    log.compile(`\n=== ${name} (args=${text(nr_args)}, slots=${text(nr_slots)}) ===`)
    log.compile(`  ${text(eliminated)} eliminated, ${text(rewritten)} rewritten`)

    // Second pass: show diffs
    mi = 0
    oi = 0
    pc = 0
    last_line = null
    while (mi < length(m_instrs) && oi < length(o_instrs)) {
      m_instr = m_instrs[mi]
      o_instr = o_instrs[oi]

      // Labels and nop markers on the mcode side carry nothing to diff
      if (is_text(m_instr)) {
        mi = mi + 1
        oi = oi + 1
        continue
      }

      if (is_text(o_instr) && starts_with(o_instr, "_nop_")) {
        if (is_array(m_instr)) {
          n = length(m_instr)
          instr_line = m_instr[n - 2]
          if (instr_line != last_line && instr_line != null) {
            src = get_source_line(instr_line)
            if (src != null) src = trim(src)
            if (last_line != null) log.compile("")
            if (src != null && length(src) > 0) {
              log.compile(`  --- line ${text(instr_line)}: ${src} ---`)
            }
            last_line = instr_line
          }
          log.compile(`  - ${pad_right(text(pc), 6)}${fmt_instr(m_instr)}`)
          log.compile(`  + ${pad_right(text(pc), 6)}${pad_right(o_instr, 45)} (eliminated)`)
        }
        mi = mi + 1
        oi = oi + 1
        pc = pc + 1
        continue
      }

      if (is_array(m_instr) && is_array(o_instr)) {
        kind = classify(m_instr, o_instr)
        if (kind != "identical") {
          n = length(m_instr)
          instr_line = m_instr[n - 2]
          if (instr_line != last_line && instr_line != null) {
            src = get_source_line(instr_line)
            if (src != null) src = trim(src)
            if (last_line != null) log.compile("")
            if (src != null && length(src) > 0) {
              log.compile(`  --- line ${text(instr_line)}: ${src} ---`)
            }
            last_line = instr_line
          }

          annotation = ""
          if (kind == "rewritten") {
            if (o_instr[0] == "concat" && m_instr[0] != "concat") {
              annotation = "(specialized)"
            } else {
              annotation = "(rewritten)"
            }
          }

          log.compile(`  - ${pad_right(text(pc), 6)}${fmt_instr(m_instr)}`)
          log.compile(`  + ${pad_right(text(pc), 6)}${fmt_instr(o_instr)} ${annotation}`)
        }
        pc = pc + 1
      }

      mi = mi + 1
      oi = oi + 1
    }

    return null
  }

  // Process functions
  main_name = mcode_ir.name != null ? mcode_ir.name : "<main>"

  if (mcode_ir.main != null && opt_ir.main != null) {
    if (fn_matches(-1, main_name)) {
      diff_function(mcode_ir.main, opt_ir.main, main_name, -1)
    }
  }

  if (mcode_ir.functions != null && opt_ir.functions != null) {
    fi = 0
    while (fi < length(mcode_ir.functions) && fi < length(opt_ir.functions)) {
      func = mcode_ir.functions[fi]
      opt_func = opt_ir.functions[fi]
      fname = func.name != null ? func.name : "<anonymous>"
      if (fn_matches(fi, fname)) {
        diff_function(func, opt_func, `[${text(fi)}] ${fname}`, fi)
      }
      fi = fi + 1
    }
  }

  if (show_summary) {
    log.compile(`\n  total: ${text(total_eliminated)} eliminated, ${text(total_rewritten)} rewritten across ${text(total_funcs)} functions`)
  }

  return null
}

run()
$stop()
265
disasm.ce
Normal file
@@ -0,0 +1,265 @@
// disasm.ce — source-interleaved disassembly
//
// Usage:
//   cell disasm <file>                Disassemble all functions (mcode)
//   cell disasm --optimized <file>    Disassemble optimized IR (streamline)
//   cell disasm --fn <N|name> <file>  Show only function N or named function
//   cell disasm --line <N> <file>     Show instructions from source line N

var fd = use("fd")
var shop = use("internal/shop")

var pad_right = function(s, w) {
  var r = s
  while (length(r) < w) {
    r = r + " "
  }
  return r
}

var fmt_val = function(v) {
  if (is_null(v)) return "null"
  if (is_number(v)) return text(v)
  if (is_text(v)) return `"${v}"`
  if (is_object(v)) return text(v)
  if (is_logical(v)) return v ? "true" : "false"
  return text(v)
}

var run = function() {
  var use_optimized = false
  var fn_filter = null
  var line_filter = null
  var filename = null
  var i = 0
  var compiled = null
  var source_text = null
  var source_lines = null
  var main_name = null
  var fi = 0
  var func = null
  var fname = null

  while (i < length(args)) {
    if (args[i] == '--optimized') {
      use_optimized = true
    } else if (args[i] == '--fn') {
      i = i + 1
      fn_filter = args[i]
    } else if (args[i] == '--line') {
      i = i + 1
      line_filter = number(args[i])
    } else if (args[i] == '--help' || args[i] == '-h') {
      log.console("Usage: cell disasm [--optimized] [--fn <N|name>] [--line <N>] <file>")
      log.console("")
      log.console("  --optimized    Use optimized IR (streamline) instead of raw mcode")
      log.console("  --fn <N|name>  Filter to function by index or name")
      log.console("  --line <N>     Show only instructions from source line N")
      return null
    } else if (!starts_with(args[i], '-')) {
      filename = args[i]
    }
    i = i + 1
  }

  if (!filename) {
    log.console("Usage: cell disasm [--optimized] [--fn <N|name>] [--line <N>] <file>")
    return null
  }

  // Compile
  if (use_optimized) {
    compiled = shop.compile_file(filename)
  } else {
    compiled = shop.mcode_file(filename)
  }

  // Read source file
  source_text = text(fd.slurp(filename))
  source_lines = array(source_text, "\n")

  // Helpers
  var get_source_line = function(line_num) {
    if (line_num < 1 || line_num > length(source_lines)) return null
    return source_lines[line_num - 1]
  }

  var first_instr_line = function(func) {
    var instrs = func.instructions
    var i = 0
    var n = 0
    if (instrs == null) return null
    while (i < length(instrs)) {
      if (is_array(instrs[i])) {
        n = length(instrs[i])
        return instrs[i][n - 2]
      }
      i = i + 1
    }
    return null
  }

  var func_has_line = function(func, target) {
    var instrs = func.instructions
    var i = 0
    var n = 0
    if (instrs == null) return false
    while (i < length(instrs)) {
      if (is_array(instrs[i])) {
        n = length(instrs[i])
        if (instrs[i][n - 2] == target) return true
      }
      i = i + 1
    }
    return false
  }

  var fn_matches = function(index, name) {
    var match = null
    if (fn_filter == null) return true
    if (index >= 0 && fn_filter == text(index)) return true
    if (name != null) {
      match = search(name, fn_filter)
      if (match != null && match >= 0) return true
    }
    return false
  }

  var func_name_by_index = function(fi) {
    var f = null
    if (compiled.functions == null) return null
    if (fi < 0 || fi >= length(compiled.functions)) return null
    f = compiled.functions[fi]
    return f.name
  }

  var dump_function = function(func, name, index) {
    var nr_args = func.nr_args != null ? func.nr_args : 0
    var nr_slots = func.nr_slots != null ? func.nr_slots : 0
    var nr_close = func.nr_close_slots != null ? func.nr_close_slots : 0
    var instrs = func.instructions
    var start_line = first_instr_line(func)
    var header = null
    var i = 0
    var pc = 0
    var instr = null
    var op = null
    var n = 0
    var parts = null
    var j = 0
    var operands = null
    var instr_line = null
    var last_line = null
    var src = null
    var line_str = null
    var instr_text = null
    var target_name = null

    header = `\n=== ${name} (args=${text(nr_args)}, slots=${text(nr_slots)}, closures=${text(nr_close)})`
    if (start_line != null) {
      header = header + ` [line ${text(start_line)}]`
    }
    header = header + " ==="
    log.compile(header)

    if (instrs == null || length(instrs) == 0) {
      log.compile("  (empty)")
      return null
    }

    while (i < length(instrs)) {
      instr = instrs[i]
      if (is_text(instr)) {
        if (!starts_with(instr, "_nop_") && line_filter == null) {
          log.compile(`  ${instr}:`)
        }
      } else if (is_array(instr)) {
        op = instr[0]
        n = length(instr)
        instr_line = instr[n - 2]

        if (line_filter != null && instr_line != line_filter) {
          pc = pc + 1
          i = i + 1
          continue
        }

        if (instr_line != last_line && instr_line != null) {
          src = get_source_line(instr_line)
          if (src != null) {
            src = trim(src)
          }
          if (last_line != null) {
            log.compile("")
          }
          if (src != null && length(src) > 0) {
            log.compile(`  --- line ${text(instr_line)}: ${src} ---`)
          } else {
            log.compile(`  --- line ${text(instr_line)} ---`)
          }
          last_line = instr_line
        }

        parts = []
        j = 1
        while (j < n - 2) {
          parts[] = fmt_val(instr[j])
          j = j + 1
        }
        operands = text(parts, ", ")
        line_str = instr_line != null ? `:${text(instr_line)}` : ""
        instr_text = `  ${pad_right(text(pc), 6)}${pad_right(op, 15)}${operands}`

        // Cross-reference for function creation instructions
        target_name = null
        if (op == "function" && n >= 5) {
          target_name = func_name_by_index(instr[2])
        }
        if (target_name != null) {
          instr_text = pad_right(instr_text, 65) + line_str + ` ; -> [${text(instr[2])}] ${target_name}`
        } else {
          instr_text = pad_right(instr_text, 65) + line_str
        }

        log.compile(instr_text)
        pc = pc + 1
      }
      i = i + 1
    }
    return null
  }

  // Process functions
  main_name = compiled.name != null ? compiled.name : "<main>"

  if (compiled.main != null) {
    if (fn_matches(-1, main_name)) {
      if (line_filter == null || func_has_line(compiled.main, line_filter)) {
        dump_function(compiled.main, main_name, -1)
      }
    }
  }

  if (compiled.functions != null) {
    fi = 0
    while (fi < length(compiled.functions)) {
      func = compiled.functions[fi]
      fname = func.name != null ? func.name : "<anonymous>"
      if (fn_matches(fi, fname)) {
|
||||
if (line_filter == null || func_has_line(func, line_filter)) {
|
||||
dump_function(func, `[${text(fi)}] ${fname}`, fi)
|
||||
}
|
||||
}
|
||||
fi = fi + 1
|
||||
}
|
||||
}
|
||||
|
||||
return null
|
||||
}
|
||||
|
||||
run()
|
||||
$stop()
|
||||
docs/.pages
@@ -1,12 +0,0 @@
nav:
  - index.md
  - tutorial.md
  - actors.md
  - rendering.md
  - resources.md
  - input.md
  - exporting.md
  - ...
  - Appendix A - dull: dull
  - Appendix B - api: api
docs/_index.md
@@ -0,0 +1,94 @@
---
title: "Documentation"
description: "ƿit language documentation"
type: "docs"
---

![ƿit](/logo.png)

ƿit is an actor-based scripting language for building concurrent applications. It combines a familiar C-like syntax with the actor model of computation, optimized for low memory usage and simplicity.

## Key Features

- **Actor Model** — isolated memory, message passing, no shared state
- **Immutability** — `stone()` makes values permanently frozen
- **Prototype Inheritance** — objects without classes
- **C Integration** — seamlessly extend with native code
- **Cross-Platform** — deploy to desktop, web, and embedded

## Quick Start

```javascript
// hello.ce - A simple actor
print("Hello, ƿit!")
$stop()
```

```bash
pit hello
```

## Language

- [**ƿit Language**](/docs/language/) — syntax, types, and operators
- [**Actors and Modules**](/docs/actors/) — the execution model
- [**Requestors**](/docs/requestors/) — asynchronous composition
- [**Packages**](/docs/packages/) — code organization and sharing
- [**Shop Architecture**](/docs/shop/) — module resolution, compilation, and caching

## Reference

- [**Built-in Functions**](/docs/functions/) — intrinsics reference
- [text](/docs/library/text/) — text conversion and manipulation
- [number](/docs/library/number/) — numeric conversion and operations
- [array](/docs/library/array/) — array creation and manipulation
- [object](/docs/library/object/) — object creation, prototypes, and serialization

## Standard Library

Modules loaded with `use()`:

- [blob](/docs/library/blob/) — binary data
- [time](/docs/library/time/) — time and dates
- [math](/docs/library/math/) — trigonometry and math
- [json](/docs/library/json/) — JSON encoding/decoding
- [random](/docs/library/random/) — random numbers

## Tools

- [**Command Line**](/docs/cli/) — the `pit` tool
- [**Semantic Index**](/docs/semantic-index/) — index and query symbols, references, and call sites
- [**Testing**](/docs/testing/) — writing and running tests
- [**Compiler Inspection**](/docs/compiler-tools/) — dump AST, mcode, and optimizer reports
- [**Writing C Modules**](/docs/c-modules/) — native extensions

## Architecture

ƿit programs are organized into **packages**. Each package contains:

- **Modules** (`.cm`) — return a value, cached and frozen
- **Actors** (`.ce`) — run independently, communicate via messages
- **C files** (`.c`) — compiled to native libraries
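As an illustration, a package on disk might be laid out like this (`cell.toml` and `internal/` come from the docs below; the other file names are hypothetical):

```
myapp/
  cell.toml      # package manifest; declares dependencies
  main.ce        # actor entry point
  utils.cm       # a module
  internal/      # files private to the package
```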

Actors never share memory. They communicate by sending messages, which are automatically serialized. This makes concurrent programming safe and predictable.

## Installation

```bash
# Clone and bootstrap
git clone https://gitea.pockle.world/john/cell
cd cell
make bootstrap
```

The ƿit shop is stored at `~/.cell/`.

## Development

After making changes, recompile with:

```bash
make
```

Run `cell --help` to see all available CLI flags.
docs/actors.md
@@ -1,77 +1,362 @@
-# Programs: Programs and Modules
+---
+title: "Actors and Modules"
+description: "The ƿit execution model"
+weight: 20
+type: "docs"
+---
+
-Prosperon organizes your code into two broad categories: **modules** and **programs**. Modules are used to extend programs with new functionality, while programs are used to spawn actors.
+ƿit organizes code into two types of scripts: **modules** (`.cm`) and **actors** (`.ce`).
+
-## Modules
+## The Actor Model
+
-A **module** is any file that returns a single value. This return value is commonly an object, but it can be any data type (string, number, function, etc.). Once a module returns its value, Prosperon **freezes** that value, preventing accidental modification. The module is then cached so that subsequent imports of the same module don’t re-run the file—they reuse the cached result.
+ƿit is built on the actor model of computation. Each actor:
+
-### Importing a Module
+- Has its own **isolated memory** — actors never share state
+- Runs to completion each **turn** — no preemption
+- Performs its own **garbage collection**
+- Communicates only through **message passing**
+
-Use the built-in `use` function to import a module by file path (or by name if resolvable via Prosperon’s path settings). For example:
+This isolation makes concurrent programming safer and more predictable.
+
-```
-var myModule = use('scripts/modules/myModule')
+## Modules (.cm)
+
+A module is a script that **returns a value**. The returned value is cached and frozen (made stone).
+
+```javascript
+// math_utils.cm
+var math = use('math/radians')
+
+var distance = function(x1, y1, x2, y2) {
+    var dx = x2 - x1
+    var dy = y2 - y1
+    return math.sqrt(dx * dx + dy * dy)
+}
+
+var midpoint = function(x1, y1, x2, y2) {
+    return {
+        x: (x1 + x2) / 2,
+        y: (y1 + y2) / 2
+    }
+}
+
+return {
+    distance: distance,
+    midpoint: midpoint
+}
 ```
 
-`use('module')` returns the **exact** same object if called multiple times, since modules are cached and not re-run.
+**Key properties:**
 
-Dull based modules are resolved by searching for them from the `prosperon.PATH` array. Engine modules are stored under `scripts/modules`, which is already added to the PATH for you.
+- **Must return a value** — it's an error not to
+- **Executed once per actor** — subsequent `use()` calls return the cached value
+- **Return value is stone** — immutable, safe to share
+- Modules can import other modules with `use()`
 
-Prosperon can also load C based modules. If two modules have the same path resolution, the C based library will be imported.
+### Using Modules
 
-## Programs
+```javascript
+var utils = use('math_utils')
+var d = utils.distance(0, 0, 3, 4) // 5
+```
 
-An **program** is a file that **does not** return a value. Instead, the file’s contents run top to bottom as soon as the program is spawned. Programs are your game’s “live” scripts: each program can hold its own state and logic, spawn sub-programs, schedule timed tasks, and eventually **kill** itself (or be killed) when it’s done.
+## Actors (.ce)
 
-### Program Intrinsic Functions
+An actor is a script that **does not return a value**. It runs as an independent unit of execution.
 
-Certain functions are intrinsic to the program and cannot be overridden. They’re assigned to each new program instance at spawn time:
+```javascript
+// worker.ce
+print("Worker started")
 
-1. **`spawn(script, config, callback)`**
-   Creates (spawns) a new program from another script file.
-   - **`script`**: Path to the program script (a file containing statements, not returning anything).
-   - **`config`**: Optional object of extra properties to assign to the new program.
-   - **`callback(underling, info)`**: Optional function invoked right after the program is instantiated but before it fully initializes.
+$receiver(function(msg) {
+    print("Received:", msg)
+    send(msg, {status: "ok"})
+})
+```
 
-   The newly spawned program:
-   - Receives a reference to its parent (the `overling`) and can store child programs (the `underlings`).
-   - Automatically calls `awake()` if that function is defined, after basic setup completes.
-   - Registers any recognized event handlers (like `update`, `draw`, etc.) if they exist.
+**Key properties:**
 
-2. **`kill()`**
-   Destroys the program, all of its timers, and recursively kills any underling (child) programs. If the program has a parent, it is removed from the parent’s `underlings` set.
+- **Must not return a value** — it's an error to return
+- Has access to **actor intrinsics** (functions starting with `$`)
+- Runs until explicitly stopped or crashes
 
-3. **`delay(fn, seconds)`**
-   Runs the given function `fn` after `seconds`. This is implemented under the hood with a timer that automatically clears itself once it fires.
-   - **Example**:
-     ```js
-     this.delay(_ => {
-       log.console("3 seconds later!")
-     }, 3)
-     ```
+## Actor Intrinsics
 
-4. **`clear()`**
-   Recursively kills all child programs, clearing your immediate `underlings` set. This is not called automatically. You can use it to manually clean up all children without necessarily killing the program itself.
+Actors have access to special functions prefixed with `$`:
 
-### The program Lifecycle
+### $self
 
-Specific hooks can be set on a program when it is initialized.
+Reference to the current actor. This is a stone (immutable) actor object.
 
-- **Awake**: If the new program defines `awake()`, Prosperon calls it after the script finishes its top-level execution. This is a common place to do initialization.
-- **Garbage**: When the program is killed, if it has a `garbage()` function, Prosperon calls it before final removal.
-- **Then**: If the program has a `then()` function, Prosperon calls it at the very end of the kill process, allowing any final statements after your `garbage()` logic completes.
-- **Registration**: In addition, if the object has **any** function named the same thing as a hook created with **prosperon.on**, that function will be registered with it after initialization.
+```javascript
+print($self) // actor reference
+print(is_actor($self)) // true
+```
 
-### Overlings and Underlings
+### $overling
 
-Programs have access to its creator and other programs created underneath it, termed its overling and underlings.
+Reference to the parent actor that started this actor. `null` for the root actor. Child actors are automatically coupled to their overling — if the parent dies, the child dies too.
 
-- **`this.overling`** is the parent program that spawned the current one.
-- **`this.underlings`** is a set of child programs that the current program has spawned.
+```javascript
+if ($overling != null) {
+    send($overling, {status: "ready"})
+}
+```
 
-Killing a parent automatically kills all of its underlings, which in turn can kill their own underlings, and so on.
+### $stop()
 
-## Program Documentation
+Stop the current actor. When called with an actor argument, stops that underling (child) instead.
 
-Prosperon includes a module called `doc.js` which helps generate documentation for your modules and programs. Any function and value can be assigned a docstring, and prosperon will then be able to generate documentation for it via doc.js. Look under the module API for more info.
+```javascript
+$stop() // stop self
+$stop(child) // stop a child actor
+```

**Important:** `$stop()` does not halt execution immediately. Code after the call continues running in the current turn — it only prevents the actor from receiving future messages. Structure your code so that nothing runs after `$stop()`, or use `return` to exit the current function first.

```javascript
// Wrong — code after $stop() still runs
if (done) $stop()
do_more_work() // this still executes!

// Right — return after $stop()
if (done) { $stop(); return }
do_more_work()
```

### $start(callback, program)

Start a new child actor from a script. The callback receives lifecycle events:

- `{type: "greet", actor: <ref>}` — child started successfully
- `{type: "stop"}` — child stopped cleanly
- `{type: "disrupt", reason: ...}` — child crashed

```javascript
$start(function(event) {
    if (event.type == 'greet') {
        print("Child started:", event.actor)
        send(event.actor, {task: "work"})
    }
    if (event.type == 'stop') {
        print("Child stopped")
    }
    if (event.type == 'disrupt') {
        print("Child crashed:", event.reason)
    }
}, "worker")
```

### $delay(callback, seconds)

Schedule a callback after a delay. Returns a cancel function that can be called to prevent the callback from firing.

```javascript
var cancel = $delay(function() {
    print("5 seconds later")
}, 5)

// To cancel before it fires:
cancel()
```

### $clock(callback)

Register a callback that is invoked every frame/tick. The callback receives the current time as a number.

```javascript
$clock(function(t) {
    // called each tick with current time
})
```

### $receiver(callback)

Set up a message receiver. The callback is called with the incoming message whenever another actor sends a message to this actor.

To reply to a message, call `send(message, reply_data)` — the message object contains routing information that directs the reply back to the sender.

```javascript
$receiver(function(message) {
    // handle incoming message
    send(message, {status: "ok"})
})
```

### $portal(callback, port)

Open a network port to receive connections from remote actors.

```javascript
$portal(function(connection) {
    // handle new connection
}, 8080)
```

### $contact(callback, record)

Connect to a remote actor at a given address.

```javascript
$contact(function(connection) {
    // connected
}, {host: "example.com", port: 80})
```

### $time_limit(requestor, seconds)

Wrap a requestor with a timeout. Returns a new requestor that will cancel the original and call its callback with a failure if the time limit is exceeded. See [Requestors](/docs/requestors/) for details.

```javascript
var timed = $time_limit(my_requestor, 10)

timed(function(result, reason) {
    // reason will explain the timeout if it fires
}, initial_value)
```

### $couple(actor)

Couple the current actor to another actor. When the coupled actor dies, the current actor also dies. Coupling is automatic between a child actor and its overling (parent).

```javascript
$couple(other_actor)
```

### $unneeded(callback, seconds)

Schedule the actor for removal after a specified time. The callback fires when the time elapses.

```javascript
$unneeded(function() {
    // cleanup before removal
}, 30)
```

### $connection(callback, actor, config)

Get information about the connection to another actor. For local actors, returns `{type: "local"}`. For remote actors, returns connection details including latency, bandwidth, and activity.

```javascript
$connection(function(info) {
    if (info.type == "local") {
        print("same machine")
    } else {
        print(info.latency)
    }
}, other_actor, {})
```

## Runtime Functions

These functions are available in actors without the `$` prefix:

### send(actor, message, callback)

Send a message to another actor. The message must be an object record.

The optional callback receives the reply when the recipient responds.

```javascript
send(other_actor, {type: "ping"}, function(reply) {
    print("Got reply:", reply)
})
```

To reply to a received message, pass the message itself as the first argument — it contains routing information:

```javascript
$receiver(function(message) {
    send(message, {result: 42})
})
```

Messages are automatically flattened to plain data.

### is_actor(value)

Returns `true` if the value is an actor reference.

```javascript
if (is_actor(some_value)) {
    send(some_value, {ping: true})
}
```

### log

Channel-based logging. Any `log.X(value)` writes to channel `"X"`. Three channels are conventional: `log.console(msg)`, `log.error(msg)`, `log.system(msg)` — but any name works.

Channels are routed to configurable **sinks** (console or file) defined in `.cell/log.toml`. See [Logging](/docs/logging/) for the full guide.
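
As a sketch, using the conventional channels plus an arbitrary one (the custom channel name here is illustrative):

```javascript
log.console("server starting")   // conventional console channel
log.error("bad request")         // conventional error channel
log.metrics({requests: 10})      // any name works; this writes to channel "metrics"
```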

### use(path)

Import a module. See [Module Resolution](#module-resolution) below.

### args

Array of command-line arguments passed to the actor.
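
A minimal sketch (assuming `args` holds only the user-supplied arguments, and `length` as used elsewhere in these docs):

```javascript
// invoked as: pit worker alpha beta
if (length(args) > 0) {
    print("first argument:", args[0])
}
```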

### sequence(), parallel(), race(), fallback()

Requestor composition functions. See [Requestors](/docs/requestors/) for details.
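
As a hedged sketch, assuming each composer takes an array of requestors and returns a requestor invoked like the `$time_limit` example (the step names are hypothetical):

```javascript
var pipeline = sequence([load_config, open_socket])

pipeline(function(result, reason) {
    if (result == null) log.error(reason)
}, initial_value)
```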

## Module Resolution

When you call `use('name')`, ƿit searches:

1. **Current package** — files relative to package root
2. **Dependencies** — packages declared in `cell.toml`
3. **Core** — built-in ƿit modules

```javascript
// From within package 'myapp':
use('utils')         // myapp/utils.cm
use('helper/math')   // myapp/helper/math.cm
use('json')          // core json module
use('otherlib/foo')  // dependency 'otherlib', file foo.cm
```

Files in the `internal/` directory are private to the package.
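
Because module results are cached per actor, repeated imports observe the same stone value (a sketch; nothing here re-runs the file):

```javascript
var a = use('utils')
var b = use('utils')   // cached: utils.cm is not executed again
// a and b refer to the same frozen object
```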

## Example: Simple Actor System

```javascript
// main.ce - Entry point
var config = use('config')

print("Starting application...")

$start(function(event) {
    if (event.type == 'greet') {
        send(event.actor, {task: "process", data: [1, 2, 3]})
    }
    if (event.type == 'stop') {
        print("Worker finished")
        $stop()
    }
}, "worker")

$delay(function() {
    print("Shutting down")
    $stop()
}, 10)
```

```javascript
// worker.ce - Worker actor
$receiver(function(msg) {
    if (msg.task == "process") {
        var result = array(msg.data, function(x) { return x * 2 })
        send(msg, {result: result})
    }
    $stop()
})
```

```javascript
// config.cm - Shared configuration
return {
    debug: true,
    timeout: 30
}
```
@@ -1,11 +0,0 @@
|
||||
# actor
|
||||
|
||||
### toString() <sub>function</sub>
|
||||
|
||||
### spawn(script, config, callback) <sub>function</sub>
|
||||
|
||||
### clear() <sub>function</sub>
|
||||
|
||||
### kill() <sub>function</sub>
|
||||
|
||||
### delay(fn, seconds) <sub>function</sub>
|
||||
@@ -1,37 +0,0 @@
|
||||
# console
|
||||
|
||||
The console object provides various logging, debugging, and output methods.
|
||||
|
||||
### print() <sub>function</sub>
|
||||
|
||||
### spam(msg) <sub>function</sub>
|
||||
|
||||
Output a spam-level message for very verbose logging.
|
||||
|
||||
### debug(msg) <sub>function</sub>
|
||||
|
||||
Output a debug-level message.
|
||||
|
||||
### info(msg) <sub>function</sub>
|
||||
|
||||
Output info level message.
|
||||
|
||||
### warn(msg) <sub>function</sub>
|
||||
|
||||
Output warn level message.
|
||||
|
||||
### log(msg) <sub>function</sub>
|
||||
|
||||
Output directly to in game console.
|
||||
|
||||
### error(e) <sub>function</sub>
|
||||
|
||||
Output error level message, and print stacktrace.
|
||||
|
||||
### panic(e) <sub>function</sub>
|
||||
|
||||
Output a panic-level message and exit the program.
|
||||
|
||||
### assert(op, str = `assertion failed [value '${op}']`) <sub>function</sub>
|
||||
|
||||
If the condition is false, print an error and panic.
|
||||
@@ -1,5 +0,0 @@
|
||||
# Appendix B - api
|
||||
|
||||
This is a complete list of accessible functions and parameters that are built into Prosperon. For the most part, developers will concern themselves with the modules, all of which can be imported with `use`.
|
||||
|
||||
Types document particular javascript objects with a specific object in their prototype chain, which can allow access to an underlying C data structure. A lot of these are used only internally by Prosperon, but brave developers can pick around in the module internals to see how they're used and do their own thing if they want!
|
||||
@@ -1,87 +0,0 @@
|
||||
# actor
|
||||
|
||||
|
||||
A set of utilities for iterating over a hierarchy of actor-like objects, as well
|
||||
as managing tag-based lookups. Objects are assumed to have a "objects" property,
|
||||
pointing to children or sub-objects, forming a tree.
|
||||
|
||||
|
||||
### all_objects(fn, startobj) <sub>function</sub>
|
||||
|
||||
|
||||
Iterate over each object (and its sub-objects) in the hierarchy, calling fn for each one.
|
||||
|
||||
|
||||
**fn**: A callback function that receives each object. If it returns a truthy value, iteration stops and that value is returned.
|
||||
|
||||
**startobj**: The root object at which iteration begins, default is the global "world".
|
||||
|
||||
|
||||
**Returns**: The first truthy value returned by fn, or undefined if none.
|
||||
|
||||
|
||||
### find_object(fn, startobj) <sub>function</sub>
|
||||
|
||||
|
||||
Intended to find a matching object within the hierarchy.
|
||||
|
||||
|
||||
**fn**: A callback or criteria to locate a particular object.
|
||||
|
||||
**startobj**: The root object at which search begins, default "world".
|
||||
|
||||
|
||||
**Returns**: Not yet implemented.
|
||||
|
||||
|
||||
### tag_add(tag, obj) <sub>function</sub>
|
||||
|
||||
|
||||
Associate the given object with the specified tag. Creates a new tag set if it does not exist.
|
||||
|
||||
|
||||
**tag**: A string tag to associate with the object.
|
||||
|
||||
**obj**: The object to add under this tag.
|
||||
|
||||
|
||||
**Returns**: None
|
||||
|
||||
|
||||
### tag_rm(tag, obj) <sub>function</sub>
|
||||
|
||||
|
||||
Remove the given object from the specified tag’s set, if it exists.
|
||||
|
||||
|
||||
**tag**: The tag to remove the object from.
|
||||
|
||||
**obj**: The object to remove from the tag set.
|
||||
|
||||
|
||||
**Returns**: None
|
||||
|
||||
|
||||
### tag_clear_guid(obj) <sub>function</sub>
|
||||
|
||||
|
||||
Remove the object from all tag sets.
|
||||
|
||||
|
||||
**obj**: The object whose tags should be cleared.
|
||||
|
||||
|
||||
**Returns**: None
|
||||
|
||||
|
||||
### objects_with_tag(tag) <sub>function</sub>
|
||||
|
||||
|
||||
Retrieve all objects currently tagged with the specified tag.
|
||||
|
||||
|
||||
**tag**: A string tag to look up.
|
||||
|
||||
|
||||
**Returns**: An array of objects associated with the given tag.
|
||||
|
||||
@@ -1,46 +0,0 @@
|
||||
# camera
|
||||
|
||||
### list() <sub>function</sub>
|
||||
|
||||
Return an array of available camera device IDs.
|
||||
|
||||
|
||||
|
||||
**Returns**: An array of camera IDs, or undefined if no cameras are available.
|
||||
|
||||
|
||||
### open(id) <sub>function</sub>
|
||||
|
||||
Open a camera device with the given ID.
|
||||
|
||||
|
||||
|
||||
**id**: The camera ID to open.
|
||||
|
||||
|
||||
**Returns**: A camera object on success, or throws an error if the camera cannot be opened.
|
||||
|
||||
|
||||
### name(id) <sub>function</sub>
|
||||
|
||||
Return the name of the camera with the given ID.
|
||||
|
||||
|
||||
|
||||
**id**: The camera ID to query.
|
||||
|
||||
|
||||
**Returns**: A string with the camera's name, or throws an error if the name cannot be retrieved.
|
||||
|
||||
|
||||
### position(id) <sub>function</sub>
|
||||
|
||||
Return the physical position of the camera with the given ID.
|
||||
|
||||
|
||||
|
||||
**id**: The camera ID to query.
|
||||
|
||||
|
||||
**Returns**: A string indicating the camera position ("unknown", "front", or "back").
|
||||
|
||||
@@ -1,7 +0,0 @@
|
||||
# cmd
|
||||
|
||||
### length <sub>number</sub>
|
||||
|
||||
### name <sub>string</sub>
|
||||
|
||||
### prototype <sub>object</sub>
|
||||
@@ -1,7 +0,0 @@
|
||||
# color
|
||||
|
||||
### Color <sub>object</sub>
|
||||
|
||||
### esc <sub>object</sub>
|
||||
|
||||
### ColorMap <sub>object</sub>
|
||||
@@ -1,76 +0,0 @@
|
||||
# debug
|
||||
|
||||
### stack_depth() <sub>function</sub>
|
||||
|
||||
Return the current stack depth.
|
||||
|
||||
|
||||
|
||||
**Returns**: A number representing the stack depth.
|
||||
|
||||
|
||||
### build_backtrace() <sub>function</sub>
|
||||
|
||||
Build and return a backtrace of the current call stack.
|
||||
|
||||
|
||||
|
||||
**Returns**: An object representing the call stack backtrace.
|
||||
|
||||
|
||||
### closure_vars(fn) <sub>function</sub>
|
||||
|
||||
Return the closure variables for a given function.
|
||||
|
||||
|
||||
|
||||
**fn**: The function object to inspect.
|
||||
|
||||
|
||||
**Returns**: An object containing the closure variables.
|
||||
|
||||
|
||||
### local_vars(depth) <sub>function</sub>
|
||||
|
||||
Return the local variables for a specific stack frame.
|
||||
|
||||
|
||||
|
||||
**depth**: The stack frame depth to inspect.
|
||||
|
||||
|
||||
**Returns**: An object containing the local variables at the specified depth.
|
||||
|
||||
|
||||
### fn_info(fn) <sub>function</sub>
|
||||
|
||||
Return metadata about a given function.
|
||||
|
||||
|
||||
|
||||
**fn**: The function object to inspect.
|
||||
|
||||
|
||||
**Returns**: An object with metadata about the function.
|
||||
|
||||
|
||||
### backtrace_fns() <sub>function</sub>
|
||||
|
||||
Return an array of functions in the current backtrace.
|
||||
|
||||
|
||||
|
||||
**Returns**: An array of function objects from the call stack.
|
||||
|
||||
|
||||
### dump_obj(obj) <sub>function</sub>
|
||||
|
||||
Return a string representation of a given object.
|
||||
|
||||
|
||||
|
||||
**obj**: The object to dump.
|
||||
|
||||
|
||||
**Returns**: A string describing the object's contents.
|
||||
|
||||
@@ -1,39 +0,0 @@
|
||||
# dmon
|
||||
|
||||
### watch() <sub>function</sub>
|
||||
|
||||
Start watching the root directory, recursively.
|
||||
|
||||
This function begins monitoring the specified directory and its subdirectories recursively for events such as file creation, deletion, modification, or movement. Events are queued and can be retrieved by calling poll.
|
||||
|
||||
:throws: An error if dmon is already watching.
|
||||
|
||||
|
||||
**Returns**: None
|
||||
|
||||
|
||||
### unwatch() <sub>function</sub>
|
||||
|
||||
Stop watching the currently monitored directory.
|
||||
|
||||
This function halts filesystem monitoring for the directory previously set by watch. It clears the watch state, allowing a new watch to be started.
|
||||
|
||||
:throws: An error if no directory is currently being watched.
|
||||
|
||||
|
||||
**Returns**: None
|
||||
|
||||
|
||||
### poll(callback) <sub>function</sub>
|
||||
|
||||
Retrieve and process queued filesystem events.
|
||||
|
||||
This function dequeues all pending filesystem events and invokes the provided callback for each one. The callback receives an event object with properties: 'action' (string: "create", "delete", "modify", or "move"), 'root' (string: watched directory), 'file' (string: affected file path), and 'old' (string: previous file path for move events, empty if not applicable).
|
||||
|
||||
|
||||
|
||||
**callback**: A function to call for each event, receiving an event object as its argument.
|
||||
|
||||
|
||||
**Returns**: None
|
||||
|
||||
@@ -1,38 +0,0 @@

# doc

Provides a consistent way to create documentation for prosperon elements. Objects are documented by adding docstrings directly to object-like things (functions, objects, ...), or to an object's own "doc object".

Docstrings are set on the symbol `cell.DOC`.

```js
// Suppose we have a module that returns a function
function greet(name) { log.console("Hello, " + name) }

// We can attach a docstring
greet.doc = `
Greets the user by name.
`

// A single function is a valid return!
return greet
```

```js
// Another way is to add a docstring object to an object
var greet = {
  hello() { log.console('hello!') }
}

greet[cell.DOC] = {}
greet[cell.DOC][cell.DOC] = 'An object full of different greeter functions'
greet[cell.DOC].hello = 'A greeter that says, "hello!"'
```

**name**: The name of the person to greet.

### writeDocFile(obj, title) <sub>function</sub>

Return a markdown string for the given obj, with an optional title.
@@ -1,228 +0,0 @@

# draw2d

A collection of 2D drawing functions that operate in screen space. Provides primitives for lines, rectangles, text, sprite drawing, etc.

### point(pos, size, color) <sub>function</sub>

**pos**: A 2D position ([x, y]) where the point should be drawn.

**size**: The size of the point (not currently affecting rendering).

**color**: The color of the point, defaults to Color.blue.

**Returns**: None

### line(points, color, thickness, pipeline) <sub>function</sub>

**points**: An array of 2D positions representing the line vertices.

**color**: The color of the line, default Color.white.

**thickness**: The line thickness, default 1.

**pipeline**: (Optional) A pipeline or rendering state object.

**Returns**: None

### cross(pos, size, color, thickness, pipe) <sub>function</sub>

**pos**: The center of the cross as a 2D position ([x, y]).

**size**: Half the length of each cross arm.

**color**: The color of the cross, default Color.red.

**thickness**: The thickness of each line, default 1.

**pipe**: (Optional) A pipeline or rendering state object.

**Returns**: None

### arrow(start, end, color, wingspan, wingangle, pipe) <sub>function</sub>

**start**: The start position of the arrow ([x, y]).

**end**: The end (tip) position of the arrow ([x, y]).

**color**: The color, default Color.red.

**wingspan**: The length of each arrowhead 'wing', default 4.

**wingangle**: Wing rotation in degrees, default 10.

**pipe**: (Optional) A pipeline or rendering state object.

**Returns**: None
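
For intuition, the two arrowhead wing positions implied by 'wingspan' and 'wingangle' might be derived as below. This is an illustrative guess at the geometry, not the engine's actual code, and `wings` is a hypothetical helper name:

```js
// Each wing sits 'wingspan' units from the tip, rotated +/- 'wingangle'
// degrees off the tip-to-tail direction of the shaft.
function wings(start, end, wingspan, wingangle) {
  const dir = Math.atan2(start[1] - end[1], start[0] - end[0]); // tip -> tail
  const rad = wingangle * Math.PI / 180;
  return [dir + rad, dir - rad].map(a => [
    end[0] + wingspan * Math.cos(a),
    end[1] + wingspan * Math.sin(a),
  ]);
}
```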

### rectangle(rect, color, pipeline) <sub>function</sub>

**rect**: A rectangle object with {x, y, width, height}.

**color**: The fill color, default Color.white.

**pipeline**: (Optional) A pipeline or rendering state object.

**Returns**: None

### tile(image, rect, color, tile, pipeline) <sub>function</sub>

:raises Error: If no image is provided.

**image**: An image object or string path to a texture.

**rect**: A rectangle specifying draw location/size ({x, y, width, height}).

**color**: The color tint, default Color.white.

**tile**: A tiling definition ({repeat_x, repeat_y}), default tile_def.

**pipeline**: (Optional) A pipeline or rendering state object.

**Returns**: None

### slice9(image, rect, slice, color, info, pipeline) <sub>function</sub>

:raises Error: If no image is provided.

**image**: An image object or string path to a texture.

**rect**: A rectangle specifying draw location/size, default [0, 0].

**slice**: The pixel inset or spacing for the 9-slice (number or object).

**color**: The color tint, default Color.white.

**info**: A slice9 info object controlling tiling of edges/corners.

**pipeline**: (Optional) A pipeline or rendering state object.

**Returns**: None

### image(image, rect, rotation, color, pipeline) <sub>function</sub>

:raises Error: If no image is provided.

**image**: An image object or string path to a texture.

**rect**: A rectangle specifying draw location/size, default [0,0]; width/height default to image size.

**rotation**: Rotation in degrees (not currently used).

**color**: The color tint, default none.

**pipeline**: (Optional) A pipeline or rendering state object.

**Returns**: A sprite object that was created for this draw call.

### images(image, rects, config) <sub>function</sub>

:raises Error: If no image is provided.

**image**: An image object or string path to a texture.

**rects**: An array of rectangle objects ({x, y, width, height}) to draw.

**config**: (Unused) Additional config data if needed.

**Returns**: An array of sprite objects created and queued for rendering.

### sprites(sprites, sort, pipeline) <sub>function</sub>

**sprites**: An array of sprite objects to draw.

**sort**: Sorting mode or order, default 0.

**pipeline**: (Optional) A pipeline or rendering state object.

**Returns**: None

### circle(pos, radius, color, inner_radius, pipeline) <sub>function</sub>

**pos**: Center of the circle ([x, y]).

**radius**: The circle radius.

**color**: The fill color of the circle, default none.

**inner_radius**: (Unused) Possibly ring thickness, default 1.

**pipeline**: (Optional) A pipeline or rendering state object.

**Returns**: None

### text(text, rect, font, size, color, wrap, pipeline) <sub>function</sub>

**text**: The string to draw.

**rect**: A rectangle specifying draw position (and possibly wrapping area).

**font**: A font object or string path, default sysfont.

**size**: (Unused) Possibly intended for scaling the font size.

**color**: The text color, default Color.white.

**wrap**: Pixel width for text wrapping, default 0 (no wrap).

**pipeline**: (Optional) A pipeline or rendering state object.

**Returns**: None
@@ -1,45 +0,0 @@

# enet

### initialize() <sub>function</sub>

Initialize the ENet library. Must be called before using any ENet functionality. Throws an error if initialization fails.

**Returns**: None

### deinitialize() <sub>function</sub>

Deinitialize the ENet library, cleaning up all resources. Call this when you no longer need any ENet functionality.

**Returns**: None

### create_host(address) <sub>function</sub>

Create an ENet host: either a client-like unbound host or a server bound to a specific address and port.

- If no argument is provided, creates an unbound "client-like" host with default settings (maximum 32 peers, 2 channels, unlimited bandwidth).
- If you pass an "ip:port" string (e.g. "127.0.0.1:7777"), it creates a server bound to that address. The server supports up to 32 peers, 2 channels, and unlimited bandwidth.

Throws an error if host creation fails for any reason.

**address**: (Optional) A string in 'ip:port' format to bind the host (server), or omit to create an unbound client-like host.

**Returns**: An ENetHost object.
@@ -1,25 +0,0 @@

# event

### push_event(event) <sub>function</sub>

Push a custom user event into SDL's queue, passing a callback function.

**event**: A function to call when this event is consumed.

**Returns**: None

### engine_input(callback) <sub>function</sub>

Poll all system events (keyboard, mouse, etc.) and call the given function with each event object.

**callback**: A function that executes on each event consumed from the poll.

**Returns**: None
@@ -1,221 +0,0 @@

# geometry

A collection of geometry-related functions for circles, spheres, boxes, polygons, and rectangle utilities. Some functionality is implemented in C and exposed here.

### rect_intersection(a, b) <sub>function</sub>

Return the intersection of two rectangles. The result may be empty if there is no intersection.

**a**: The first rectangle as {x, y, w, h}.

**b**: The second rectangle as {x, y, w, h}.

**Returns**: A rectangle that is the intersection of the two. May have zero width/height if there is no overlap.

### rect_intersects(a, b) <sub>function</sub>

**a**: Rectangle {x,y,w,h}.

**b**: Rectangle {x,y,w,h}.

**Returns**: A boolean indicating whether the two rectangles overlap.
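
A minimal sketch of the two contracts above, assuming rects are plain {x, y, w, h} objects. This is illustrative, not the engine's C source:

```js
// Intersection of two axis-aligned rects; an empty result is signaled
// by zero width/height, matching the documented contract.
function rect_intersection(a, b) {
  const x = Math.max(a.x, b.x), y = Math.max(a.y, b.y);
  const w = Math.min(a.x + a.w, b.x + b.w) - x;
  const h = Math.min(a.y + a.h, b.y + b.h) - y;
  return { x, y, w: Math.max(0, w), h: Math.max(0, h) };
}

// Overlap test built on the intersection.
function rect_intersects(a, b) {
  const i = rect_intersection(a, b);
  return i.w > 0 && i.h > 0;
}
```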

### rect_expand(a, b) <sub>function</sub>

Merge two rectangles, returning their bounding rectangle.

**a**: Rectangle {x,y,w,h}.

**b**: Rectangle {x,y,w,h}.

**Returns**: A new rectangle that covers the bounds of both input rectangles.

### rect_inside(inner, outer) <sub>function</sub>

**inner**: A rectangle to test.

**outer**: A rectangle that may contain 'inner'.

**Returns**: True if 'inner' is completely inside 'outer', otherwise false.

### rect_random(rect) <sub>function</sub>

**rect**: A rectangle {x,y,w,h}.

**Returns**: A random point within the rectangle (uniform distribution).

### cwh2rect(center, wh) <sub>function</sub>

Helper: convert a center point and width/height vector to a rect object.

**center**: A 2D point [cx, cy].

**wh**: A 2D size [width, height].

**Returns**: A rectangle {x, y, w, h} with x,y set to the center and w,h set to the given size.

### rect_point_inside(rect, point) <sub>function</sub>

**rect**: A rectangle {x,y,w,h}.

**point**: A 2D point [px, py].

**Returns**: True if the point lies inside the rectangle, otherwise false.

### rect_pos(rect) <sub>function</sub>

**rect**: A rectangle {x,y,w,h}.

**Returns**: A 2D vector [x,y] giving the rectangle's position.

### rect_move(rect, offset) <sub>function</sub>

**rect**: A rectangle {x,y,w,h}.

**offset**: A 2D vector to add to the rectangle's position.

**Returns**: A new rectangle with its x,y offset by the given vector.

### box(w, h) <sub>function</sub>

Construct a box centered at the origin with the given width and height. This overrides the box object above.

**w**: The width of the box.

**h**: The height of the box.

**Returns**: An array of four 2D points representing the corners of a rectangle centered at [0,0].

### sphere <sub>object</sub>

Sphere-related geometry functions:
- volume(r): Return the volume of a sphere with radius r.
- random(r, theta, phi): Return a random point on or inside a sphere.

### circle <sub>object</sub>

Circle-related geometry functions:
- area(r): Return the area of a circle with radius r.
- random(r, theta): Return a random 2D point on a circle; uses sphere.random internally and extracts x,z.

### ngon(radius, n) <sub>function</sub>

Generates a regular n-gon by calling geometry.arc with a full 360 degrees.

**radius**: The radius of the n-gon from center to each vertex.

**n**: Number of sides/vertices.

**Returns**: An array of 2D points forming a regular n-gon.

### arc(radius, angle, n, start) <sub>function</sub>

Generate an arc (or partial circle) of n points, spread equally over 'angle' degrees starting from 'start'.

**radius**: The distance from center to the arc points.

**angle**: The total angle (in degrees) over which points are generated, capped at 360.

**n**: Number of segments (if <= 1, an empty array is returned).

**start**: Starting angle (in degrees), default 0.

**Returns**: An array of 2D points along the arc.
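
The arc/ngon relationship can be sketched as follows. One reading of the spec is assumed here: points are placed every angle/n degrees, so a full circle does not repeat its first vertex; treat the exact spacing as an assumption, not the engine's precise behavior:

```js
// Sketch of arc(): n points spread over 'angle' degrees from 'start'.
function arc(radius, angle, n, start = 0) {
  if (n <= 1) return []; // documented: <= 1 segments yields an empty array
  angle = Math.min(angle, 360); // documented: capped at 360
  const pts = [];
  for (let i = 0; i < n; i++) {
    const a = (start + (i * angle) / n) * Math.PI / 180;
    pts.push([radius * Math.cos(a), radius * Math.sin(a)]);
  }
  return pts;
}

// ngon is just a full-circle arc.
const ngon = (radius, n) => arc(radius, 360, n);
```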

### corners2points(ll, ur) <sub>function</sub>

Similar to box.points, but calculates differently.

**ll**: Lower-left 2D coordinate.

**ur**: Upper-right 2D coordinate (relative offset in x,y).

**Returns**: A four-point array of corners [ll, lower-right, upper-right, upper-left].

### sortpointsccw(points) <sub>function</sub>

Sort an array of points in CCW order based on their angles from the centroid.

**points**: An array of 2D points.

**Returns**: A new array of the same points, sorted counterclockwise around their centroid.

### points2cm(points) <sub>function</sub>

**points**: An array of 2D points.

**Returns**: The centroid (average x,y) of the given points.
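
Both utilities can be sketched directly from their descriptions (illustrative only, not the engine source):

```js
// points2cm: average the coordinates to get the centroid.
function points2cm(points) {
  const s = points.reduce((acc, p) => [acc[0] + p[0], acc[1] + p[1]], [0, 0]);
  return [s[0] / points.length, s[1] / points.length];
}

// sortpointsccw: order by angle around the centroid; ascending atan2
// runs counterclockwise.
function sortpointsccw(points) {
  const c = points2cm(points);
  const ang = p => Math.atan2(p[1] - c[1], p[0] - c[0]);
  return [...points].sort((a, b) => ang(a) - ang(b));
}
```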
@@ -1,278 +0,0 @@

# graphics

Provides functionality for loading and managing images, fonts, textures, and sprite meshes. Includes both JavaScript and C-implemented routines for creating geometry buffers, performing rectangle packing, etc.

### make_sprite_mesh(sprites) <sub>function</sub>

Given an array of sprites, build a single geometry mesh for rendering them.

:param oldMesh (optional): An existing mesh object to reuse/resize if possible.

**sprites**: An array of sprite objects, each containing .rect (or transform), .src (UV region), .color, etc.

**Returns**: A GPU mesh object with pos, uv, color, and indices buffers for all sprites.

### make_sprite_queue(sprites, camera, pipeline, sort) <sub>function</sub>

Given an array of sprites, optionally sort them, then build a queue of pipeline commands. Each group with a shared image becomes one command.

**sprites**: An array of sprite objects.

**camera**: (Unused) A camera or transform, presumably intended for sorting.

**pipeline**: A pipeline object for rendering.

**sort**: An integer or boolean for whether to sort sprites; if truthy, sorts by layer & texture.

**Returns**: An array of pipeline commands: geometry with mesh references, grouped by image.

### make_text_buffer(text, rect, angle, color, wrap, font) <sub>function</sub>

Generate a GPU buffer mesh of text quads for rendering with a font.

**text**: The string to render.

**rect**: A rectangle specifying position and possibly wrapping.

**angle**: Rotation angle (unused or optional).

**color**: A color for the text (could be a vec4).

**wrap**: The width in pixels to wrap text, or 0 for no wrap.

**font**: A font object created by graphics.make_font or graphics.get_font.

**Returns**: A geometry buffer mesh (pos, uv, color, indices) for rendering text.

### rectpack(width, height, sizes) <sub>function</sub>

Perform a rectangle packing using the stbrp library. Return positions for each rect.

**width**: The width of the area to pack into.

**height**: The height of the area to pack into.

**sizes**: An array of [w,h] pairs for the rectangles to pack.

**Returns**: An array of [x,y] coordinates placing each rect, or null if they don't fit.

### make_rtree() <sub>function</sub>

Create a new R-Tree for geometry queries.

**Returns**: An R-Tree object for quickly querying many rectangles or sprite bounds.

### make_texture(data) <sub>function</sub>

Convert raw image bytes into an SDL_Surface object.

**data**: Raw image bytes (PNG, JPG, etc.) as an ArrayBuffer.

**Returns**: An SDL_Surface object representing the decoded image in RAM, for use with GPU or software rendering.

### make_gif(data) <sub>function</sub>

Load a GIF, returning its frames. If it's a single-frame GIF, the result may have .surface only.

**data**: An ArrayBuffer containing GIF data.

**Returns**: An object with frames[], each frame having its own .surface. Some also have a .texture for GPU use.

### make_aseprite(data) <sub>function</sub>

Load an Aseprite/ASE file from an array of bytes, returning frames or animations.

**data**: An ArrayBuffer containing Aseprite (ASE) file data.

**Returns**: An object containing frames or animations, each with .surface. May also have a top-level .surface for the single-layer case.

### cull_sprites(sprites, camera) <sub>function</sub>

Filter an array of sprites to only those visible in the provided camera's view.

**sprites**: An array of sprite objects (each has rect or transform).

**camera**: A camera or bounding rectangle defining the view area.

**Returns**: A new array of sprites that are visible in the camera's view.

### rects_to_sprites(rects, image) <sub>function</sub>

Convert an array of rect coords into sprite objects referencing a single image.

**rects**: An array of rect coords or objects.

**image**: An image object (with .texture).

**Returns**: An array of sprite objects referencing the 'image' and each rect for UV or position.

### make_surface(dimensions) <sub>function</sub>

Create a blank surface in RAM.

**dimensions**: The size object {width, height}, or an array [w,h].

**Returns**: A blank RGBA surface with the given dimensions, typically for software rendering or icons.

### make_cursor(opts) <sub>function</sub>

**opts**: An object with {surface, hotx, hoty} or similar.

**Returns**: An SDL_Cursor object referencing the given surface for a custom mouse cursor.

### make_font(data, size) <sub>function</sub>

Load a font from TTF/OTF data at the given size.

**data**: TTF/OTF file data as an ArrayBuffer.

**size**: Pixel size for rendering glyphs.

**Returns**: A font object with surface, texture, and glyph data, for text rendering with make_text_buffer.

### make_sprite() <sub>function</sub>

Create a new sprite object, storing default properties.

**Returns**: A new sprite object, which typically has .rect, .color, .layer, .image, etc.

### make_line_prim(points, thickness, startCap, endCap, color) <sub>function</sub>

Build a GPU mesh representing a thick polyline from an array of points, using parsl or a similar library under the hood.

**points**: An array of [x,y] points forming the line.

**thickness**: The thickness (width) of the polyline.

**startCap**: (Unused) Possibly the type of cap for the start.

**endCap**: (Unused) Possibly the type of cap for the end.

**color**: A color to apply to the line.

**Returns**: A geometry mesh object suitable for rendering the line via a pipeline command.

### is_image(obj) <sub>function</sub>

**obj**: An object to check.

**Returns**: True if 'obj' has a .texture and a .rect property, indicating it's an image object.

### texture(path) <sub>function</sub>

Load or retrieve a cached image, converting it into a GPU texture. If 'path' is already an object, it's returned directly.

**path**: A string path to an image file or an already-loaded image object.

**Returns**: An image object with {surface, texture, frames?, etc.} depending on the format.

### tex_hotreload(file) <sub>function</sub>

Reload the image for the given file, updating the cached copy in memory and on the GPU.

**file**: The file path that was changed on disk.

**Returns**: None

### get_font(path, size) <sub>function</sub>

Load a font from file if not cached, or retrieve it from the cache if already loaded.

**path**: A string path to a font file, optionally with ".size" appended.

**size**: Pixel size of the font, if not included in 'path'.

**Returns**: A font object with .surface and .texture for rendering text.

### queue_sprite_mesh(queue) <sub>function</sub>

Builds a single geometry mesh for all sprite-type commands in the queue, storing first_index/num_indices so they can be rendered in one draw call.

**queue**: An array of draw commands, some of which are {type:'sprite'} objects.

**Returns**: An array of references to GPU buffers [pos,uv,color,indices].
File diff suppressed because it is too large
@@ -1,68 +0,0 @@

# input

### mouse_show(show) <sub>function</sub>

Show or hide the mouse cursor.

**show**: Boolean. True to show, false to hide.

**Returns**: None

### mouse_lock(lock) <sub>function</sub>

Capture or release the mouse, confining it within the window if locked.

**lock**: Boolean. True to lock, false to unlock.

**Returns**: None

### cursor_set(cursor) <sub>function</sub>

Set the given cursor (created by os.make_cursor) as the active mouse cursor.

**cursor**: The cursor to set.

**Returns**: None

### keyname(keycode) <sub>function</sub>

Given a numeric keycode, return the corresponding key name (e.g., from SDL).

**keycode**: A numeric SDL keycode.

**Returns**: A string with the key name.

### keymod() <sub>function</sub>

Return an object describing the current modifier keys, e.g. {shift:true, ctrl:true}.

**Returns**: An object with boolean fields for each modifier key.

### mousestate() <sub>function</sub>

Return an object describing the current mouse state, including x,y coordinates and booleans for pressed buttons (left, middle, right, x1, x2).

**Returns**: Object { x, y, left, middle, right, x1, x2 }
@@ -1,243 +0,0 @@

# io

### rm(path) <sub>function</sub>

Remove the file or empty directory at the given path.

**path**: The file or directory to remove. Must be empty if a directory.

**Returns**: None

### mkdir(path) <sub>function</sub>

Create a directory at the given path.

**path**: The directory path to create.

**Returns**: None

### stat(path) <sub>function</sub>

Return an object describing file metadata for the given path. The object includes filesize, modtime, createtime, and accesstime. Throws an error if the path does not exist.

**path**: The file or directory to retrieve metadata for.

**Returns**: An object with metadata (filesize, modtime, createtime, accesstime).

### globfs(patterns) <sub>function</sub>

Return an array of files that do not match any of the provided glob patterns. It recursively enumerates the filesystem within PHYSFS. Each pattern is treated as an "ignore" rule, similar to .gitignore usage.

**patterns**: An array of glob patterns to ignore. Any file matching one of these is skipped.

**Returns**: An array of file paths that did not match any ignore pattern.

### match(pattern, string) <sub>function</sub>

Return a boolean indicating whether the given wildcard pattern matches the provided string. Dots must match dots. Case is not ignored.

Patterns can incorporate:
- '?' - Matches exactly one character (except leading dots or slashes).
- '*' - Matches zero or more characters (excluding path separators).
- '**' - Matches zero or more characters, including path separators.
- '[abc]' - A bracket expression; matches any single character from the set. Ranges like [a-z], [0-9] also work.
- '[[:alpha:]]' - POSIX character classes can be used inside brackets.
- '\' - Backslash escapes the next character.
- '!' - If placed immediately inside brackets (like [!abc]), it negates the set.

**pattern**: The wildcard pattern to compare.

**string**: The string to test against the wildcard pattern.

**Returns**: True if matched, otherwise false.
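
A simplified sketch of the '?', '*', and '**' semantics above, by translation to a RegExp. The real matcher also handles bracket sets, POSIX classes, escapes, and negation, all of which this toy version omits:

```js
// Toy wildcard matcher covering only '?', '*', and '**'.
function match(pattern, string) {
  const re = pattern
    .replace(/[.+^${}()|[\]\\]/g, "\\$&") // escape regex metacharacters
    .replace(/\*\*/g, "\u0000")           // protect '**' with a placeholder
    .replace(/\*/g, "[^/]*")              // '*' stops at path separators
    .replace(/\u0000/g, ".*")             // '**' crosses path separators
    .replace(/\?/g, "[^/]");              // '?' matches exactly one character
  return new RegExp("^" + re + "$").test(string);
}
```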

### exists(path) <sub>function</sub>

Return a boolean indicating whether the file or directory at the given path exists.

**path**: The file or directory path to check.

**Returns**: True if the path exists, otherwise false.

### mount(archiveOrDir, mountPoint) <sub>function</sub>

Mount a directory or archive at the specified mount point. An undefined mount point mounts to '/'. Throws on error.

**archiveOrDir**: The directory or archive to mount.

**mountPoint**: The path at which to mount. If omitted or undefined, '/' is used.

**Returns**: None

### unmount(path) <sub>function</sub>

Unmount a previously mounted directory or archive. Throws on error.

**path**: The directory or archive mount point to unmount.

**Returns**: None

### slurp(path) <sub>function</sub>

Read the entire file at the given path as a string. Throws on error.

**path**: The file path to read from.

**Returns**: A string with the file's contents.

### slurpbytes(path) <sub>function</sub>

Read the entire file at the given path as a raw ArrayBuffer. Throws on error.

**path**: The file path to read from.

**Returns**: An ArrayBuffer containing the file's raw bytes.

### slurpwrite(data, path) <sub>function</sub>

Write data (string or ArrayBuffer) to the given file path, overwriting it if it exists. Throws on error.

**data**: The data to write (string or ArrayBuffer).

**path**: The file path to write to.

**Returns**: None

### writepath(path) <sub>function</sub>

Set the write directory. Subsequent writes will go here by default. Throws on error.

**path**: The directory path to set as writable.

**Returns**: None

### basedir() <sub>function</sub>

Return the application's base directory (where the executable is located).

**Returns**: A string with the base directory path.

### prefdir(org, app) <sub>function</sub>

Get the user-and-app-specific path where files can be written.

**org**: The name of your organization.

**app**: The name of your application.

**Returns**: A string with the user's directory path.

### realdir(path) <sub>function</sub>

Return the actual, real directory (on the host filesystem) that contains the given file path. Returns undefined if not found.

**path**: The file path whose real directory is requested.

**Returns**: A string with the real directory path, or undefined.

### open(path) <sub>function</sub>

Open a file for writing, returning a file object that can be used for further operations. Throws on error.

**path**: The file path to open for writing.

**Returns**: A file object for subsequent write operations.

### searchpath() <sub>function</sub>

Return an array of all directories in the current search path.

**Returns**: An array of directory paths in the search path.

### enumerate(path, recurse) <sub>function</sub>

Return an array of files within the given directory, optionally recursing into subdirectories.

**path**: The directory to list.

**recurse**: Whether to recursively include subdirectories (true or false).

**Returns**: An array of file (and directory) paths found.

### mount_core() <sub>function</sub>

### is_directory() <sub>function</sub>
@@ -1,175 +0,0 @@
|
||||
# js
|
||||
|
||||
|
||||
Provides functions for introspecting and configuring the QuickJS runtime engine.
|
||||
Includes debug info, memory usage, GC controls, code evaluation, etc.
|
||||
|
||||
|
||||
### cycle_hook(callback) <sub>function</sub>

Register or remove a hook function that QuickJS calls once per execution cycle. If the callback is set, it receives a single argument (an optional object/value describing the cycle). If the callback is undefined, the hook is removed.

**callback**: A function to call each time QuickJS completes a "cycle" (internal VM loop), or undefined to remove the callback.

**Returns**: None

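The register-or-remove contract described above can be illustrated with a plain JavaScript sketch (an illustrative pattern only, not QuickJS internals):

```javascript
// Sketch of cycle_hook's contract: passing a function installs the hook,
// passing undefined removes it.
let cycleHook;

function cycle_hook(callback) {
  if (callback !== undefined && typeof callback !== "function")
    throw new TypeError("cycle_hook expects a function or undefined");
  cycleHook = callback; // undefined clears the hook
}

// The engine would then invoke the hook once per cycle, roughly like:
function runCycle(info) {
  if (cycleHook) cycleHook(info);
}
```

After `cycle_hook(undefined)`, subsequent cycles run with no callback invoked.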
### dump_shapes() <sub>function</sub>

Use this for internal debugging of object shapes.

**Returns**: A debug string describing the internal shape hierarchy used by QuickJS.

### dump_atoms() <sub>function</sub>

Helpful for diagnosing memory usage or potential key collisions.

**Returns**: A debug string listing all currently registered atoms (internal property keys/symbols) known by QuickJS.

### dump_class() <sub>function</sub>

Shows how many objects of each class exist; useful for advanced memory or performance profiling.

**Returns**: A debug string describing the distribution of JS object classes in the QuickJS runtime.

### dump_objects() <sub>function</sub>

**Returns**: A debug string listing certain internal QuickJS objects and their references, useful for debugging memory leaks or object lifetimes.

### dump_type_overheads() <sub>function</sub>

Displays a memory usage breakdown for different internal object types.

**Returns**: A debug string describing the overheads for various JS object types in QuickJS.

### stack_info() <sub>function</sub>

Internal debugging utility to examine call stack details.

**Returns**: An object or string describing the runtime's current stack usage and capacity.

### calc_mem(value) <sub>function</sub>

Compute the approximate size of a single JS value in memory. This is a best-effort estimate.

**value**: A JavaScript value to analyze.

**Returns**: Approximate memory usage (in bytes) of that single value.

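The idea behind such a best-effort estimate can be shown with a toy estimator in plain JavaScript (purely illustrative; the engine's real accounting differs considerably):

```javascript
// Toy best-effort size estimator, illustrating the idea behind calc_mem().
// Uses rough per-type byte costs and a seen-set to stay safe on cycles.
function roughSize(value, seen = new Set()) {
  switch (typeof value) {
    case "number":  return 8;               // doubles
    case "boolean": return 4;
    case "string":  return 2 * value.length; // UTF-16 code units
    case "object": {
      if (value === null || seen.has(value)) return 0;
      seen.add(value);
      let bytes = 0;
      for (const key of Object.keys(value))
        bytes += 2 * key.length + roughSize(value[key], seen);
      return bytes;
    }
    default: return 0; // functions, symbols, undefined: not counted here
  }
}
```

The seen-set guard matters: without it, a self-referencing object would recurse forever.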
### mem() <sub>function</sub>

Retrieve an overview of the runtime's memory usage.

**Returns**: An object containing a comprehensive snapshot of memory usage for the current QuickJS runtime, including total allocated bytes, object counts, and more.

### mem_limit(bytes) <sub>function</sub>

Set the upper memory limit for the QuickJS runtime. Exceeding this limit may cause operations to fail or throw errors.

**bytes**: The maximum memory (in bytes) QuickJS is allowed to use.

**Returns**: None

### gc_threshold(bytes) <sub>function</sub>

Set the threshold (in bytes) at which QuickJS performs an automatic GC pass once memory usage surpasses it.

**bytes**: The threshold (in bytes) at which the engine triggers automatic garbage collection.

**Returns**: None

### max_stacksize(bytes) <sub>function</sub>

Set the maximum stack size for QuickJS. If exceeded, the runtime may throw a stack overflow error.

**bytes**: The maximum allowed stack size (in bytes) for QuickJS.

**Returns**: None

### memstate() <sub>function</sub>

Gives a quick overview of memory usage, including malloc size and other allocations.

**Returns**: A simpler memory usage object (malloc sizes, etc.) for the QuickJS runtime.

### gc() <sub>function</sub>

Force an immediate, full garbage collection pass, reclaiming unreachable memory.

**Returns**: None

### eval(src, filename) <sub>function</sub>

Execute a string of JavaScript code in the current QuickJS context.

**src**: A string of JavaScript source code to evaluate.

**filename**: (Optional) A string for the filename or label, used in debugging or stack traces.

**Returns**: The result of evaluating the given source code.
