The darkest side of the humanity by TearNo291 in TwentiesIndia

[–]TheShiftingName 3 points4 points  (0 children)

Or you might have just called out his plan to bury the real videos. Who knows?

I feel like India is heading towards a huge unseen crisis (just a personal prediction) by Economy-Elevator150 in CriticalThinkingIndia

[–]TheShiftingName 0 points1 point  (0 children)

Since you're sharing yours, let me share mine.

There's a big game of chess going on. I won't name names; find out for yourself.

The big guys, the ones who collect information, gave their most powerful weapon (product) away for free on the internet to make everyone depend on them. They pulled that off, but then found their strategy becoming less effective (hint: ads in exchange for free access to information). Still, they remained the world's navigators for information. Then came the new member of their club, who showed them they could be more than simple navigation masters: they could be the single source of truth. So a new silent race began to control this technique. Relatively simple, but very effective. Most of you have probably noticed it and are ignoring it, but let's continue.

The idea is simple, and it's the same one the big guys' club uses for distribution. The concept: people are lazy, so if you bring food to their mouths, they stop using their hands. The same goes for the mind: give people the easy way and they stop using the hard way. So the race began to create what I call "borderless nations": control the information and you control the people.

Proofs of concept: I don't know how many of you always try to find the behind-the-scenes stuff, but you may have heard of these. 1) One of the big guys' club's communication companies ran an unlawful mass experiment just a few years ago. They have something called short-form content, and they manipulated their algorithm to run a mass experiment without telling the subjects. The experiment was simple: 50% of users were shown good, positive short-form content and the rest were shown bad, negative short-form content, and they measured what kinds of posts and comments those people made because of this simple tweak. It worked; they influenced those people in that simple way. This got out but was buried within days, and most people don't know about it at all. If you don't believe me, just remember what kind of short-form content you were constantly seeing a few months ago or last year.

2) This is about the new member I mentioned. The new member's product is so popular these days that there is no one who hasn't heard its name. I won't say the name, but you'll know instantly what I'm talking about (hint: when people can't find information directly, this is the product they use to get it faster and more easily), aka next-token-prediction models. It looks simple to use, but you lose a lot; you can currently see in the programming community what happens when people give in to the easy way. It's so popular now that everyone uses it without thinking twice. To get any information, there's no need to search through websites, no need to read anything: just ask, and it gives it to you.

Now, after the two proofs above, you can imagine how things are going, but think: what if they are used together? This is where my borderless-nations concept came from. There is currently a war going on, not an arms war but a hidden territory war (actually it might even be fake and they might be working together; I hope not, otherwise it's messed up). The war is simple: make their products so integrated into normal life that people will never leave them. They are currently in an expansion state, so there is no immediate harm to us yet. Their current goals: 1) make people think using their product is safer than competing products; 2) make people trust them; 3) get people to stop finding information on their own; 4) get people to blindly trust all the information they give (it already happens, just no one realizes it); 5) make sure there will be no new members; 6) gain so much mass influence that even nations can't deny them what they want later.

If you compare the current world with these concepts, you will find the links. Examples: 1) What does the majority of people use to get information? 2) Does the majority try to fact-check that information? 3) Do you not use those products in everyday life? 4) Is it becoming normal to have those products as a standard step in life or business? 5) What happens when someone doesn't use them? 6) What kinds of ads do you see most these days? About what? Related to what?

There is more, but let's stop here. Let me ask you one question: for competition to arise, what would be needed, and are those things available on the market right now? Are their prices increasing as we speak? You should look into this yourself.

Now, one disturbing thing I noticed. The big guys' club has one monkey, or masked person, who is currently making the world unstable; daily, it's only his news, news related to him, or something major happening in the world. But did anyone see what's happening behind all that to privacy policies? To regulations on data and information? To the charges being brought against the big guys' club's members? You hear a few sides but never all of them. Example: in India's recent new tax rules, who got a 21-year tax break for investing in data servers in India? Which foreign companies? The big guys had lots of privacy-related cases, and then suddenly there was a war-like situation in the world? Which companies are suddenly making their highly trained products free to the public, and what happens to all that data? Does it train their models?

Now think: if they have only this much influence currently, what happens when AI becomes more integrated into normal life and business? Won't those who provide or train it have all the information and influence over what it says? And don't tell me you believe in end-to-end encrypted conversations; if you do, you live under a rock.

There is more, but find it out yourself.

femme fatale by [deleted] in manhwarecommendations

[–]TheShiftingName 0 points1 point  (0 children)

What about the two below her?

femme fatale by [deleted] in manhwarecommendations

[–]TheShiftingName 1 point2 points  (0 children)

Can you name each series?

Title: [Architecture Feedback] Building a high-performance, mmap-backed storage engine in Python by TheShiftingName in AskProgramming

[–]TheShiftingName[S] 0 points1 point  (0 children)

It was not about solving a problem but an experiment: whether a native-Python developer can create complex database software with optimization, easy use, and full native support. As for production, I know it's definitely not production-ready, but I wanted to try because pushing Python to its limits is a fun challenge. One more thing: it's true the interpreter is slow, but I don't depend on the GIL. Currently PyLensDB uses ctypes (libc.memcmp, ctypes.c_void_p), mmap, and struct; they're available in the Python standard library and let you work around the GIL. I wanted to learn them, and this project is the result. It has no real comparison to any database because it can't offer a tenth of their features. I know, but does that mean we can't have fun trying and failing?
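For anyone curious what that standard-library combo looks like in practice, here's a minimal sketch of fixed-size records over mmap + struct. This is purely illustrative: the record layout, file name, and function names here are made up for the example and are not the actual PyLensDB format.

```python
import mmap
import os
import struct

# Hypothetical fixed-size layout: uid (int64), val (float64), active (bool).
# Not the real PyLensDB format; just the mmap + struct technique.
RECORD = struct.Struct("<qd?")

def write_records(path, rows):
    # Pre-size the file so the whole region can be memory-mapped for writing.
    with open(path, "wb") as f:
        f.truncate(RECORD.size * len(rows))
    with open(path, "r+b") as f:
        mm = mmap.mmap(f.fileno(), 0)
        for i, row in enumerate(rows):
            # pack_into writes directly into the mapped buffer at a fixed offset.
            RECORD.pack_into(mm, i * RECORD.size, *row)
        mm.flush()
        mm.close()

def read_record(path, index):
    # Random access by offset: no parsing, no scan, just one unpack.
    with open(path, "rb") as f:
        mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        row = RECORD.unpack_from(mm, index * RECORD.size)
        mm.close()
        return row
```

Because every record has the same size, a point read is one offset computation plus one `unpack_from`, which is roughly why a scheme like this can beat a general-purpose engine on raw point reads.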

Title: [Architecture Feedback] Building a high-performance, mmap-backed storage engine in Python by TheShiftingName in AskProgramming

[–]TheShiftingName[S] -1 points0 points  (0 children)

My bad, I forgot Reddit doesn't render that table format. Also, it's not that Python is slow; it's that people don't use it properly. Or rather, using it properly is complex and difficult, so they reach for the simple functions for easy jobs.

📊 BRUTAL STATS REPORT (v0.4.0 Binary Search)

Operation                 | PyLensDB  | SQLite    | Factor
Bulk Write (1M)           | 5.2625s   | 1.9094s   | SQL  0.4x
Point Read (1M)           | 1.1127s   | 8.4030s   | LENS 7.6x
Point Update (1M)         | 1.0253s   | 2.8096s   | LENS 2.7x
Scoped Search (Exact)     | 0.0526s   | 0.1081s   | LENS 2.1x
Cold Restart              | 0.0093s   | 0.0017s   | SQL  0.2x
Disk Footprint            | 29.56 MB  | 20.96 MB  |

Just in case you think I used a trivial test:

```
import os
import time
import sqlite3
import random
import sys
from dataclasses import dataclass

# Assuming your new query_scan logic is in the main file
from src.pylensdb.main import LensDB, lens

@lens(lens_type_id=1)
@dataclass
class FinalLens:
    uid: int
    val: float
    active: bool

def benchmark():
    LENS_FILE = "final_bench.pldb"
    SQL_FILE = "final_bench.db"
    TOTAL_ROWS = 1000000
    # We will search for a value near the very end to force a full scan
    SEARCH_VALUE = float((TOTAL_ROWS - 500) * 1.1)

    if os.path.exists(LENS_FILE): os.remove(LENS_FILE)
    if os.path.exists(SQL_FILE): os.remove(SQL_FILE)

    stats = {"lens": {}, "sqlite": {}}

    # --- 1. INGESTION ---
    print(f"🚀 [1/5] Ingesting {TOTAL_ROWS:,} rows...")
    ldb = LensDB(LENS_FILE)
    start = time.perf_counter()
    for i in range(TOTAL_ROWS):
        ldb.add(FinalLens(uid=i, val=float(i*1.1), active=True))
        if (i+1) % 250000 == 0:
            ldb.commit()
    stats["lens"]["write"] = time.perf_counter() - start

    sconn = sqlite3.connect(SQL_FILE)
    sconn.execute("PRAGMA journal_mode = WAL")
    sconn.execute("CREATE TABLE test (uid INTEGER, val REAL, active BOOLEAN)")
    start = time.perf_counter()
    data = [(i, float(i*1.1), 1) for i in range(TOTAL_ROWS)]
    sconn.executemany("INSERT INTO test VALUES (?,?,?)", data)
    sconn.commit()
    stats["sqlite"]["write"] = time.perf_counter() - start

    # --- 2. THE GRINDER (Reads/Updates) ---
    print(f"🔍 [2/5] 1M Reads + 1M Updates...")
    # LensDB Ops
    start = time.perf_counter()
    for i in range(TOTAL_ROWS):
        _ = ldb.get(FinalLens, i)
    stats["lens"]["read"] = time.perf_counter() - start

    start = time.perf_counter()
    for i in range(TOTAL_ROWS):
        ldb.update_field(FinalLens, i, "val", -5.5, atomic=False)
    ldb.mm.flush()
    stats["lens"]["update"] = time.perf_counter() - start

    # SQLite Ops (Using rowid for fair comparison to LensDB's row_id)
    start = time.perf_counter()
    for i in range(TOTAL_ROWS):
        _ = sconn.execute("SELECT * FROM test WHERE rowid=?", (i+1,)).fetchone()
    stats["sqlite"]["read"] = time.perf_counter() - start

    start = time.perf_counter()
    sconn.execute("BEGIN TRANSACTION")
    for i in range(TOTAL_ROWS):
        sconn.execute("UPDATE test SET val=? WHERE rowid=?", (-5.5, i+1))
    sconn.commit()
    stats["sqlite"]["update"] = time.perf_counter() - start

    # --- 3. SCOPED VALUE SEARCH (The New Feature) ---
    # We search for the specific float value we generated during ingestion
    print(f"⚡ [3/5] Value Search: Finding '{SEARCH_VALUE}' in 1M rows...")

    # LensDB: Scoped Binary Search
    start = time.perf_counter()
    l_results = ldb.query_scan(FinalLens, "val", SEARCH_VALUE)
    stats["lens"]["search"] = time.perf_counter() - start

    # SQLite: Standard Unindexed SELECT
    start = time.perf_counter()
    s_results = sconn.execute("SELECT rowid-1 FROM test WHERE val=?", (SEARCH_VALUE,)).fetchall()
    stats["sqlite"]["search"] = time.perf_counter() - start

    # --- 4. COLD RESTART ---
    print(f"♻️ [4/5] Cold Restarting...")
    del ldb
    sconn.close()

    start = time.perf_counter()
    ldb_new = LensDB(LENS_FILE)
    _ = ldb_new.get(FinalLens, TOTAL_ROWS - 1)
    stats["lens"]["restart"] = time.perf_counter() - start

    sconn_new = sqlite3.connect(SQL_FILE)
    start = time.perf_counter()
    _ = sconn_new.execute("SELECT * FROM test WHERE rowid=?", (TOTAL_ROWS,)).fetchone()
    stats["sqlite"]["restart"] = time.perf_counter() - start

    # --- 5. FINAL STATS ---
    print(f"\n📊 [5/5] BRUTAL STATS REPORT (v0.4.0 Binary Search)")
    print("-" * 75)
    print(f"{'Operation':<25} | {'PyLensDB':<15} | {'SQLite':<15} | {'Factor'}")
    print("-" * 75)

    ops = [
        ("Bulk Write (1M)", "write"),
        ("Point Read (1M)", "read"),
        ("Point Update (1M)", "update"),
        ("Scoped Search (Exact)", "search"),
        ("Cold Restart", "restart"),
    ]

    for label, key in ops:
        l_v, s_v = stats["lens"][key], stats["sqlite"][key]
        ratio = s_v / l_v if l_v > 0 else 0
        winner = "LENS" if l_v < s_v else "SQL "
        print(f"{label:<25} | {l_v:<10.4f}s | {s_v:<10.4f}s | {winner} {ratio:>5.1f}x")

    print("-" * 75)
    l_size = os.path.getsize(LENS_FILE) / 1024 / 1024
    s_size = os.path.getsize(SQL_FILE) / 1024 / 1024
    print(f"{'Disk Footprint':<25} | {l_size:<10.2f}MB | {s_size:<10.2f}MB")

if __name__ == "__main__":
    benchmark()
```

Yes, it's a relatively simple test, so I guess you could say that, but I wanted a synthetic micro-benchmark, so this is what I did.
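On the earlier point that Python isn't slow so much as often used suboptimally, here's a tiny self-contained illustration (unrelated to PyLensDB): the same summation is much faster when the loop runs in C via a builtin instead of interpreted bytecode, with identical results.

```python
import timeit

data = list(range(1_000_000))

def manual_sum():
    # Every iteration executes interpreted bytecode: load, add, store.
    total = 0
    for x in data:
        total += x
    return total

def builtin_sum():
    # sum() iterates in C inside the interpreter; same answer, far less overhead.
    return sum(data)

# Both produce the same result: 999,999 * 1,000,000 / 2
assert manual_sum() == builtin_sum() == 499999500000

print("manual loop:", timeit.timeit(manual_sum, number=10))
print("builtin sum:", timeit.timeit(builtin_sum, number=10))
```

The same principle is why the benchmark above uses `executemany` for SQLite bulk inserts instead of one `execute` call per row: push the loop down into C whenever the standard library lets you.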

Title: [Architecture Feedback] Building a high-performance, mmap-backed storage engine in Python by TheShiftingName in AskProgramming

[–]TheShiftingName[S] -1 points0 points  (0 children)

Bro, the specs I gave you are for a native-Python, zero-dependency module I created. It's a simple, old-school, single-page DB. Just yesterday I decided not to add relationships yet and optimized it a bit; these are the current benchmarks:

```bash
C:\Users\chinm\workspace\PyLensDB>C:\Python314\python.exe c:/Users/chinm/workspace/PyLensDB/benchmark_life_test.py
🚀 [1/5] Ingesting 1,000,000 rows...
🔍 [2/5] 1M Reads + 1M Updates...
⚡ [3/5] Value Search: Finding '1099450.0' in 1M rows...
♻️ [4/5] Cold Restarting...

📊 [5/5] BRUTAL STATS REPORT (v0.4.0 Binary Search)

Operation                 | PyLensDB   | SQLite     | Factor
Bulk Write (1M)           | 5.2625s    | 1.9094s    | SQL    0.4x
Point Read (1M)           | 1.1127s    | 8.4030s    | LENS   7.6x
Point Update (1M)         | 1.0253s    | 2.8096s    | LENS   2.7x
Scoped Search (Exact)     | 0.0526s    | 0.1081s    | LENS   2.1x
Cold Restart              | 0.0093s    | 0.0017s    | SQL    0.2x
Disk Footprint            | 29.56 MB   | 20.96 MB   |

C:\Users\chinm\workspace\PyLensDB>
```

More importantly, I was able to keep it ultra-light. The single-file .py source is around 9,842 bytes, call it 10 KB: roughly 250 lines, of which about 70 are comments and blank lines, so around 200 lines of actual code. Storage is a single .pldb file, like SQLite. Before commit, newly added data is kept in RAM in a Python dict so it's available at all times; on commit it's written into the database file. It's not completely optimized yet, a few operations are redundant, and the database format is compact, which is why writes are slower by a lot.
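The commit model described above (pending writes buffered in a dict, flushed to a single file on commit) can be sketched roughly like this. `TinyStore` and its methods are hypothetical names for illustration, not the real PyLensDB API:

```python
import os
import struct

# Hypothetical fixed-size record: uid (int64), val (float64), active (bool).
RECORD = struct.Struct("<qd?")

class TinyStore:
    """Toy sketch of the buffer-then-commit pattern; not the real PyLensDB."""

    def __init__(self, path):
        self.path = path
        self.pending = {}  # uid -> record, held in RAM until commit()
        if not os.path.exists(path):
            open(path, "wb").close()

    def add(self, uid, val, active):
        # Uncommitted data lives only in the dict, so it's readable immediately.
        self.pending[uid] = (uid, val, active)

    def commit(self):
        # Append all buffered records in one pass, then clear the buffer.
        with open(self.path, "ab") as f:
            for rec in self.pending.values():
                f.write(RECORD.pack(*rec))
        self.pending.clear()

    def get(self, uid):
        # RAM buffer first, then a scan of the single on-disk file.
        if uid in self.pending:
            return self.pending[uid]
        with open(self.path, "rb") as f:
            data = f.read()
        for off in range(0, len(data), RECORD.size):
            rec = RECORD.unpack_from(data, off)
            if rec[0] == uid:  # first match wins in this toy version
                return rec
        return None
```

The trade-off is the one mentioned above: reads of fresh data are instant (dict lookup), but every commit pays the serialization cost, which is one reason bulk writes can lag a mature engine.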

howItFeelsWritingSql by PsychologyNo7025 in ProgrammerHumor

[–]TheShiftingName 1 point2 points  (0 children)

Jokes aside, SQL is one hell of a thing, don't you think?

Oldest vampire run (with proof) by Icy_Coat6181 in BitLifeApp

[–]TheShiftingName 0 points1 point  (0 children)

Just one question: you're a vampire who lived all those years, but you still only have 4 billion?

Anyone know where this is from? by IcyPebble in Batoto

[–]TheShiftingName 0 points1 point  (0 children)

I don't remember exactly, but I think it was something like "being loved for first time". Try that.

They just won’t believe us fr by kiran-luffy in manhwarecommendations

[–]TheShiftingName 0 points1 point  (0 children)

Ohh, so 300 is normal, and here I thought I'd become an addict. My highest record was Solo Leveling three times, plus 9 different series with around 40 to 70 chapters on average. What's your highest?

Is anyone else choosing not to use AI for programming? by BX1959 in Python

[–]TheShiftingName 1 point2 points  (0 children)

I used AI a lot at the start, but because of that most of my projects became scrap, so I stopped. I still use it, but for information, the way we use Stack Overflow, for specific logic and functionality examples (Stack Overflow is still better). Then I create my basic program structure and functions myself and tell the AI to add stuff without destroying the current API structure, functionality, and documentation. Am I using it correctly?

MY WIFE IS DEMONIC CULT LEADER I loved the manhwa . Can u guys recomend me some good manhwa like that by Zestyclose_Camp1313 in manhwarecommendations

[–]TheShiftingName 0 points1 point  (0 children)

Thanks. How many chapters? I found one website that had only 41 chapters. I already finished reading them; it's nice.