| Thread Review (Newest First) |
| Posted by admin - 04-12-2026, 12:28 AM |
|
The system itself will be an open world, in this case a house. It will be built with Unreal Engine 5; the characters will be modeled by me and imported into the world as FBX files. Each character will be driven by a local LLM and a dataset I created. The idea is to run the whole thing as one big integrated system, instead of running three or four separate programs on the computer and causing lag.
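As a rough sketch of that "one big system" idea, here is a minimal in-process event router in Python: the game loop, the character state, and the LLM calls can all hang off one dispatcher instead of separate programs. The class and event names here are my own illustrative assumptions, not part of the actual build.

```python
# Hypothetical sketch: one process routes game events to the right
# subsystem (LLM, state, audio, etc.) instead of running separate programs.

class EventBus:
    """Minimal in-process event router (illustrative names only)."""

    def __init__(self):
        self.handlers = {}  # event name -> list of callbacks

    def subscribe(self, event, callback):
        self.handlers.setdefault(event, []).append(callback)

    def publish(self, event, payload):
        # Every subsystem lives in this one process, so dispatch is a
        # plain function call -- no sockets, no extra programs.
        return [cb(payload) for cb in self.handlers.get(event, [])]


bus = EventBus()
bus.subscribe("player_spoke", lambda text: f"NPC heard: {text}")
print(bus.publish("player_spoke", "Hello, Silas"))
```

In a real build the subscriber would be the LLM agent rather than a lambda, but the point is that dispatch stays inside one process.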
|
| Posted by admin - 04-12-2026, 12:20 AM |
|
I have the PC, the VR headset, and an Evolocity supercomputer with 23 blades, each blade holding two hard drives. The Evolocity head node runs at roughly 10 teraflops, and everything is connected over a Myricom 10G fiber-optic network, all tied into one massive system that will be expanded over time.
|
| Posted by admin - 03-29-2026, 02:21 AM |
Here is how my AI system is going to be set up. I have all the parts; I just need to put them together.
|
| Posted by admin - 03-29-2026, 02:15 AM |
The JSON state, to make the character feel more real:
Code:
{
    "name": "Silas",
    "hunger": 20,
    "sanity": 85,
    "last_seen_user": "2026-03-28T21:15:00",
    "current_task": "Resting"
}
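A hedged sketch of how that JSON might be loaded, ticked forward, and saved between sessions. The file name, decay rates, and `tick` helper are my own assumptions for illustration; the posts above don't specify them.

```python
import json

# State shaped like the Silas JSON above.
STATE = {
    "name": "Silas",
    "hunger": 20,
    "sanity": 85,
    "last_seen_user": "2026-03-28T21:15:00",
    "current_task": "Resting",
}


def tick(state, hours=1):
    """Advance the character's needs by simulated hours (rates are assumptions)."""
    state["hunger"] = min(100, state["hunger"] + 5 * hours)
    if state["hunger"] > 80:
        # A starving character slowly loses sanity (illustrative rule).
        state["sanity"] = max(0, state["sanity"] - 2 * hours)
    return state


def save(state, path="silas_state.json"):
    """Persist the state so the character 'remembers' across sessions."""
    with open(path, "w") as f:
        json.dump(state, f, indent=2)


tick(STATE, hours=3)
print(STATE["hunger"])  # 35 after three simulated hours
```

On load, the game would read the file back and hand the dict to whatever drives the character's prompts.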
|
| Posted by admin - 03-29-2026, 02:11 AM |
As you can see:
self.url = model_url # Your Server Blade IP (e.g., hxxp://192.168.1.50:11434/api/generate) *Note: you can't connect to that IP or port because they are on a different system
It is run through a server blade system that I have, so the internal IP and port for the API are set for that blade.
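Since the blade's internal IP and port are specific to that system, one common pattern is to read the endpoint from an environment variable rather than hardcoding it. This is a sketch of my own, not something from the posts; `BLADE_LLM_URL` is a hypothetical variable name.

```python
import os

# Hypothetical: keep the blade's endpoint out of the source code entirely.
# Falls back to a local loopback address if the variable isn't set.
DEFAULT_URL = "http://127.0.0.1:11434/api/generate"
model_url = os.environ.get("BLADE_LLM_URL", DEFAULT_URL)
print(model_url)
```

That way the same script runs on the blade network or on a laptop without edits.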
|
| Posted by admin - 03-29-2026, 02:07 AM |
I am using Python
Code:
import requests

class HistoricalAgent:
    def __init__(self, name, model_url):
        self.name = name
        self.url = model_url  # Your server blade IP (e.g., http://192.168.1.50:11434/api/generate)

        # --- THE "INTERNAL STATE" (this is what Vedal does) ---
        self.personality = "A gruff but curious 1880s blacksmith named Silas."
        self.mood = "Neutral"
        self.short_term_memory = []  # Last few things said

    def construct_prompt(self, user_input):
        # We "bake" the state into every request so the AI knows who it is
        full_prompt = f"""
[SYSTEM INSTRUCTION]
You are {self.name}. {self.personality}
Current Mood: {self.mood}
Recent History: {self.short_term_memory[-3:]}

[USER INPUT]
{user_input}

[RESPONSE]
"""
        return full_prompt

    def think_and_speak(self, user_input):
        payload = {
            "model": "llama3",  # Or your custom 1880s model on the blade
            "prompt": self.construct_prompt(user_input),
            "stream": False
        }
        try:
            # Talking to your server blade via the local network
            response = requests.post(self.url, json=payload, timeout=60)
            response.raise_for_status()
            reply = response.json().get("response", "...")

            # Update internal memory
            self.short_term_memory.append(f"User: {user_input}")
            self.short_term_memory.append(f"Silas: {reply}")
            return reply
        except Exception as e:
            return f"Lost connection to the brain: {e}"

if __name__ == "__main__":
    # Point this to your actual server blade IP
    silas = HistoricalAgent("Silas", "http://192.168.1.50:11434/api/generate")
    print(f"--- {silas.name} is standing by in the forge ---")
    while True:
        text = input("You: ")
        if text.lower() in ['exit', 'quit']:
            break
        print(f"\n{silas.name} is thinking...")
        print(f"{silas.name}: {silas.think_and_speak(text)}")
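One way the JSON state from the later post could feed into this agent is to fold hunger, sanity, and the current task into the baked system instruction. The merge logic below is my own addition, a sketch rather than the actual implementation; it only builds the prompt and makes no network calls.

```python
# Hypothetical sketch: merge the saved JSON state into the prompt the
# same way HistoricalAgent bakes personality and mood into each request.

class StatefulAgent:
    """Folds a Silas-style state dict into every prompt (illustrative)."""

    def __init__(self, state):
        self.state = state  # dict shaped like the JSON state post

    def construct_prompt(self, user_input):
        s = self.state
        return (
            "[SYSTEM INSTRUCTION]\n"
            f"You are {s['name']}. Hunger: {s['hunger']}/100. "
            f"Sanity: {s['sanity']}/100. Current task: {s['current_task']}.\n"
            f"[USER INPUT]\n{user_input}\n[RESPONSE]\n"
        )


state = {"name": "Silas", "hunger": 20, "sanity": 85,
         "current_task": "Resting"}
agent = StatefulAgent(state)
print(agent.construct_prompt("Good evening."))
```

The LLM then "knows" Silas is rested or starving without any extra machinery: the state simply rides along in the prompt, exactly like mood and memory already do above.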
|