pH7Console is a privacy-focused, ML-first terminal that brings the power of local AI and sophisticated machine learning to your command line experience. Built with Tauri for maximum performance and security.
- Neural Pattern Recognition - Advanced ML algorithms learn from your command patterns
- Local LLM Processing - Phi-3 Mini, Llama 3.2, TinyLlama, CodeQwen models
- Natural Language Commands - Convert plain English to shell commands
- Context-Aware Memory - Intelligent session and workflow context tracking
- Smart Error Resolution - AI-powered troubleshooting with learned solutions
- Adaptive Learning Engine - Continuously improves based on your usage patterns
- Privacy-Preserving ML - All learning happens locally with encrypted data storage
- Neural Pattern Learning - Sophisticated ML engine with gradient descent-like optimization
- Command Classification - Automatic categorization (FileManagement, GitOperation, Development, etc.)
- Workflow Recognition - Learns command sequences and suggests next steps
- Context Memory - Remembers successful contexts and adapts suggestions accordingly
- Temporal Pattern Analysis - Understands when you typically use certain commands
- Smart Completions - Context-aware command completions with confidence scoring
- Natural Language Processing - Convert plain English to shell commands
- Error Analysis & Fixes - AI-powered error resolution with learning feedback
- Feature Extraction - Advanced context analysis (directory, file types, time, project structure)
- Neural Patterns - Local neural network-like structures for pattern recognition
- Real-time Learning - Continuous adaptation from every command execution
- Session Workflow Tracking - Builds understanding of your development workflows
- Success Rate Optimization - Learns from command success/failure patterns
- User Analytics - Detailed insights into your command usage and improvement areas
- 100% Local Processing - No data leaves your machine
- No Telemetry - Your commands and data stay private
- Encrypted Learning Data - Local ML training data encrypted and secure
- Local Model Storage - All AI models stored and run locally
- Lightweight Models - Optimized for MacBook Air and similar hardware
- Adaptive Loading - Models load on-demand
- Multi-Platform - Native performance on macOS, Windows, Linux
- Multi-Session Management - Multiple terminal sessions
- Workflow Automation - Smart command templates
- Session Recording - Replay and analyze terminal sessions
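As an illustrative sketch of the command classification listed above (names are hypothetical; the real engine learns categories from usage rather than a fixed keyword table), categorization could look like this:

```rust
// Hypothetical keyword-based command classifier. The actual engine learns
// categories from features; this only illustrates the categorization idea.
#[derive(Debug, PartialEq)]
pub enum CommandCategory {
    FileManagement,
    GitOperation,
    Development,
    Other,
}

pub fn classify(command: &str) -> CommandCategory {
    // Look at the program name (first whitespace-separated token).
    match command.split_whitespace().next().unwrap_or("") {
        "ls" | "cp" | "mv" | "rm" | "find" | "du" => CommandCategory::FileManagement,
        "git" => CommandCategory::GitOperation,
        "cargo" | "npm" | "make" | "docker" => CommandCategory::Development,
        _ => CommandCategory::Other,
    }
}
```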
- `cargo fmt` (in `src-tauri/`) - Format Rust code
- `cargo clippy` (in `src-tauri/`) - Lint Rust code for common mistakes
- `npm run lint` - Check TypeScript/React code style
- `npm run type-check` - Verify TypeScript types
- Launch the app - pH7Console opens with a dark terminal interface
- Natural language commands - Type what you want in plain English:
"show me all large files" → AI suggests: find . -type f -size +100M -exec ls -lh {} \; - Smart completions - Start typing and press Tab for AI suggestions
- Error assistance - When commands fail, AI automatically suggests fixes
- Intelligent Command Prediction: Machine learning engine suggests commands based on your patterns
- Command Explanation: Hover over any command for AI explanation
- Error Recovery: Automatic error analysis with learned solutions
- Context Awareness: AI understands your project structure and working context
- Workflow Learning: Recognizes command sequences and optimizes your workflows
- Adaptive Suggestions: Confidence-scored suggestions that improve over time
- Session Memory: Remembers context within and across terminal sessions
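To make "confidence-scored suggestions" concrete, here is a minimal sketch; the formula is an assumption, not the actual scoring function, combining usage frequency with observed success rate:

```rust
// Hypothetical confidence score: success rate damped by a frequency term
// that saturates as a command is used more often.
pub fn suggestion_confidence(uses: u32, successes: u32) -> f32 {
    if uses == 0 {
        return 0.0; // never seen: no confidence
    }
    let success_rate = successes as f32 / uses as f32;
    let frequency = uses as f32 / (uses as f32 + 5.0); // saturates toward 1.0
    success_rate * frequency
}
```

Under this toy formula, a command that succeeded 20 out of 20 times scores higher than one that succeeded 2 out of 2 times, capturing the "improves over time" behavior.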
- Frontend: React 18 + TypeScript + Tailwind CSS for styling
- Backend: Rust (Tauri 2.0) for native performance
- AI Runtime: Candle ML framework (Rust-native)
- Terminal: Cross-platform PTY with xterm.js
pH7Console runs AI models locally using Candle, a Rust-native ML framework, so your data never leaves your machine:
- Phi-3 Mini (3.8GB) - Best for command understanding
- Llama 3.2 1B (1.2GB) - Fastest responses on MacBook Air
- TinyLlama (1.1GB) - Minimal resource usage
- CodeQwen (1.5GB) - Specialized for code tasks
Made with ❤️ by Pierre-Henry Soria. A super passionate & enthusiastic problem-solver engineer. Also a true cheese 🧀, ristretto ☕️, and dark chocolate lover! 😋
Distributed under the MIT License. See LICENSE for more information.
Unlike heavyweight PyTorch-style training, pH7Console uses lightweight on-device adaptation:
// Example of how the system learns (in Rust)
pub struct UserPatternLearner {
command_embeddings: LocalEmbeddingStore,
context_analyzer: ContextAnalyzer,
workflow_detector: WorkflowDetector,
}
impl UserPatternLearner {
// Learns from command sequences
pub fn analyze_command_sequence(&mut self, commands: &[Command]) {
let patterns = self.workflow_detector.find_patterns(commands);
self.store_workflow_template(patterns);
}
// Adapts suggestions based on context
pub fn get_contextual_suggestions(&self, context: &Context) -> Vec<Suggestion> {
let similar_contexts = self.find_similar_contexts(context);
self.generate_suggestions(similar_contexts)
}
}

Learning Mechanisms:
1. Neural Pattern Recognition
- Advanced feature extraction from commands and context
- Gradient descent-like optimization for pattern weights
- Multi-dimensional similarity analysis for command matching
- Confidence scoring based on usage frequency and success rates
2. Context Understanding & Memory
- Project type detection (React, Rust, Python, etc.)
- Working directory and file structure analysis
- Session-level context tracking and memory
- Environmental context association (git repos, node projects, etc.)
3. Workflow & Temporal Learning
- Command sequence pattern recognition
- Session workflow analysis (e.g., git → test → deploy)
- Temporal usage pattern learning (time-of-day preferences)
- Workflow template creation for repeated tasks
4. Error Pattern Learning & Recovery
- Contextual error analysis and classification
- Success/failure pattern recognition
- Learned solution suggestions based on past resolutions
- Preventive error detection and warnings
5. Adaptive Personalization Engine
- Command style preference learning
- Suggestion confidence adaptation
- Custom workflow optimization
- User feedback integration for continuous improvement
Privacy-Preserving Machine Learning:
// All learning happens locally with sophisticated ML techniques
pub struct LearningEngine {
// Neural pattern learning
patterns: HashMap<String, NeuralPattern>,
// Advanced context tracking
session_workflows: HashMap<String, Vec<String>>,
temporal_patterns: HashMap<String, Vec<DateTime<Utc>>>,
context_memory: HashMap<String, f32>,
// ML optimization
learning_rate: f32,
neural_weights: Vec<f32>,
}
impl LearningEngine {
// Feature extraction with multiple dimensions
fn extract_input_features(&self, input: &str, context: &str) -> Vec<f32> {
// Command length, word count, type classification
// Context features: directory, file types, git status, etc.
// Temporal features: time of day, day of week
// Environmental features: project type, tool presence
}
// Neural pattern learning with gradient descent
fn update_patterns(&mut self, example: &LearningExample) {
// Update weights based on success/failure
// Confidence scoring with usage frequency
// Similarity calculation with cosine distance
}
}

Machine Learning Analytics:
The system provides detailed insights into your terminal usage patterns:
- Success Rate Analysis - Track command success rates and improvement over time
- Most Used Commands - Identify your most frequent commands and optimize them
- Learning Progress - See how many patterns the AI has learned from your usage
- Workflow Efficiency - Analyze command sequences for optimization opportunities
- Context Performance - Understand which contexts yield the highest success rates
- Temporal Patterns - Discover your productivity patterns throughout the day
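A per-command success-rate tally, the simplest of these analytics, might be computed like this (types simplified; the real engine tracks richer context per execution):

```rust
use std::collections::HashMap;

// Tally success rates per command from (command, succeeded) history entries.
pub fn success_rates(history: &[(&str, bool)]) -> HashMap<String, f32> {
    let mut counts: HashMap<String, (u32, u32)> = HashMap::new();
    for (cmd, ok) in history {
        let entry = counts.entry((*cmd).to_string()).or_insert((0, 0));
        entry.0 += 1; // total runs
        if *ok {
            entry.1 += 1; // successful runs
        }
    }
    counts
        .into_iter()
        .map(|(cmd, (total, ok))| (cmd, ok as f32 / total as f32))
        .collect()
}
```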
Real-time Learning Feedback:
Every command execution contributes to the learning engine:
// Automatic learning from every command
pub async fn execute_command(command: &str) -> CommandResult {
let result = shell_execute(command).await;
// Learn from this execution
learning_engine.learn_from_interaction(
command,
&result.output,
&current_context,
result.success,
result.execution_time,
).await;
// Track workflow patterns
learning_engine.track_session_workflow(session_id, command);
result
}
// No data ever leaves your machine
impl PrivacyPreservingLearner {
pub fn learn_from_session(&mut self, session: &TerminalSession) {
// Extract patterns without storing raw commands
let patterns = self.extract_abstract_patterns(session);
self.update_local_knowledge(patterns);
// Raw commands are never stored or transmitted
}
}

| Model | Size | RAM Usage | Inference Speed | Best For |
|---|---|---|---|---|
| Phi-3 Mini | 3.8GB | 4-6GB | 200-500ms | Complex reasoning, code generation |
| Llama 3.2 1B | 1.2GB | 2-3GB | 100-200ms | General commands, explanations |
| TinyLlama | 1.1GB | 1.5-2GB | 50-100ms | Real-time completions, quick suggestions |
| CodeQwen | 1.5GB | 2-4GB | 150-300ms | Programming tasks, code analysis |
Adaptive Loading:
# Models load based on task complexity
Simple completion → TinyLlama (fast)
Command explanation → Llama 3.2 (balanced)
Code generation → Phi-3 Mini (capable)
Error analysis → CodeQwen (specialized)

Memory Optimization:
- Quantization: 4-bit and 8-bit models reduce memory by 70%
- Lazy Loading: Models load only when needed
- Memory Mapping: Efficient model storage and access
- Batch Processing: Multiple requests processed together
Battery Optimization:
- Power-aware inference: Adjusts model complexity based on battery level
- Thermal throttling: Reduces inference rate if system gets hot
- Sleep mode: Models unload during inactivity
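Power-aware inference could be as simple as the following sketch; the thresholds and model identifiers are assumptions based on the model lineup above:

```rust
// Hypothetical power-aware model selection: low battery forces the
// smallest model regardless of task complexity.
pub fn pick_model(battery_percent: u8, complex_task: bool) -> &'static str {
    match (battery_percent, complex_task) {
        (0..=20, _) => "tinyllama",  // minimal resource usage on low battery
        (_, true) => "phi3-mini",    // most capable for complex reasoning
        (_, false) => "llama32-1b",  // balanced default
    }
}
```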
Development Build (faster, includes debugging symbols):
npm run tauri build -- --debug

Production Build (optimized, smaller size):

npm run tauri build

Universal Binary (Intel + Apple Silicon):

npm run tauri build -- --target universal-apple-darwin

Platform-specific builds:

# Intel Macs only
npm run tauri build -- --target x86_64-apple-darwin
# Apple Silicon only (M1/M2/M3)
npm run tauri build -- --target aarch64-apple-darwin

Build outputs (located in src-tauri/target/release/bundle/):

- `macos/pH7Console.app` - Application bundle
- `dmg/pH7Console_1.0.0_universal.dmg` - Installer
- `../release/ph7console` - Raw executable
Prerequisites:
# Install Windows target (from macOS/Linux)
rustup target add x86_64-pc-windows-msvc
# Install Windows-specific dependencies
npm install --save-dev @tauri-apps/cli@latest

Cross-compilation from macOS:

# Build for Windows from macOS
npm run tauri build -- --target x86_64-pc-windows-msvc

Native Windows build:

# On Windows machine
npm run tauri build

Build outputs (located in src-tauri/target/release/bundle/):

- `msi/pH7Console_1.0.0_x64_en-US.msi` - Windows installer
- `nsis/pH7Console_1.0.0_x64-setup.exe` - NSIS installer
- `../release/pH7Console.exe` - Raw executable
Ubuntu/Debian (.deb):
npm run tauri build -- --target x86_64-unknown-linux-gnu

AppImage (universal Linux):

npm run tauri build -- --bundles appimage

RPM (Red Hat/Fedora):

npm run tauri build -- --bundles rpm

1. Get Apple Developer Certificate:
- Join Apple Developer Program ($99/year)
- Create Developer ID Application certificate in Xcode
2. Configure signing in tauri.conf.json:
{
"tauri": {
"bundle": {
"macOS": {
"signingIdentity": "Developer ID Application: Your Name (TEAM_ID)",
"hardenedRuntime": true,
"entitlements": "entitlements.plist"
}
}
}
}

3. Create entitlements.plist:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.cs.allow-jit</key>
<true/>
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
</dict>
</plist>

4. Build with signing:
# Set environment variables
export APPLE_SIGNING_IDENTITY="Developer ID Application: Your Name (TEAM_ID)"
export APPLE_ID="[email protected]"
export APPLE_PASSWORD="app-specific-password"
# Build and sign
npm run tauri build

5. Notarize for macOS Gatekeeper:
# Notarize the .dmg
xcrun notarytool submit "src-tauri/target/release/bundle/dmg/pH7Console_1.0.0_universal.dmg" \
--apple-id "[email protected]" \
--password "app-specific-password" \
--team-id "YOUR_TEAM_ID" \
--wait
# Staple the ticket
xcrun stapler staple "src-tauri/target/release/bundle/dmg/pH7Console_1.0.0_universal.dmg"

1. Get Code Signing Certificate:
- Purchase from CA (DigiCert, Sectigo, etc.)
- Or use self-signed for testing
2. Configure in tauri.conf.json:
{
"tauri": {
"bundle": {
"windows": {
"certificateThumbprint": "YOUR_CERT_THUMBPRINT",
"digestAlgorithm": "sha256",
"timestampUrl": "http://timestamp.digicert.com"
}
}
}
}

3. Build with signing:
npm run tauri build

1. Prepare for App Store:
Update tauri.conf.json for App Store:
{
"tauri": {
"bundle": {
"macOS": {
"signingIdentity": "3rd Party Mac Developer Application: Your Name (TEAM_ID)",
"providerShortName": "YOUR_PROVIDER_NAME",
"entitlements": "entitlements.mas.plist",
"exceptionDomain": "localhost"
}
}
}
}

2. Create App Store entitlements (entitlements.mas.plist):
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.network.client</key>
<true/>
<key>com.apple.security.files.user-selected.read-write</key>
<true/>
</dict>
</plist>

3. Build for App Store:

npm run tauri build -- --target universal-apple-darwin

4. Create installer package:
productbuild --component "src-tauri/target/release/bundle/macos/pH7Console.app" \
/Applications \
--sign "3rd Party Mac Developer Installer: Your Name (TEAM_ID)" \
pH7Console-mas.pkg

5. Submit to App Store:
xcrun altool --upload-app \
--type osx \
--file "pH7Console-mas.pkg" \
--username "[email protected]" \
--password "app-specific-password"

1. Prepare Windows package:
Install Windows App SDK:
# Install MSIX packaging tools
winget install Microsoft.WindowsSDK

2. Create appxmanifest.xml:
<?xml version="1.0" encoding="utf-8"?>
<Package xmlns="http://schemas.microsoft.com/appx/manifest/foundation/windows10"
         xmlns:uap="http://schemas.microsoft.com/appx/manifest/uap/windows10">
<Identity Name="YourCompany.pH7Console"
Publisher="CN=Your Name"
Version="1.0.0.0" />
<Properties>
<DisplayName>pH7Console</DisplayName>
<PublisherDisplayName>Your Name</PublisherDisplayName>
<Description>AI-Powered Terminal that Respects Your Privacy</Description>
</Properties>
<Dependencies>
<TargetDeviceFamily Name="Windows.Desktop"
MinVersion="10.0.17763.0"
MaxVersionTested="10.0.22000.0" />
</Dependencies>
<Applications>
<Application Id="pH7Console" Executable="pH7Console.exe" EntryPoint="Windows.FullTrustApplication">
<uap:VisualElements DisplayName="pH7Console"
Description="AI-Powered Terminal"
BackgroundColor="#1a1a1a"
Square150x150Logo="assets/logo150.png"
Square44x44Logo="assets/logo44.png" />
</Application>
</Applications>
</Package>

3. Package for Microsoft Store:
# Create MSIX package
makeappx pack /d "path/to/app/folder" /p "pH7Console.msix"
# Sign the package
signtool sign /fd SHA256 /a "pH7Console.msix"

4. Submit to Microsoft Store:
- Upload via Partner Center
- Complete store listing
- Submit for certification
GitHub Actions (.github/workflows/release.yml):
name: Release
on:
  push:
    tags: ['v*']
jobs:
  release:
    permissions:
      contents: write
    strategy:
      fail-fast: false
      matrix:
        platform: [macos-latest, ubuntu-20.04, windows-latest]
    runs-on: ${{ matrix.platform }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Install dependencies (ubuntu only)
        if: matrix.platform == 'ubuntu-20.04'
        run: |
          sudo apt-get update
          sudo apt-get install -y libgtk-3-dev libwebkit2gtk-4.0-dev librsvg2-dev
      - name: Rust setup
        uses: dtolnay/rust-toolchain@stable
      - name: Node.js setup
        uses: actions/setup-node@v4
        with:
          node-version: 18
      - name: Install dependencies
        run: npm ci
      - name: Build the app
        uses: tauri-apps/tauri-action@v0
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          APPLE_CERTIFICATE: ${{ secrets.APPLE_CERTIFICATE }}
          APPLE_CERTIFICATE_PASSWORD: ${{ secrets.APPLE_CERTIFICATE_PASSWORD }}
          APPLE_SIGNING_IDENTITY: ${{ secrets.APPLE_SIGNING_IDENTITY }}
          APPLE_ID: ${{ secrets.APPLE_ID }}
          APPLE_PASSWORD: ${{ secrets.APPLE_PASSWORD }}
        with:
          tagName: ${{ github.ref_name }}
          releaseName: 'pH7Console v__VERSION__'
          releaseBody: 'See the assets to download and install this version.'
          releaseDraft: true
          prerelease: false

Before Release:
- Update version in `tauri.conf.json` and `package.json`
- Test on all target platforms
- Verify code signing works
- Test installation/uninstallation
- Update release notes
macOS:
- Build universal binary
- Code sign with Developer ID
- Notarize with Apple
- Test Gatekeeper compatibility
- Create DMG installer
Windows:
- Build x64 executable
- Code sign with valid certificate
- Create MSI/NSIS installer
- Test Windows Defender compatibility
Linux:
- Build .deb package
- Create AppImage
- Test on Ubuntu/Fedora
- Verify desktop integration
App Stores:
- Prepare store assets (icons, screenshots)
- Write store descriptions
- Set pricing/availability
- Submit for review
- Monitor review status
This comprehensive build and distribution setup ensures pH7Console can reach users across all major platforms! 🚀
1. Clone the repository

git clone https://github.com/EfficientTools/pH7Console.git
cd pH7Console

2. Install dependencies - Run the automated setup script

chmod +x setup.sh
./setup.sh

3. Start development - Launch the terminal with AI capabilities

npm run tauri dev
- `npm run tauri dev` - Start development server with hot reload
- `npm run tauri:build` - Build for production (creates native binaries)
- `npm run tauri:build -- --target universal-apple-darwin` - Build universal macOS binary
- `npm run tauri:build -- --target x86_64-pc-windows-msvc` - Build for Windows
- `npm run test` - Run all tests (Rust backend and TypeScript frontend)
- `npm run clean` - Clean all build artifacts and reinstall dependencies
System Requirements:
- RAM: 4GB minimum, 8GB recommended (for AI models)
- Storage: 5GB free space (for models and dependencies)
- OS: macOS 10.15+, Windows 10+, or Linux (Ubuntu 18.04+)
Required Tools:
- Rust (1.70+) - Install Rust
- Node.js (v18+) - Install Node.js
- Git - Install Git
Development Mode (with hot reload):
npm run tauri:dev
# or
npx tauri dev

Production Build:
npm run tauri:build
./src-tauri/target/release/pH7Console # Linux/macOS
# or pH7Console.exe on Windows

- Launch the app - pH7Console opens with a dark terminal interface
- Create sessions - Click "+" to create new terminal sessions
- Natural language commands - Type what you want in plain English:
"show me all large files" → AI suggests: find . -type f -size +100M -exec ls -lh {} \; - Smart completions - Start typing and press Tab for AI suggestions
- Error assistance - When commands fail, AI automatically suggests fixes
Multi-Session Management:
- `Cmd/Ctrl + T` - New terminal session
- `Cmd/Ctrl + W` - Close current session
- `Cmd/Ctrl + 1-9` - Switch between sessions
AI Commands:
- Type naturally: "check git status and stage all changes"
- Use `/explain <command>` to understand any command
- Use `/fix` after an error for AI troubleshooting
- Use `/optimize` to get more efficient alternatives
Workflow Learning:
- AI learns your patterns and suggests personalized workflows
- Frequently used command sequences become smart templates
- Context-aware suggestions based on your project type
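One way workflow learning can surface a template is by counting adjacent command pairs in a session. A sketch, simplified to bigrams, which the real detector would generalize beyond:

```rust
use std::collections::HashMap;

// Find the most frequent adjacent command pair in a session; a repeated
// pair is a candidate workflow template (e.g. `git add` → `git commit`).
pub fn most_common_pair(commands: &[&str]) -> Option<(String, String)> {
    let mut counts: HashMap<(String, String), u32> = HashMap::new();
    for pair in commands.windows(2) {
        *counts
            .entry((pair[0].to_string(), pair[1].to_string()))
            .or_insert(0) += 1;
    }
    counts.into_iter().max_by_key(|&(_, c)| c).map(|(p, _)| p)
}
```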
Prerequisites for App Store submission:
- Apple Developer Program membership ($99/year)
- App Store Connect account setup
- Provisioning Profile for Mac App Store distribution
- Code Signing Certificate (3rd Party Mac Developer Application)
Setup Steps:
1. Create App ID in Developer Portal:
   - Go to Apple Developer Portal
   - Create App ID with Bundle ID: `com.efficienttools.ph7console`

2. Create Provisioning Profile:
   - Create "Mac App Store Connect" provisioning profile
   - Download and save as: `src-tauri/profile/pH7Console_Mac_App_Store.provisionprofile`

3. Update Entitlements:
   - Edit `src-tauri/Entitlements.plist`
   - Replace `YOUR_TEAM_ID` with your Apple Developer Team ID

4. Download API Key:
   - Create App Store Connect API Key in Users and Access
   - Download `AuthKey_XXXXXXXXXX.p8` file
   - Save to `~/.appstoreconnect/private_keys/` or `~/private_keys/`
Build and Upload:
# Build for App Store
npm run tauri:build:appstore
# Upload to App Store Connect
npm run tauri:upload:appstore
# Or do both in one command
npm run appstore

Manual Build Process:
# 1. Build app bundle
npm run tauri build -- --no-bundle
npm run tauri bundle -- --bundles app --target universal-apple-darwin --config src-tauri/tauri.appstore.conf.json
# 2. Create signed PKG
xcrun productbuild --sign "3rd Party Mac Developer Application" \
--component "target/universal-apple-darwin/release/bundle/macos/pH7Console.app" /Applications \
pH7Console.pkg
# 3. Upload to App Store
xcrun altool --upload-app --type macos --file "pH7Console.pkg" \
--apiKey $APPLE_API_KEY_ID --apiIssuer $APPLE_API_ISSUER

Post-Upload:
- Monitor processing in App Store Connect
- Build appears in TestFlight after processing (10-60 minutes)
- Submit for App Store review when ready
- Track review status and respond to feedback
macOS:
# Universal binary (Intel + Apple Silicon)
npm run tauri build -- --target universal-apple-darwin
# Intel only
npm run tauri build -- --target x86_64-apple-darwin
# Apple Silicon only
npm run tauri build -- --target aarch64-apple-darwin

Windows:
# Install Windows target (from macOS/Linux)
rustup target add x86_64-pc-windows-msvc
# Build for Windows
npm run tauri build -- --target x86_64-pc-windows-msvc

Linux:
# Ubuntu/Debian (.deb)
npm run tauri build -- --target x86_64-unknown-linux-gnu
# AppImage (universal Linux)
npm run tauri build -- --bundles appimage
# RPM (Red Hat/Fedora)
npm run tauri build -- --bundles rpm

Production optimized builds:
# Maximum optimization
TAURI_ENV=production npm run tauri build -- --release
# Size optimization
npm run tauri build -- --release --bundles app,dmg --no-default-features

Debug builds (faster compilation):
npm run tauri build -- --debug

macOS:
- `.dmg` installer in `src-tauri/target/release/bundle/dmg/`
- `.app` bundle in `src-tauri/target/release/bundle/osx/`
- Code signing: Configure in `tauri.conf.json` → `bundle.macOS.signingIdentity`
Windows:
- `.msi` installer in `src-tauri/target/release/bundle/msi/`
- `.exe` portable in `src-tauri/target/release/`
- Code signing: Configure certificate in `tauri.conf.json`
Linux:
- `.deb` package in `src-tauri/target/release/bundle/deb/`
- `.AppImage` in `src-tauri/target/release/bundle/appimage/`
- `.rpm` package in `src-tauri/target/release/bundle/rpm/`
"show me all large files"
→ find . -type f -size +100M -exec ls -lh {} \;
"what's using the most CPU?"
→ top -o cpu
"check git status and stage changes"
→ git status && git add .
- Type `git` and press Tab for context-aware Git commands
- Type `npm` and get project-specific script suggestions
- Type `docker` and get container-aware commands
- Command Explanation: Hover over any command for AI explanation
- Error Recovery: Automatic error analysis with suggested fixes
- Context Awareness: AI understands your project structure and suggests relevant commands
- Workflow Learning: AI learns your patterns and suggests optimized workflows
Create ~/.ph7console/config.json:
{
"ai": {
"primary_model": "phi3-mini",
"fallback_model": "tinyllama",
"temperature": 0.7,
"max_tokens": 512,
"privacy_mode": "local_only",
"learning_enabled": true,
"context_window": 4096
},
"performance": {
"battery_aware": true,
"adaptive_loading": true,
"max_memory_usage": "4GB",
"inference_timeout": 5000
},
"terminal": {
"default_shell": "/bin/zsh",
"history_size": 10000,
"session_persistence": true,
"auto_suggestions": true
},
"ui": {
"theme": "dark",
"font_family": "SF Mono",
"font_size": 14,
"transparency": 0.95
}
}

Download specific models:
# Download Phi-3 Mini (recommended)
ph7console --download-model phi3-mini
# Download multiple models
ph7console --download-model tinyllama,llama32-1b
# List available models
ph7console --list-models
# Check model status
ph7console --model-status

Model switching:
# Switch primary model
ph7console --set-model phi3-mini
# Use specific model for session
ph7console --model tinyllama --session work-session

Learning and Privacy:
{
"learning": {
"pattern_recognition": true,
"workflow_detection": true,
"error_learning": true,
"personalization": true,
"data_retention_days": 90
},
"privacy": {
"telemetry": false,
"local_only": true,
"encrypt_history": true,
"auto_cleanup": true,
"share_anonymous_patterns": false
}
}

Performance Tuning:
{
"inference": {
"cpu_threads": "auto",
"memory_limit": "4GB",
"batch_size": 1,
"quantization": "4bit",
"cache_size": "1GB"
},
"optimization": {
"preload_models": ["tinyllama"],
"lazy_loading": true,
"memory_mapping": true,
"thermal_throttling": true
}
}
{
"ai": {
"model": "phi3-mini",
"temperature": 0.7,
"max_tokens": 512,
"privacy_mode": "local_only"
}
}

{
"performance": {
"battery_aware": true,
"adaptive_loading": true,
"max_memory_usage": "4GB"
}
}

pH7Console/
├── src-tauri/ # Rust backend
│ ├── src/
│ │ ├── ai/ # AI/ML modules
│ │ ├── terminal/ # Terminal emulation
│ │ └── commands.rs # Tauri commands
├── src/ # React frontend
│ ├── components/ # UI components
│ ├── store/ # State management
│ └── types/ # TypeScript types
└── models/ # Local AI models (downloaded)
- Define the capability in Rust:
#[tauri::command]
pub async fn my_ai_feature(input: String) -> Result<AIResponse, String> {
// Implementation
}

- Add to frontend store:
const useAIStore = create((set) => ({
myFeature: async (input: string) => {
return await invoke('my_ai_feature', { input });
}
}));

# Run Rust tests
cd src-tauri && cargo test
# Run frontend tests
npm test
# Integration tests
npm run test:e2e

We welcome contributions! Please see our Contributing Guide for details.
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes
- Add tests: `cargo test` (Rust) and `npm test` (Frontend)
- Commit with conventional commits: `feat: add amazing feature`
- Push and submit a pull request
- Rust: Use `cargo fmt` and `cargo clippy`
- TypeScript: Use Prettier and ESLint
- Commits: Follow Conventional Commits
# Backend tests
cd src-tauri && cargo test
# Frontend tests
npm test
# Integration tests
npm run test:e2e
# AI model tests
cargo test --features ai-tests

- Add model configuration:
// In src-tauri/src/models/local_llm.rs
LocalModelInfo {
name: "Your Model Name".to_string(),
size_mb: 2000,
model_type: ModelType::Custom,
capabilities: vec![Capability::CodeGeneration],
download_url: "huggingface-model-id".to_string(),
performance_tier: PerformanceTier::Fast,
}

- Implement model loading:
// In src-tauri/src/ai/mod.rs
async fn load_custom_model(&mut self) -> Result<(), Error> {
// Implementation for your model
}

- Add tests and documentation
- Privacy First: Never add telemetry or data collection
- Performance: Optimize for low-end hardware (MacBook Air)
- Local Only: All AI processing must happen locally
- Cross-Platform: Ensure compatibility with macOS, Windows, Linux
- Documentation: Update README and add inline docs
- Voice Commands - Local speech recognition
- Plugin System - Extensible AI capabilities
- Team Sharing - Encrypted workflow sharing
- Advanced Visualizations - Command impact visualization
- Multi-Language Support - Support for multiple languages
- Cloud Sync - Optional encrypted cloud synchronization
- Some AI models may require additional setup on certain hardware
- Terminal themes are currently limited (more coming soon)
- Windows: PTY handling has occasional quirks
pH7Console is part of the challenge #Privacy-First-AI-Tools, a collection of innovative AI projects focused on bringing artificial intelligence capabilities to developers while maintaining complete privacy and data sovereignty. Built with a commitment to local processing and zero telemetry. Hope you enjoy 🤗
Feel free to connect, and reach me at my LinkedIn Profile 🚀
pH7Console is generously distributed under MIT license 🎉 Wish you happy, happy productive time! 🤠