```bash
# Use npx instead of a global installation
npx @nicomatt69/nikcli

# Or create an alias
echo 'alias nikcli="npx @nicomatt69/nikcli"' >> ~/.bashrc
source ~/.bashrc
```
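To sanity-check the alias without opening a new terminal, you can define it in the current shell and print its definition back:

```shell
# Minimal sketch: define the alias in the current shell,
# then print its definition to confirm it was registered
alias nikcli='npx @nicomatt69/nikcli'
alias nikcli
```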
Solution 2: Fix npm Permissions
```bash
# Create a directory for global packages
mkdir ~/.npm-global
npm config set prefix '~/.npm-global'

# Add to PATH
echo 'export PATH=~/.npm-global/bin:$PATH' >> ~/.bashrc
source ~/.bashrc

# Install NikCLI
npm install -g @nicomatt69/nikcli
```
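The fix works because the shell searches `PATH` entries left to right, so prepending the new prefix's `bin` directory makes binaries installed there win over system locations. A minimal sketch of that effect:

```shell
# Prepend the npm prefix's bin directory, as the ~/.bashrc line above does
NPM_GLOBAL="$HOME/.npm-global"
export PATH="$NPM_GLOBAL/bin:$PATH"

# The first PATH entry is now the user-writable bin directory,
# so no sudo is needed for global installs
first_entry=$(echo "$PATH" | cut -d: -f1)
echo "$first_entry"
```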
Solution 3: Use Local Installation
```bash
# Install locally in your project
npm install @nicomatt69/nikcli

# Run with npx
npx nikcli

# Or add an npm script (note: appending JSON with echo would corrupt package.json)
npm pkg set scripts.nikcli="nikcli"
npm run nikcli
```
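Whichever way you add the script, the `scripts` section of `package.json` should end up looking like this:

```json
{
  "scripts": {
    "nikcli": "nikcli"
  }
}
```

`npm run nikcli` then resolves the `nikcli` binary from `node_modules/.bin` automatically.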
Installation fails with different package managers
Symptom: installation behaves inconsistently across npm, yarn, and pnpm.
Solutions:
```bash
# Clear all caches
npm cache clean --force
yarn cache clean
pnpm store prune

# Use a specific package manager
curl -fsSL https://raw.githubusercontent.com/nikomatt69/nikcli-main/main/installer/install.sh | bash -s npm

# Or force a specific manager
npm install -g @nicomatt69/nikcli --force
```
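Passing `npm` to the install script pins the manager explicitly. The selection logic such a script typically uses can be sketched as below (a hypothetical helper, not the actual `install.sh`): take the first requested command that exists on `PATH`.

```shell
# Hypothetical sketch of package-manager selection:
# echo the first command from the argument list that is installed
pick_pm() {
  for pm in "$@"; do
    if command -v "$pm" >/dev/null 2>&1; then
      echo "$pm"
      return 0
    fi
  done
  echo "none"
  return 1
}

# Prefer npm, fall back to yarn, then pnpm
pick_pm npm yarn pnpm || true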
```bash
# Check current usage/tokens
# Switch to a different model
/model claude-3-5-sonnet
/model gpt-4o-mini

# Reduce concurrency
/parallel-config --max-concurrent 2

# Use local models
/set-key ollama http://localhost:11434
/model ollama:llama3.1:8b
```
```bash
# Test network connectivity
/run ping google.com

# Check DNS resolution
/run nslookup api.anthropic.com

# Test API endpoints
/debug --network
```
Configure Proxy
```bash
# Set proxy settings
/config --proxy http://proxy.company.com:8080

# Or use environment variables
export HTTP_PROXY=http://proxy.company.com:8080
export HTTPS_PROXY=http://proxy.company.com:8080
```
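The environment-variable route works because exported variables are inherited by child processes, which is how curl, npm, and similar tools discover the proxy. A quick check:

```shell
# Export the proxy variables (example proxy host from above)
export HTTP_PROXY=http://proxy.company.com:8080
export HTTPS_PROXY=http://proxy.company.com:8080

# A child process should see the same value
sh -c 'echo "child sees: $HTTPS_PROXY"'
```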
Use Local Models
```bash
# Switch to local models
/set-key ollama http://localhost:11434
/model ollama:llama3.1:8b

# Or use offline mode
/config --offline-mode
```
```bash
# Run full diagnostics
/diagnostic

# Check system status
/monitor

# View detailed logs
/logs --level debug

# Check configuration
/config --validate

# Test all components
/debug --full
```