Generated: 2026-04-27
Executive summary
The module documents describe a broad malware-support ecosystem composed of loaders, lightweight bots, backdoors, backconnect and VPN components, scanner modules, credential and data collection modules, file-search and exfiltration modules, spam tooling, crypting/packing workflows, ransomware-style lockers, operator panels, testing procedures, and build/cleaning automation.
The overall design is modular. A small core bot or loader provides identity, communication, update, payload launch, and module-control capabilities. Specialized modules then implement individual functions: remote command execution, backconnect access, VPN bridging, internal reconnaissance, account dumping, file discovery, browser cookie collection, web vulnerability scanning, RDP/OWA probing, spam dispatch, and ransomware-like file locking.
The documents show a consistent engineering style:
- Use small, statically linked Windows binaries or DLL modules.
- Build x86 and x64 variants.
- Avoid unnecessary runtime dependencies.
- Avoid saving state or payload bodies to disk unless explicitly required.
- Prefer in-memory execution and minimal persistent artifacts.
- Use obfuscation and anti-analysis hygiene in release builds.
- Separate debug/logged builds from production/no-log builds.
- Integrate modules with a parent bot through a defined Start/Control/Release-style interface.
- Use panels and backend APIs for tasking, reporting, sample testing, scanner management, and build validation.
The set is not a single product specification. It is a portfolio of module requirements, operator manuals, and testing procedures that together describe how an operator-facing malware platform is built, packaged, tested, and used.
Core module development model
The module development guide defines modules as DLLs that run inside arbitrary host processes. Modules are expected to operate in unusual contexts: interactive users, SYSTEM, non-interactive service contexts, and memory-only loading where the module cannot rely on its own filesystem path.
Core development constraints include:
- Minimize Visual C++ Runtime usage.
- Avoid STL where possible.
- Prefer WinAPI memory/string primitives over CRT functions.
- Statically link runtime and third-party libraries.
- Avoid temporary files by default.
- Avoid registry and filesystem state unless explicitly approved.
- Support x86 and x64 builds.
- Provide logged and non-logged release variants.
- Obfuscate strings and system API resolution in production builds.
- Remove debug output and debug metadata from release artifacts.
- Follow a fixed module API for parent-bot integration.
The module API follows the familiar pattern used in earlier reports:
- Start initializes the module and receives parent identity/context.
- Control delivers commands or configurations.
- Release shuts down the module and frees resources.
- FreeBuffer releases module-allocated output buffers.
The parent context provides fields such as client ID, group ID, and external IP address. This lets modules produce reports and communicate with backend infrastructure without implementing their own identity system.
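The lifecycle above is a conventional plugin-interface pattern. A minimal C sketch follows, with hypothetical names, signatures, and field layout (the documents describe the pattern, not exact prototypes); it shows only the identity hand-off and buffer-ownership convention, not any module functionality:

```c
/* Hypothetical sketch of a Start/Control/Release/FreeBuffer module
 * interface. All names and the context layout are illustrative. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Parent-supplied context: identity fields the parent bot passes in,
 * so the module needs no identity system of its own. */
typedef struct {
    const char *client_id;   /* per-install identifier */
    const char *group_id;    /* campaign/group identifier */
    const char *external_ip; /* externally visible address */
} ParentContext;

static ParentContext g_ctx;  /* module-held copy of parent identity */

/* Start: receive parent identity and initialize module state. */
int Start(const ParentContext *ctx) {
    if (ctx == NULL) return -1;
    g_ctx = *ctx;
    return 0;
}

/* Control: accept a command and return an output buffer that the
 * parent must later hand back to FreeBuffer (the ownership rule the
 * FreeBuffer entry point exists to enforce). */
int Control(const char *command, char **out, size_t *out_len) {
    /* Illustrative behavior: tag the command with the client ID. */
    size_t n = strlen(g_ctx.client_id) + strlen(command) + 2;
    char *buf = malloc(n + 1);
    if (buf == NULL) return -1;
    snprintf(buf, n + 1, "%s: %s", g_ctx.client_id, command);
    *out = buf;
    *out_len = n;
    return 0;
}

/* FreeBuffer: release a buffer allocated by Control. */
void FreeBuffer(char *buf) { free(buf); }

/* Release: shut down the module and clear its state. */
void Release(void) { memset(&g_ctx, 0, sizeof(g_ctx)); }
```

The key design point is that all allocation crosses the module boundary in one direction only: the module allocates, the parent returns buffers via FreeBuffer, so parent and module never need to share a heap implementation.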
Loader and bot requirements
The loader and bot documents distinguish several roles:
- Non-resident loader.
- Resident loader.
- Lightweight modular bot.
- Minimalist backdoor bot.
The non-resident loader is designed for delivery and launch rather than long-term presence. It avoids persistence, avoids local state, and can optionally generate a repeatable hardware-derived identifier. It collects basic system information, retrieves a payload, handles x86/x64 architecture differences, and can launch executable or DLL payloads.
The resident loader adds persistence and update behavior. It is expected to work on older Windows versions, retrieve architecture-appropriate payloads, validate servers, support client validation, perform update checks, and avoid writing downloaded network payloads to disk.
The lightweight modular bot separates a small core from implant modules. The core manages communication, command receipt, result submission, telemetry, and module/implant invocation. Commands such as executable/script launch are expected to move into implant-like components rather than remain in an oversized core.
The general bot requirements emphasize:
- Communication over protocols commonly allowed in corporate networks.
- Channel redundancy.
- Server verification to prevent takeover.
- Additional traffic encryption or obfuscation.
- Simple, extensible protocols.
- Stateless or re-entrant operation where possible.
- Redundant persistence mechanisms.
- Separation of loader and payload.
- Automated updates.
Backdoor family
The backdoor documentation describes a deliberately lightweight, stealth-oriented bot with a two-stage model:
- Stage one: loader.
- Stage two: main bot functionality.
The backdoor is intended to preserve only the loader on disk while keeping the functional body more transient. It supports server redundancy, backup domain mechanisms, server validation by cryptographic key, selective per-bot updates, optional persistence, and regular cleaning.
The backdoor's functional scope is intentionally narrow. It provides enough capability to act as a remote command shell and launcher for external tooling:
- System information collection.
- Launch of executables, DLLs, batch files, PowerShell scripts, shellcode, and command-line commands.
- Download of bounded-size files.
- Process termination.
- Self-removal.
The operator guidance frames the backdoor as an alternative to full post-exploitation frameworks when those are unavailable or unsuitable. It relies on operator skill and external tools rather than embedding every stealing, scanning, or lateral-movement function.
Manual payload activation
One document specifies a manual payload-confirmation feature. The purpose is to prevent automated delivery of payloads to suspected researchers, antivirus systems, honeypots, or undesirable environments.
The concept adds a group-level "Manual payload confirmation" switch. When enabled, a loader does not receive a payload until an operator explicitly approves it. The administrative interface should show loaders awaiting payload delivery and provide controls to approve or deny payload release. Denial places the source into a honeypot-like block list. The loader can request or pass a short serial/key value, which becomes part of the operator's decision workflow.
This design introduces human-in-the-loop gating between initial loader contact and payload delivery.
Backconnect and VPN components
The backconnect server documents define infrastructure for relaying operator traffic through client-side connections. The server accepts client connections, identifies backconnect clients, handles SOCKS-style interaction from operators, and tunnels traffic between operator-side sessions and client-side data connections.
The VPN module and bridge documents describe a paired-client/bridge model using OpenVPN-like configuration exchange. A module on one side and a paired client connect through a bridge service. The bridge waits until both sides are present, generates configuration material, and returns readiness status. The operator workflow integrates this module through the parent bot's module-control mechanism.
The VPN client/admin document describes a more product-like VPN service with:
- Client installer.
- Client settings.
- HTTP API for authentication and server selection.
- Country, latency, and bandwidth metadata.
- Subscription and tariff management.
- Administrative panel functions.
Taken together, the backconnect and VPN documents describe multiple ways to turn compromised or controlled hosts into network access points.
Scanner and vulnerability-discovery modules
The documents include multiple scanner specifications. These modules share a common pattern: receive work from a backend, scan assigned targets, return mined results, and expose status/events to a scanner panel.
Scanner administration backend
The scanner admin/backend document defines a central management system for password and vulnerability scanners. It uses bot group and client ID as core routing identity. The backend distributes target ranges, domains, configurations, and scan settings. It receives mined data such as hosts, services, credentials, or vulnerability findings. The panel tracks scanner capacity, assigned scan ranges, result volume, and operational state.
masscan adaptation
The masscan adaptation document proposes integrating a high-speed port scanner into the scanner administration framework. The backend provides address and port ranges. The scanner reports results through a DPOST-like channel. Administrative settings expose scan parameters from the original tool.
RDP scanner
The RDP scanner document describes modes for idle operation, target retrieval, checking, and result submission. It includes host information collection, username discovery concepts, optional credential-checking behavior, event reporting, configuration retrieval, and backend interaction.
OWA brute/checking module
The OWA document describes distributed checking against Outlook Web Access-like services. It includes crawler, scanner, server-side mode/config retrieval, distributed target assignment, result reporting, and DPOST-style submission of successful data. The module is designed to be controlled by a backend and to operate with configurable threads and operating modes.
SQL-injection scanner
The web vulnerability scanner document describes crawling web targets, finding forms and parameters, applying fuzzing/check rules, and reporting identified vulnerable parameters. It is designed around configurable rules, crawler/detector separation, backend configuration, and result reporting.
Apache Tomcat scanner
The Tomcat scanner document focuses on finding vulnerable Tomcat management-like endpoints and validating exploitation potential through response patterns. It records HTTP request/response evidence, domain, software version, and scan results.
Printer scanner
The printer scanner document proposes combining external search data with a printer exploitation toolkit. It can be run as a standalone scanner or redesigned as a bot module. The output is a list of internet-exposed printers with validated behavior and command/result logs.
Local network scanner
The local network scanner collects system information from Windows hosts on a local network using legitimate Windows services rather than deploying agents. It takes domain-administrator credentials as input, minimizes requests, supports multithreading, and is packaged as a DLL compatible with the broader module build standards.
Scanner design pattern
Across scanner documents, recurring design elements include:
- Backend-controlled work assignment.
- Client group and client ID in requests.
- Multiple operating modes, including idle/checking/scanning.
- Configurable scan ranges and target lists.
- Periodic result submission.
- Event reporting.
- DPOST-like transport for mined results.
- Integration into scanner panels for monitoring and task management.
Credential, cookie, and account-data collection
Several modules focus on collecting credentials, cookies, account information, or account inventories.
Windows and Active Directory user dump
This module is intended to collect local Windows account information and Active Directory account data. It emphasizes deterministic hashing of sensitive values, configuration-driven behavior, and a file-transfer protocol that avoids peak bandwidth load by chunking or scheduling transfers.
Cookie grabber
The browser-cookie module collects browser cookie records and submits them using DPOST-style handlers. Records include user, domain, cookie name, cookie value, and path. The module receives a DPOST proxy/handler list as its main configuration.
Password and file-search collectors
The file-search modules search local or server-accessible resources for files, folders, or keywords matching configured patterns. Some are aimed at tax/accounting software directories and business-data discovery. Others search known server resources using supplied credentials, regular expressions, starting paths, and recursion-depth limits.
These modules generally report:
- Client ID.
- Group ID.
- IP addresses.
- Host/user/system metadata.
- File match metadata.
- Transferred file data or result records.
Spam module
The spam-bot specification describes two primary modes:
- Contact collection only.
- Contact collection plus message sending.
Inputs include message body, attachments, recipient list, and sending vectors. Recipient lists and SMTP vectors may be local, collected from host resources, or supplied by the server. The bot supports message and attachment randomization through macros for dictionaries, random numbers, and random strings.
Implementation requirements mirror other modules:
- Static executable packaging.
- No reliance on its own path.
- No self-restart.
- Operate as ordinary user or SYSTEM.
- Work in interactive and non-interactive sessions.
- Follow the broader build and anti-analysis standards.
Backdoor, Cobalt Strike, and operator tooling
Several documents reference compatibility with Cobalt Strike-style loading and post-exploitation workflows. The backdoor is positioned as a command-shell-like fallback or complement to larger frameworks. Module packaging standards include entry points compatible with external loaders. The operator manual and testing plan focus on command execution, output capture, file download, process termination, and self-removal.
This indicates that the platform was designed to interoperate with existing operator tooling rather than replace every function with custom code.
Crypto, packing, and AV-cleaning workflow
The cryptor and daily-crypt documents describe repeated production of modified binary builds with different hashes and detection profiles. The process includes automated build generation, staging, testing, browser-based download checks, antivirus state documentation, and screenshots/logging of antivirus behavior before and after execution.
The automated cleaning document proposes methods for identifying source-code or binary regions responsible for antivirus detections. It describes iterative reduction, build-map usage, binary-level function neutralization, and repeated static checking. This fits the management documents' emphasis on tracking antivirus detections and recording cleanup methods.
The polymorphic assembly preprocessor document describes a code-transformation utility intended to alter assembly layout while preserving behavior. The examples include changing data layout, variable allocation order, and similar transformations that alter resulting binary structure.
The cryptor-related theme is high-volume generation, testing, and cleaning of binaries to manage detection by antivirus products.
Ransomware and locker documents
The locker specifications describe two levels:
- A full cryptolocker with administrative panel, roles, victim-facing functions, chats, payment/business workflow, and bot-side locker behavior.
- A simpler cryptolocker focused on compact binary size and file coverage.
The full cryptolocker document includes victim/operator roles, administrative functions, victim identification, machine/user/OS/network metadata, and recovery/payment workflow. The bot-locker records system information and uses a client identity scheme similar to the broader platform.
The simple cryptolocker focuses on:
- Minimal binary size.
- Processing local files and accessible network shares.
- Maximizing disk and network-share processing throughput.
- Obfuscated strings and API calls.
- Startup anti-hooking and anti-injection measures.
- Repeatable device identity that is generated each run, stable across users, unique, and not stored on disk.
These documents describe destructive file-encryption tooling and related management infrastructure at a specification level.
File discovery and exfiltration modules
Multiple documents specify modules for locating and transferring files:
- Local keyword-based file search.
- Path/folder match search.
- Server resource search using known credentials.
- Upload/transfer protocols for results.
The file-search modules are designed to operate as standard modules with parent-provided identity. They search configured paths, match names or content using patterns or regular expressions, and submit metadata and/or file contents to backend infrastructure. The target examples include business, tax, and accounting software data folders.
The server-file scanner accepts credential/resource lists and receives configuration from a command server: access list, search regexes, starting directory, depth, and reporting interval. This turns known credentials into systematic data discovery across reachable servers.
Crypto panel and operator panel documents
The crypto-panel operator guide describes a panel used to store, prioritize, test, download, and delete sample files. It tracks reports from autotests, including modular bot and backdoor tests. It exposes file metadata such as name, size, upload time, priority, reports, and actions.
The panel also supports operational troubleshooting around slow VMs and autotest script failures. This connects the modules ecosystem to the prior testing-automation report: samples are staged, tested, reported, and promoted through panel workflows.
Superbrowser guide
The Superbrowser guide appears to describe an operator browser environment requiring Tor Browser and accounts for bot and storage administrative panels. It likely centralizes access to operator web interfaces through a controlled browser setup.
Testing and release validation
The backdoor testing plan defines acceptance checks before delivery. It validates core backdoor commands, output capture, file transfer, process termination, self-removal, and behavior across expected conditions.
The daily-crypt process and crypto-panel guide extend this into repeated operational validation:
- Generate or receive daily builds.
- Mark files for staging.
- Run autotests.
- Download through a browser.
- Observe antivirus behavior.
- Capture screenshots/logs.
- Record antivirus version, cloud-protection state, and update state.
- Maintain separate x86/x64 and EXE/DLL variants.
This reflects a pipeline where build production, AV testing, operator review, and panel staging are linked.
Inter-module communication and reporting
Most modules rely on the same conceptual reporting model:
- Parent bot supplies identity.
- Module receives configuration via Control or HTTP backend.
- Module performs assigned work.
- Module submits records through HTTP-like endpoints or DPOST-style handlers.
- Backend replies with a simple success marker.
- Panels aggregate status, results, and logs.
This model reduces the amount of custom communication logic each module needs and lets the backend treat many module families similarly.
System characterization
The documents describe a modular offensive platform with several layers:
1. Delivery and persistence layer
Loaders, resident loaders, and lightweight bots retrieve and launch payloads, maintain communication, and provide execution context for modules.
2. Control and access layer
Backdoors, backconnect servers, VPN bridges, and operator panels provide command execution and network access into target environments.
3. Discovery layer
Scanners collect hosts, services, vulnerabilities, usernames, credentials, directories, files, and network topology.
4. Collection layer
Cookie grabbers, account dumpers, file search modules, and DPOST collectors extract and submit data of interest.
5. Impact layer
Lockers and cryptolocker admin systems provide file-encryption/extortion capability.
6. Support layer
Cryptors, polymorphic preprocessing, AV-cleaning automation, daily crypt workflows, crypto-panel testing, and release validation support ongoing build production and detectability management.
7. Operator layer
Operator manuals, panels, Superbrowser, and testing plans provide human workflows around tasking, staging, quality control, and access management.
Overall assessment
The module documents show a mature, segmented toolchain. The platform is not monolithic; it is designed as a set of small, specialized components that share build standards, parent-bot integration, backend tasking, and reporting conventions.
The most important architectural themes are:
- Minimal loaders and bots that delegate specialized work to modules.
- Strong attention to x86/x64 packaging and Windows compatibility.
- Memory-only or low-artifact execution assumptions.
- Centralized panels for operators and testers.
- Backend-controlled scan and collection work queues.
- DPOST-like result submission across multiple module families.
- Repeatable build and AV-cleaning workflows.
- Clear separation between logged/debug and production/no-log builds.
- Emphasis on avoiding accidental artifacts such as local paths, debug symbols, unhidden strings, and unnecessary disk writes.
The documents also show a division of labor: developers build and package modules, testers validate behavior and detection status, operators use panels to task bots and review results, and backend systems coordinate queues, reports, and scanner output. This division is consistent with a larger organized software operation rather than ad hoc tooling.
Appendix: module family index
Core and build framework:
- module_HOWTO.txt
- требования к боту.txt (bot requirements)
- требования к лоадеру.txt (loader requirements)
- легковесный модульный бот.txt (lightweight modular bot)
Loaders and backdoors:
- backdoor руководство оператора.txt (backdoor operator manual)
- ТЗ бэкдор.txt (backdoor specification)
- ТЗ резидентный загрузчик.txt (resident loader specification)
- бк активация.txt (backdoor activation)
- план тестирования бк.txt (backdoor testing plan)
Backconnect and VPN:
- тз backconnect-сервер.txt (backconnect server specification)
- ТЗ VPN-клиент и админка.txt (VPN client and admin panel specification)
- ТЗ модуль и мост VPN.txt (VPN module and bridge specification)
- руководство к Superbrowser.txt (Superbrowser guide)
Scanning and vulnerability discovery:
- ТЗ админка сканеров.txt (scanner admin panel specification)
- ТЗ портирование masscan.txt (masscan porting specification)
- ТЗ сканер rdp.txt (RDP scanner specification)
- ТЗ брут OWA.txt (OWA brute-force specification)
- ТЗ сканер sql инъекций2.txt (SQL injection scanner specification)
- сканер apache tomcat.txt (Apache Tomcat scanner)
- ТЗ сканер локальной сети.txt (local network scanner specification)
- ТЗ сканер принтеров.txt (printer scanner specification)
Credential and data collection:
- ТЗ дамп пользователей Windows и AD.txt (Windows and AD user dump specification)
- ТЗ модуль граб cookies.txt (cookie-grabbing module specification)
- ТЗ модуль поиска файлов и папок по совпадению.txt (file/folder match-search module specification)
- ТЗ поиск файлов на серверах.txt (server file search specification)
- ТЗ поиск файлов по ключевым словам.txt (keyword file search specification)
Impact and spam modules:
- ТЗ криптолокер.txt (cryptolocker specification)
- ТЗ простой криптолокер.txt (simple cryptolocker specification)
- ТЗ спамбот.txt (spambot specification)
Packing, polymorphism, and AV-cleaning:
- ТЗ криптер.txt (cryptor specification)
- ежедневные крипты.txt (daily crypts)
- ТЗ автоматизация чистки.txt (cleaning automation specification)
- ТЗ полиморфный процессор Asm.txt (polymorphic Asm preprocessor specification)
Operator panels:
- руководство оператора криптопанели.txt (crypto-panel operator manual)
