Malformed IP error on duplicate IP
v0.3.4
This tool throws a malformed IP error on properly formatted IPs. After some troubleshooting I determined that the tool is actually erroring because the input file contained the same IP twice. This error needs to be clearer, and/or the tool should simply skip duplicate IPs.
One example of how to reproduce:

`test.txt`:
```
10.0.0.0/16
10.0.0.96/27
```
Running onesixtyone with the above as an input file yields the following:
```
$ onesixtyone -i ./test.txt -p161
Malformed IP address: 10.0.0.96/27
```
Expanding the IP ranges in test.txt from CIDR notation to a flat list, then grepping for the IP in the error message, we can see the tool errors on the first duplicate entry:
```
$ cat test.txt.list | grep 10.0.0.96 -A2 -B2
10.0.0.94
10.0.0.95
10.0.0.96
10.0.0.97
10.0.0.98
--
10.0.255.254
10.0.255.255
10.0.0.96
10.0.0.97
10.0.0.98
```
Actually, this error might not only occur on duplicates...

`./tmp` file:
```
10.210.0.0/16
10.211.0.0/16
```
Error:
```
$ onesixtyone -p161 -i ./tmp
Malformed IP address: 10.211.0.0/16
```
When the CIDR ranges are expanded into a list, the same thing happens:
```
$ onesixtyone -p161 -i ./tmp.list
Malformed IP address: 10.211.0.0
```
Grepping for the IP:
```
$ cat tmp.list | grep 10.211.0.0 -A2 -B2
10.210.99.98
10.210.99.99
10.211.0.0
10.211.0.1
10.211.0.10
```
And yet, if I take this sole IP out and put it into a `./test` file:
```
10.211.0.0
```
I get no errors:
```
$ onesixtyone -p161 -i ./test
Scanning 1 hosts, 2 communities
```
IP validation seems a little broken? I know 10.211.0.0 isn't a typical IP you'd see, but when I give the tool two /16 ranges I expect it to scan them. For some reason, the tool accepts the first .0 address and then errors on the second...
```
$ head -n 3 tmp.list
10.210.0.0
10.210.0.1
10.210.0.10
```
Analysis of Issue #35: Misleading "Malformed IP" Error
I've thoroughly investigated this issue and confirmed it's a bug in onesixtyone's error reporting. Here's what's happening:
Root Cause
The "Malformed IP address" error is misleading. The IP range 10.0.0.96/27 is perfectly valid. The actual problem is that onesixtyone has hit its hardcoded limit of 65,536 hosts (MAX_HOSTS).
What's Really Happening
- First range `10.0.0.0/16`: expands to exactly 65,536 IP addresses (10.0.0.0 to 10.0.255.255), completely filling the `host` array
- Second range `10.0.0.96/27`: attempts to add 32 more IPs (10.0.0.96 to 10.0.0.127)
- The `add_host()` function hits the array limit check at line 199: `if (host_count >= MAX_HOSTS)`
- It returns `-1` (the same error code used for invalid IPs)
- `read_hosts()` at line 239 interprets ANY `-1` return as "Malformed IP address"
Proof
```
# This fails with "Malformed IP address: 10.0.0.96/27"
$ echo -e "10.0.0.0/16\n10.0.0.96/27" | onesixtyone -i - -p161

# But the same IP range works fine on its own!
$ echo "10.0.0.96/27" | onesixtyone -i - -p161
Scanning 32 hosts, 2 communities

# ANY IP after a /16 triggers the same misleading error
$ echo -e "10.0.0.0/16\n192.168.1.1" | onesixtyone -i - -p161
Malformed IP address: 192.168.1.1
```
The Real Issues
- No duplicate detection: the 32 IPs from `10.0.0.96/27` are already included in `10.0.0.0/16`, but the code doesn't check
- Poor error handling: the `add_host()` function uses the same `-1` return code for multiple failure conditions
- Misleading error message: reports "Malformed IP" when the actual issue is hitting the `MAX_HOSTS` limit
Proposed Fixes
Quick Fix (Minimal Changes)
```c
// In read_hosts() at line 239, distinguish between error types:
if (add_host((const char*)&buf) == -1) {
    if (host_count >= MAX_HOSTS) {
        printf("Maximum host limit (%d) reached, skipping: %s\n", MAX_HOSTS, buf);
        // Could either exit or continue processing
    } else {
        printf("Malformed IP address: %s\n", buf);
        exit(1);
    }
}
```
Better Fix (More Robust)
- Change `add_host()` to return different error codes:
  - `0`: Success
  - `-1`: Malformed IP
  - `-2`: Host limit reached
  - `-3`: Duplicate/overlapping range
- Add duplicate detection to skip IPs already in the array
- Consider making the host array dynamically allocated rather than fixed at 65,536
Workaround for Users
Until this is fixed, users encountering this error should:
- Check if their IP ranges overlap (the second range might be redundant)
- Process large ranges separately if they exceed 65,536 total IPs
- Use more specific ranges to avoid hitting the limit
Additional Notes
- The error occurs regardless of whether IPs overlap - it's purely about hitting the 65,536 host limit
- The issue affects any input that would result in more than `MAX_HOSTS` (65,536) total IPs
- This limitation should at minimum be documented in the README and man page