Hello, tech enthusiasts!
The inspiration for this task was to automate the annoying weekly network scan my company runs every Sunday against all public-facing IPs (which change every week), so that we already know about known vulnerabilities and can mitigate them right after the weekly prod deployment.
So, let's get started!
TL;DR
Nessus Expert does have a scan-scheduling feature; however, you cannot schedule scans while changing the target IPs automatically, so I utilized the official Nessus API.
But the catch: due to some issue in Nessus's API, the scan-launch endpoint doesn't work with the API keys Nessus itself provides; it needs some UI-specific cookies, which is strange, since the API keys alone should be enough!
That's what I worked around here. I hope you find this helpful, as I'll be showcasing some reverse code analysis and Network-tab analysis to build this automation, which might help you when creating your own automations in the future!
P.S. I will be using code snippets for reference; the whole code will not be shared here.
OPENING NOTE:
Now, your first question will be that -
You: If you're doing a weekly network scan, you must have a Nessus license?
Me: Yep, I do. Nessus Expert :)
You: Then it already has a feature to schedule the scans, so why in the world are you making this useless automation?
Me: Cuz the IPs change every week -_- and Nessus has no target-refresh feature.
Me: So, the good news is Nessus Expert has official API documentation :O
You: Lol it would be so easy, me don't need this blog :|
Me: So did I think in the beginning... However...
Me: They do have an API, and it's the foundation of this automation, but for some reason the API keys don't work on the endpoint that launches the scan :|
IDK why... Maybe they forgot to maintain the API XD
Anyways, let's actually get started!
I used Python to build this tool, cuz why not!
Now, you'll need the following two API keys to interact with the Nessus API: an Access Key and a Secret Key.
You can find them here: My Account > API Keys.
Generate your keys there, and save 'em to your config.yaml file.
Then test it using the code below (a basic API sanity check):
# Function to test a simple API call to Nessus
def test_nessus_connection():
    try:
        response = requests.get(f"{NESSUS_URL}/scans", headers=headers, verify=False, timeout=60)
        log_response(response)
    except Exception as e:
        print(f"Error testing Nessus connection: {e}")
Now, you need to pass the API keys in the request headers.
Look at the code below to understand the basic structure of the Nessus API; it lists all the scans you've run in the past along with the different scan templates available.
import requests
import yaml

# Load configuration from YAML file
with open("config.yaml", "r") as file:
    config = yaml.safe_load(file)

# Nessus API setup
NESSUS_URL = config["nessus"]["url"]
ACCESS_KEY = config["nessus"]["access_key"]
SECRET_KEY = config["nessus"]["secret_key"]

headers = {
    "X-ApiKeys": f"accessKey={ACCESS_KEY}; secretKey={SECRET_KEY}",
    "Content-Type": "application/json",
}

# Function to list all scans
def list_scans():
    response = requests.get(f"{NESSUS_URL}/scans", headers=headers, verify=False)
    response.raise_for_status()
    scans = response.json()["scans"]
    return scans

# Function to list all scan templates
def list_scan_templates():
    response = requests.get(f"{NESSUS_URL}/editor/scan/templates", headers=headers, verify=False)
    response.raise_for_status()
    templates = response.json()["templates"]
    return templates

# Main
if __name__ == "__main__":
    scans = list_scans()
    for scan in scans:
        print(f"ID: {scan['id']}, Name: {scan['name']}, Status: {scan['status']}")

    templates = list_scan_templates()
    for template in templates:
        print(f"Name: {template['name']}, UUID: {template['uuid']}")
Scan API request template
Below was my first noob attempt to launch the scan using the API.
NOTE: To automate this scan, you need to run a scan once manually and grab the SCAN_ID from Network-tab analysis in DevTools (you'll then add it to the config.yaml file).
# Function to launch a scan with updated targets from ips.json
def launch_scan():
    with open("ips.json", "r") as file:
        targets = json.load(file)
    scan_data = targets
    try:
        response = requests.post(
            f"{NESSUS_URL}/scans/{SCAN_ID}/launch",
            headers=headers,
            json=scan_data,
            verify=False,  # Disable SSL verification
            timeout=120,   # Increase timeout to 120 seconds
        )
        log_response(response)
    except requests.exceptions.RequestException as e:
        print(f"Error launching scan: {e}")
    except Exception as e:
        print(f"Unexpected error launching scan: {e}")

# Main execution
if __name__ == "__main__":
    updateIPList()
    launch_scan()
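Two things are referenced here but not shared: ips.json and updateIPList(). Per the Nessus scans API docs, the launch endpoint accepts an alt_targets list to override the scan's configured targets, so a hypothetical minimal version of updateIPList() that builds ips.json in that shape could look like this (new_ips.txt is just an assumed source for the week's IPs; yours will differ):

import json

# Hypothetical sketch of updateIPList(): refresh ips.json with this week's targets.
# The real source of the public-facing IPs (cloud API, CMDB, a plain file...) is
# environment-specific; here they are read from a newline-separated text file.
def updateIPList(source_file="new_ips.txt"):
    with open(source_file, "r") as f:
        ips = [line.strip() for line in f if line.strip()]
    # "alt_targets" lets /scans/{id}/launch override the scan's saved targets.
    with open("ips.json", "w") as f:
        json.dump({"alt_targets": ips}, f, indent=2)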
But sadly, I got the following error every time I ran the above code:
Error launching scan: ("Connection broken: ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None)", ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None))
Bonus: in such cases you can add a proxy and inspect the request/response in BurpSuite or any other network proxy tool.
Now, I had to check and debug the issue, so I added a proxy to my code to verify whether the request was even correct!
def launch_scan():
    with open("ips.json", "r") as file:
        targets = json.load(file)
    scan_data = targets
    print(scan_data)
    try:
        # Added proxy URLs (route the request through Burp)
        proxy = {"http": "http://127.0.0.1:8080", "https": "http://127.0.0.1:8080"}
        response = requests.post(
            f"{NESSUS_URL}/scans/{SCAN_ID}/launch",
            headers=headers,
            json=scan_data,
            verify=False,   # Disable SSL verification
            proxies=proxy,  # Traffic proxy enabled
            timeout=120,    # Increase timeout to 120 seconds
        )
        log_response(response)
    except requests.exceptions.RequestException as e:
        print(f"Error launching scan: {e}")
    except Exception as e:
        print(f"Unexpected error launching scan: {e}")
I got the below result -
As you can see, I am using the access key and secret key on the API endpoint officially suggested by the Nessus documentation (seen below), but it still returns an "API is not available" error.
Now, from here onwards it got really annoying!
I wasted ample time retrying and debugging the same endpoint, but no luck.
At this point most of you will be thinking: why not just use Selenium...
Well, for starters, I didn't want to, cuz it's not efficient, it's prone to failures, and it might break on successive app updates. It would also need periodic maintenance.
Now, we'll see how I solved this problem without using Selenium while still being able to run and schedule the scan!
Problem and Solution
Identifying the problems
I began by exploring the frontend UI using the Network tab in DevTools and then switched to BurpSuite. I identified two headers for authentication: X-Cookie and X-Api-Token.
Initially, I copied these values from the UI to BurpSuite, and both headers needed to be used together for the request to work.
Problem 1: Obtain X-Cookie and X-Api-Token after login.
X-Cookie was straightforward, as it appeared in the login response. However, X-Api-Token was trickier, appearing only after 2-3 requests post-login, suggesting it's set client-side.
I navigated to the Sources tab in DevTools, searched for X-Api-Token in the JavaScript source code (the nessus6.js file), and found it being set dynamically.
The second occurrence shows it setting the X-Api-Token on the fly, and we can see the token value being returned just above that function.
Now, Problem 2: Identify or generate the client-side X-Api-Token.
Problem 3: The ultimate goal of the automation, actually scheduling the script, which we'll tackle once the script itself is complete.
Creating a solution
For Problem 1, I logged in using credentials, replicating the login request captured in the Network tab, to get the X-Cookie.
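Here's a minimal sketch of that login step, assuming the standard Nessus /session endpoint (it returns a session token, which is sent back as the X-Cookie header in the form token=<value>); persisting it to config.yaml is left out:

# Log in with username/password to obtain the session token used as X-Cookie.
def fetch_x_cookie():
    creds = {
        "username": config["nessus"]["username"],
        "password": config["nessus"]["password"],
    }
    response = requests.post(f"{NESSUS_URL}/session", json=creds, verify=False, timeout=60)
    response.raise_for_status()
    token = response.json()["token"]
    return f"token={token}"  # value to put in the X-Cookie header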
For Problem 2, I couldn't find a direct source to generate the X-Api-Token. Instead, I speculated that the TOKEN returned in the nessus6.js file (which we saw earlier) must come from that dynamic JS source. So I used a regex to fetch the X-Api-Token from this file after login and saved it in config.yaml along with the X-Cookie.
This allowed the request to launch the scan with the necessary headers. Here's the working solution:
def fetch_and_update_api_token():
    js_url = f"{NESSUS_URL}/nessus6.js"
    try:
        response = requests.get(js_url, verify=False, timeout=60)
        if response.status_code == 200:
            pattern = (
                r"\b[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\b"
            )
            matches = re.findall(pattern, response.text)
            if matches:
                new_x_api_token = matches[0]  # Assuming the first match is the required token
                update_config(new_x_api_token=new_x_api_token)
                print("---------------------------------------")
                print(new_x_api_token)
                print("---------------------------------------")
            else:
                print("No X-Api-Token found in the JavaScript file")
        else:
            print(
                f"Failed to fetch JavaScript file: {response.status_code} - {response.text}"
            )
    except requests.RequestException as e:
        print(f"Request failed: {e}")
And with that, the script was done and working perfectly!
config.yaml file -
nessus:
  scan_id:
  access_key:
  secret_key:
  url: https://localhost:8834
  username:
  password:
  x_api_token:
  x_cookie:
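The update_config helper called inside fetch_and_update_api_token isn't shown above; a minimal hypothetical version that rewrites the relevant fields of this config.yaml could be:

import yaml

# Hypothetical sketch of update_config(): load config.yaml, update the token
# fields that changed, and write the file back.
def update_config(new_x_api_token=None, new_x_cookie=None):
    with open("config.yaml", "r") as f:
        cfg = yaml.safe_load(f)
    if new_x_api_token is not None:
        cfg["nessus"]["x_api_token"] = new_x_api_token
    if new_x_cookie is not None:
        cfg["nessus"]["x_cookie"] = new_x_cookie
    with open("config.yaml", "w") as f:
        yaml.safe_dump(cfg, f, default_flow_style=False)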
Problem 3 (the actual Goal)
The goal was to automate the weekly scan scheduling, freeing myself from manual scheduling every Sunday.
So, now, how should I schedule it?
You: Bruh! Now it's fairly simple, just use the schedule library in Python to schedule it, lol.
Me: Inefficient, as it requires the script to run 24/7, which is impractical on a VPS with potential downtime.
You: Oh! Then maybe try Task Scheduler or schtasks.exe?
Me: So did I think, but it didn't work out due to the task's fast execution (when using Task Scheduler or schtasks.exe) versus my script's longer run time.
Solution (for Problem 3): I found a freeware workaround called Task Till Dawn, which simplifies scheduling on Windows and Mac.
Just create a simple batch script -
@echo off
REM Change directory to the location of your Python script
cd /d "C:\path\to\your\script"
REM log the output to a file
python nessus_automation.py > output.log 2>&1
Drag and drop the script into Task Till Dawn and edit the task. Actions will be set automatically on the tool's home dashboard.
Add your schedule -
Save and Close
Now, your script will run automatically on schedule, even after a machine restart. The schedule will remain unaffected unless the machine is shut down during execution or the code is deleted.
Thanks for reading!
If this blog helped, give it a like! For any improvements or feedback, feel free to contact me on LinkedIn. Always open to constructive feedback!
Thanks and see you in the next blog!