spot-dev mailing list archives

From Dimitris Papadopoulos <>
Subject Captured netflow v9 fields/tags different from those on public datasets
Date Sat, 08 Jul 2017 23:30:05 GMT
Hi all,

I'm posting this here in case it's more visible than in the general Slack channel.

We have installed Spot on a testbed (Ubuntu 14.04, CDH 5.11) and are trying to
simulate a DDoS attack in order to test the platform's detection capabilities.

We are using a DDoS simulation tool to attack one of our websites while
capturing netflow traffic (nfcapd), which should normally be ingested and
passed to HDFS and to Hive tables.
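For reference, the capture-and-export pipeline we are running looks roughly like the following sketch (the port, directory, and file names are placeholders, not our actual configuration):

```shell
# Start the netflow collector as a daemon: listen on UDP port 2055 and
# rotate capture files into /var/flows (both values are placeholders).
nfcapd -D -p 2055 -l /var/flows

# After a rotation interval, export one capture file to CSV for Spot's
# flow worker to pick up (the capture file name is a placeholder).
nfdump -r /var/flows/nfcapd.201707082330 -o csv > flows.csv
```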

Unfortunately, when the flow worker tries to convert the nfdump output to
.csv, it fails, probably because the netflow fields in our captured traffic
differ from those the platform expects.

More specifically, our *nfdump -r -o csv* command outputs files with the
following headers:

while the public AWS datasets that Spot works with output just the
following headers:

I would like to know the suggested procedure for capturing netflow traffic
in the correct format, as it seems that a simple nfcapd command is not
sufficient. My colleague is getting the .nfcapd files from a pfSense
firewall and seems to have matched the correct format (although some issues
with the record timestamps have emerged: 1/1/1970, i.e. the Unix epoch, is
displayed, probably due to null values).
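If the default csv output simply emits more fields than Spot expects, one thing we could try is nfdump's custom output format to restrict the export to a fixed column set, along these lines (the field tokens are standard nfdump format specifiers, but the particular subset below is only a guess, not the confirmed Spot schema):

```shell
# Custom output: start time, duration, src/dst address and port,
# protocol, TCP flags, packets, and bytes. The chosen subset is an
# assumption about Spot's expected columns; the file name is a placeholder.
nfdump -r /var/flows/nfcapd.201707082330 \
  -o "fmt:%ts,%td,%sa,%da,%sp,%dp,%pr,%flg,%pkt,%byt" > flows.csv
```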

I would really appreciate your help, either by replying to this mail, or
via Slack.

Best Regards,
