# help
d
Hi all. I was wondering if anyone has any experience shipping the audit.log file to a SIEM like Wazuh? We want to decouple authorization and ship all authorization-based decisions to a central point for auditing purposes. Would love to get some feedback.
c
Hi, I haven't deployed Wazuh myself so I can't help you with the specifics. Generally, we expect the Cerbos audit log file to be read by a log aggregation agent and forwarded on to the destinations it needs to go to. I see that Wazuh has a component called Agent that fits that description. Other popular log aggregation agents include Vector, Logstash, Fluentd, DataDog, etc. Most cloud providers automatically aggregate output to stdout and stderr as well, so you might even be able to set the audit.file.path of Cerbos to stdout and let the audit logs be captured that way.
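For reference, the relevant audit block of the Cerbos config might look roughly like this. This is only a sketch based on the audit configuration docs, so double-check the field names against your Cerbos version; the file path is just an example:

audit:
  enabled: true
  accessLogsEnabled: true
  decisionLogsEnabled: true
  backend: file
  file:
    path: stdout   # or e.g. /var/log/cerbos/audit.log for a log agent to tail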
d
Great, thanks! Seems like it should be compatible. I am going to try this in the future and will update you. 🙂
So here I am again. Shipping audit logs to Wazuh was really easy: just set up a JSON decoder and a custom rule, and enable the audit.log file in the ossec.conf on the agent. One thing I was wondering: the audit.log captures the peer address, but that seems to be the IP address of the Docker container running Cerbos. Is there any way to receive the IP of the end user executing the requests?
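For anyone else wiring this up: on the agent side, the ossec.conf entry is just a localfile block that reads the audit log as JSON, something along these lines (the path is an example; point it at wherever the Cerbos container writes audit.log):

<!-- Wazuh agent: collect the Cerbos audit log as JSON -->
<localfile>
  <log_format>json</log_format>
  <location>/var/log/cerbos/audit.log</location>
</localfile>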
c
That's great to hear! How are your applications calling Cerbos? If it's possible to set the x-forwarded-for or x-forwarded-host header, the peer IP will be derived from that. Otherwise it'd be the caller IP address as reported by the TCP connection. It shouldn't be the same IP address as the Cerbos container though (unless the requests are coming from the same container).
d
The current infrastructure is as follows: Browser -> Apache -> Java application. From the Java application it goes to the Cerbos Docker container. Is it possible to pass the X-Forwarded-For header from the Java application to the Cerbos container?
c
The new version of the Java SDK (0.9.0) has a withHeaders method that you can use to forward the x-forwarded-for header, or any other header for that matter. So, if there's anything you'd like to be recorded in the audit log, you can send it as a header. By default, Cerbos doesn't log all headers because some of them contain sensitive data like access tokens. So, if you use a custom header, make sure to add it to the audit.includeMetadataKeys configuration of Cerbos (https://docs.cerbos.dev/cerbos/latest/configuration/audit)
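For example, the addition to config.yaml is just this (a sketch; see the linked audit configuration page for the full set of options):

audit:
  includeMetadataKeys:
    - x-forwarded-for   # gRPC metadata keys are lowercase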
d
Amazing! Will try in the future and give you an update. 🙂 Thanks for the fast reply.
Hi. So I configured the X-Forwarded-For header on the Java application and it now gives back the IP of the client submitting the requests. I also added 'X-Forwarded-For' in the audit configuration (includeMetadataKeys). Are there any other steps I need to configure so that the peer address uses the client IP?
c
Hi. Sorry, I probably wasn't quite clear in my previous answer. The peer address is always the address of the host that sends the request to Cerbos. It's not possible to change that with a header because it's important to know where the request was sent from. The x-forwarded-for header is the way to identify the originator of the request. I think that is the usual way of interpreting request logs in most systems.
d
Ah, I get that. So including the X-Forwarded-For header in the includeMetadataKeys adds the header to the audit log. Still, I don't seem to get it working. I added the header to the includeMetadataKeys block but it doesn't seem to show up in audit.log. I added a screenshot so you can see my config.yaml.
c
That looks correct. Could you check the Cerbos log (the stdout one, not the audit) to see if the log line contains the x-forwarded-for information?
d
I use Docker. Do you mean the logs the container produces?
c
Yes
d
Unfortunately not:
{"log.level":"info","@timestamp":"2024-01-31T09:36:14.580Z","log.logger":"cerbos.grpc","message":"Handled request","protocol":"grpc","grpc.component":"server","grpc.service":"cerbos.svc.v1.CerbosService","grpc.method":"CheckResources","grpc.method_type":"unary","cerbos":{"call_id":"01HNFE0K7H31K0K5AJHXEQPKH4"},"grpc.request.meta":{"request_id":"55b42ed1-519d-4cf0-9735-3b3b036dcfe6"},"peer.address":"dummy_ip:49988","grpc.start_time":"2024-01-31T09:36:14Z","grpc.request.deadline":"2024-01-31T09:36:15Z","grpc.code":"OK","grpc.time_ms":"2.327"}
c
Oh, my mistake! I forgot it was a gRPC request, which doesn't log the XFF. Let me just try something out and get back to you.
So, this is how my audit log looks:
{"log.logger":"cerbos.audit","log.kind":"decision","callId":"01HNFE9M9C8RCDN7S0J34ABHGD","timestamp":"2024-01-31T09:41:10.583806308Z","peer":{"address":"127.0.0.1:47050","userAgent":"grpcurl/1.8.9 grpc-go/1.57.0","forwardedFor":"xxx"}, ...
Does yours not contain peer.forwardedFor?
d
{"log.logger":"cerbos.audit","log.kind":"decision","callId":"01HNFEH9NC8M47SM05AJSNHYVM","timestamp":"2024-01-31T09:45:21.843917215Z","peer":{"address":"dummyip:dummyport","userAgent":"grpc-java-netty/1.61.0"},"checkResources":{"inputs":[{"requestId":"d5d694a9-fd54-4511-b761-90fff9fe635e","resource":{"kind":"klant","id":"_NEW_","attr":{"id":"1"}},"principal":{"id":"1","roles":["ROLE_USER"],"attr":{"rekeningen":"[\"NL00RABO00000000\"]"}},"actions":["read"],"auxData":{}}],"outputs":[{"requestId":"d5d694a9-fd54-4511-b761-90fff9fe635e","resourceId":"_NEW_","actions":{"read":{"effect":"EFFECT_ALLOW","policy":"resource.klant.vdefault"}},"effectiveDerivedRoles":["ROLE_KLANT_OWNER"]}]},"auditTrail":{"effectivePolicies":{"resource.klant.vdefault":{"attributes":{"driver":"disk","source":"klant.yaml"}}}}}
It looks like the application does not ship the header to the Cerbos container. Not sure why.
c
You're using the Java SDK, aren't you? How do you set the header? I just tried with this:
client.withHeaders(Map.of("x-forwarded-for", "xxx")).check(...)
which produced this audit log entry:
{
  "log.logger": "cerbos.audit",
  "log.kind": "access",
  "timestamp": "2024-01-31T09:58:07.178950633Z",
  "callId": "01HNFF8N2AFKQX50CWNKPZ0G1Q",
  "peer": {
    "address": "127.0.0.1:56062",
    "userAgent": "grpc-java-netty/1.61.0",
    "forwardedFor": "xxx"
  },
  "method": "/cerbos.svc.v1.CerbosService/CheckResources"
}
d
What did you define in your .check? I'm trying to configure the method the same as you. Also, I'm calling this method in my CerbosAuthorizationService.java file. This is where my global Java configuration of Cerbos is defined. Is this the correct place to call the .withHeaders method?
c
This is the full invocation:
CheckResult have =
        client.withHeaders(Map.of("x-forwarded-for", "xxx")).check(
            Principal.newInstance("john", "employee")
                .withPolicyVersion("20210210")
                .withAttribute("department", stringValue("marketing"))
                .withAttribute("geography", stringValue("GB")),
            Resource.newInstance("leave_request", "xx125")
                .withPolicyVersion("20210210")
                .withAttribute("department", stringValue("marketing"))
                .withAttribute("geography", stringValue("GB"))
                .withAttribute("owner", stringValue("john")),
            "view:public",
            "approve");
I don't know exactly how your code is structured. I would have a global Cerbos client created from configuration and then, wherever I handle the incoming requests to the app, I'd call withHeaders on the global client to obtain a request-scoped client instance with the correct headers.
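To make that pattern concrete, here is a rough sketch. The wrapper class, its method names and the target address are made up for illustration, and the SDK calls follow the 0.9.x Java SDK as far as I recall, so treat it as a starting point rather than a drop-in implementation:

import java.util.Map;

import dev.cerbos.sdk.CerbosBlockingClient;
import dev.cerbos.sdk.CerbosClientBuilder;
import dev.cerbos.sdk.CheckResult;
import dev.cerbos.sdk.builders.Principal;
import dev.cerbos.sdk.builders.Resource;

// Hypothetical wrapper: one long-lived client, request-scoped copies per call.
public class AuthorizationGateway {

    private final CerbosBlockingClient client;

    public AuthorizationGateway(String target) {
        try {
            // Global client, created once from configuration (e.g. "localhost:3593").
            this.client = new CerbosClientBuilder(target).withPlaintext().buildBlockingClient();
        } catch (Exception e) {
            throw new IllegalStateException("Failed to build Cerbos client", e);
        }
    }

    // Called per request. clientIp would come from the incoming HTTP request,
    // e.g. the X-Forwarded-For value that Apache already sets.
    public boolean isAllowed(String clientIp, String principalId, String role,
                             String resourceKind, String resourceId, String action) {
        CheckResult result = client
            .withHeaders(Map.of("x-forwarded-for", clientIp)) // request-scoped client
            .check(
                Principal.newInstance(principalId, role),
                Resource.newInstance(resourceKind, resourceId),
                action);
        return result.isAllowed(action);
    }
}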
d
Hi again Charith. First off, thank you for taking the time to reply to all my questions. I'm trying to invoke the .withHeaders method but I'm receiving a pretty weird error:
error: cannot access Metadata
cerbosClient.withHeaders(Map.of(headerKey, headerValue));
^
class file for io.grpc.Metadata not found
I tried adding gRPC to my build.gradle file, but then there seems to be a dependency conflict: Cerbos doesn't function anymore.
c
Oh, can you check ./gradlew dependencies to see whether there's another dependency that's pulling in a conflicting version of gRPC?
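If the full report is too noisy, narrowing it to one configuration and filtering for gRPC can help, e.g. something like:

./gradlew dependencies --configuration runtimeClasspath | grep -i grpc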
d
Seems to be a version issue:
> BUG! exception in phase 'semantic analysis' in source unit '_BuildScript_' Unsupported class file major version 65
c
Which version of Java are you on?
d
I use Java 11
c
The Cerbos SDK is built with Java 11 too, but I suspect some dependencies have moved to newer versions of Java (class file major version 65 corresponds to Java 21). IIRC, Java 11 is EOL now.
You probably need to use an older version that was built with Java 11.
d
So I finally got it working with the help of a developer. Thanks again, Charith.
c
Happy to help. I am glad it worked out.