Add simplified integration framework #42450
Conversation
This pull request does not have a backport label.
To fixup this pull request, you need to add the backport labels for the needed
Force-pushed from c9bcb84 to efd3b0d.
Force-pushed from dafbfd4 to 125a560.
Pinging @elastic/elastic-agent-data-plane (Team:Elastic-Agent-Data-Plane)
Overall it's a very nice initiative. The only thing I'm missing is a README/documentation about which framework to use and the features this one has.
You wrote a very detailed PR description with examples, etc. That could become the README. What do you think?
// NewTest creates a new integration test for Filebeat.
func NewTest(t *testing.T, opts TestOptions) Test {
	return &test{
		BeatTest: integration.NewBeatTest(t, integration.BeatTestOptions{
			Beatname: "filebeat",
			Config:   opts.Config,
			Args:     opts.Args,
		}),
	}
}
[Suggestion]
Either rename `NewTest` to something like `NewFilebeatTest`, or add `Beatname` as an argument, so it's clear whether this function is specific to a Filebeat test or generic.
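For illustration only, a sketch of the second option (not code from the PR, just the diff above with the Beat name made a parameter):

```go
// Sketch of the suggestion: take the Beat name explicitly so the helper is
// not hard-coded to Filebeat. Purely illustrative, not part of this PR.
func NewTest(t *testing.T, beatname string, opts TestOptions) Test {
	return &test{
		BeatTest: integration.NewBeatTest(t, integration.BeatTestOptions{
			Beatname: beatname,
			Config:   opts.Config,
			Args:     opts.Args,
		}),
	}
}
```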
Since the full import path is already `filebeat/testing/integration`, it does not seem necessary to repeat that in the function name.
cfgPath := filepath.Join(t.TempDir(), fmt.Sprintf("%s.yml", opts.Beatname))
homePath := filepath.Join(t.TempDir(), "home")
It's not a big deal, but each call to `t.TempDir` returns a different directory. Why not use a single directory?
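For example, a minimal sketch of that suggestion, reusing the variable names from the diff above:

```go
// Use one temporary directory for both the config file and the home path.
tempDir := t.TempDir()
cfgPath := filepath.Join(tempDir, fmt.Sprintf("%s.yml", opts.Beatname))
homePath := filepath.Join(tempDir, "home")
```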
On a different topic: `t.TempDir` is very convenient, but when debugging failing (or flaky) tests I often find myself needing to look at the files the test is using. That's why our other integration test framework keeps the test folder on failure.
What do you think about implementing similar logic here?
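A rough sketch of how such logic could look, assuming a manually managed directory instead of `t.TempDir` (names and details are illustrative, not from the PR):

```go
// newTestDir creates a temporary directory that is removed on success but
// kept for manual inspection when the test fails. Illustrative only.
func newTestDir(t *testing.T) string {
	dir, err := os.MkdirTemp("", "beat-integration-test-")
	if err != nil {
		t.Fatalf("cannot create test directory: %v", err)
	}
	t.Cleanup(func() {
		if t.Failed() {
			t.Logf("test failed, keeping test files in %s", dir)
			return
		}
		_ = os.RemoveAll(dir)
	})
	return dir
}
```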
I can't come up with a test case where you would need to look at the configuration file you generated or at the registry file. Logging goes to stderr instead.
If the test fails, it prints the last N lines of the output; N is configurable.
Everything should be observed through the output of the Beat. That's my idea of integration tests: we should test everything the way our customers run it. For example, if you want to check the stored offset in the state, it should be tested through ingested lines, not by inspecting the registry. The registry implementation should be covered by its own unit tests instead.
Good catch with the temporary directory, though; I was not aware it returned a different directory every time.
> I can't come up with a test case where you would need to look at the configuration file you generated or at the registry file.

It's not about a test case, it's about us (human engineers) debugging or writing a test: something is not working as expected and we need to investigate manually rather than just running the tests.
At least for me, this is a very common workflow.
Anyway, it does not need to be implemented or discussed in depth on this PR. We can leave it as it is and discuss/iterate on it later.

> Everything should be observed through the output of the Beat. That's my idea of integration tests: we should test everything the way our customers run it.

I agree!

> For example, if you want to check the stored offset in the state, it should be tested through ingested lines, not by inspecting the registry. The registry implementation should be covered by its own unit tests instead.

That I disagree with, because having every possible state we need to test appear in the logs can be very verbose and heavy. Anyway, that's not a discussion for now.
It can be used for running Beats binaries in integration tests. This framework has a simplified API compared to the existing one and uses a more efficient way to search for logs in the output of the command.
Force-pushed from 125a560 to 7ec435f.
@belimawr I've added a README and addressed your feedback.
Force-pushed from 31bb31d to bb39651.
@AndersonQ suggested a good idea to always rebuild the binary when we run the tests.
Add simplified integration framework. It can be used for running Beats binaries in integration tests. This framework has a simplified API compared to the existing one and uses a more efficient way to search for logs in the output of the command. (cherry picked from commit a6ab04f)
Add simplified integration framework. It can be used for running Beats binaries in integration tests. This framework has a simplified API compared to the existing one and uses a more efficient way to search for logs in the output of the command. (cherry picked from commit a6ab04f) Co-authored-by: Denis <[email protected]>
It can be used for running Beats binaries in integration tests.
This framework has a simplified API compared to the existing one and uses a more efficient way to search for logs in the output of the command.
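Purely as an illustration of what searching for logs in the output stream (rather than repeatedly re-reading a log file) can look like, here is a minimal sketch; it is not the PR's actual implementation and only uses `bufio`, `io`, and `strings` from the standard library:

```go
// watchOutput scans the Beat's output line by line, exactly once, and records
// which of the expected substrings were seen. Illustrative sketch only.
func watchOutput(r io.Reader, expected []string) map[string]bool {
	seen := make(map[string]bool, len(expected))
	scanner := bufio.NewScanner(r)
	for scanner.Scan() {
		line := scanner.Text()
		for _, want := range expected {
			if !seen[want] && strings.Contains(line, want) {
				seen[want] = true
			}
		}
	}
	return seen
}
```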
Checklist
- [ ] I have made corresponding changes to the documentation
- [ ] I have made corresponding changes to the default configuration files
- [ ] I have added an entry in `CHANGELOG.next.asciidoc` or `CHANGELOG-developer.next.asciidoc`.

Motivation
Recently I needed to write a few integration tests that run the Filebeat binary, and I found the existing integration framework very cumbersome. It requires a lot of boilerplate code, and the purpose of any framework is to eliminate that.
Since we're going to eventually migrate all the integration tests written in Python, we need a very simple tool for writing integration tests that covers most of the test cases.
Both frameworks (old and new) can co-exist for now.
Pros
This design has a few advantages over the previous one:
For example, a test similar to the one written in the older framework:
https://github.com/elastic/beats/blob/main/filebeat/tests/integration/filebeat_test.go#L37-L94
can be replaced with this:
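(The original snippet is not reproduced here; the following is only a rough sketch of the shape such a test could take, built around the `NewTest` helper shown in the diff above. The import path and everything beyond `NewTest`/`TestOptions` are assumptions.)

```go
package integration_test

import (
	"fmt"
	"os"
	"path/filepath"
	"testing"

	// Import path assumed from the package name mentioned in the review above.
	"github.com/elastic/beats/v7/filebeat/testing/integration"
)

func TestFilebeatIngestsLogFile(t *testing.T) {
	// Write a small log file for Filebeat to read.
	logFilePath := filepath.Join(t.TempDir(), "test.log")
	if err := os.WriteFile(logFilePath, []byte("line 1\nline 2\n"), 0o644); err != nil {
		t.Fatal(err)
	}

	config := fmt.Sprintf(`
filebeat.inputs:
  - type: filestream
    id: test-input
    paths:
      - %s
`, logFilePath)

	// NewTest and TestOptions come from this PR (see the diff above). The
	// returned Test exposes the run/expectation helpers added elsewhere in the
	// PR, e.g. waiting for an end-of-file message for logFilePath in the
	// Beat's output; they are not reproduced in this sketch.
	integration.NewTest(t, integration.TestOptions{
		Config: config,
	})
}
```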
Additionally, it includes validation that:
Another example, this time we expect the Beat to crash:
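(Again, the original snippet is not reproduced; this is only a rough sketch under the same assumptions as the example above.)

```go
func TestFilebeatCrashesOnBrokenConfig(t *testing.T) {
	// An intentionally invalid configuration, so the Beat is expected to exit
	// with an error instead of starting up.
	config := `
filebeat.inputs:
  - type: does-not-exist
`

	// Asserting on the failure (e.g. expecting a specific error message in the
	// output and a non-zero exit code) would use the expectation helpers added
	// elsewhere in this PR; they are not reproduced in this sketch.
	integration.NewTest(t, integration.TestOptions{
		Config: config,
	})
}
```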
Cons
Current functionality
Basic Assertions
Filebeat-specific Assertions
Reporting
Config