Image generated by DALL-E 3 from a prompt by Bob Buzzard
Introduction
In my last blog post (A Tale of Two Contains Methods) I mentioned that I'd spent quite a bit of December taking part in
Advent of Code.
Each day there were two challenges - a (relatively) straightforward one that
could potentially be brute forced, and an extended version where brute forcing
would take days, so a more thoughtful approach was required. As I was
tackling these challenges using Apex, brute forcing wasn't really an option,
so my solution typically involved building structures of complex objects in
memory in order to be able to process them quickly. Pretty much every extended
version required batch Apex to handle the volumes, and in a few cases the
(relatively) straightforward one did too.
The combination of the complex object structure and batch Apex threw up some
interesting errors, so I decided to blog about one of these. A couple of
things to note:
- This isn't a moan about batch Apex - I was using it in a way that I'm pretty sure it wasn't intended for, and there was a simple workaround
- By complex object I just mean one that is made up of primitives, simple(r) objects and collections - it doesn't mean it was a particularly difficult structure to comprehend or change.
The Challenge
(Some of the challenge detail has been removed for clarity - you can see it in
its full glory
here)
Part 1 of the challenge in question was around bricks of varying length in a
3-dimensional structure (essentially a large cube) that had landed on top of
each other like a weird Jenga puzzle. Based on the starting coordinate and
dimensions of each brick, I needed to figure out how the bricks were supported
in the structure.
The approach I took was to represent a brick as an object and hold two
associated collections for each Brick instance:
- Supporters - these are the Bricks that are directly beneath this Brick and in contact with it.
- Supporting - these are the Bricks directly above this Brick that it is in contact with and supporting.
The answer I had to calculate to complete the challenge was the number of bricks
that I could remove without causing any other bricks to fall. This could be
accomplished by iterating the bricks and counting those where every Brick in the
Supporting collection is also supported by others - something like the sketch below.
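In outline, that count looks something like this. It's illustrative rather than my exact code - bricks is assumed to be the full list of settled Brick instances:
Integer safeToRemove = 0;
for (Brick candidate : bricks) {
    Boolean safeToPull = true;
    // removing the candidate is only safe if every Brick it supports
    // has at least one other supporter holding it up
    for (Brick above : candidate.supporting) {
        if (above.supporters.size() == 1) {
            safeToPull = false;
            break;
        }
    }
    if (safeToPull) {
        safeToRemove++;
    }
}
System.debug('Bricks that can safely be removed: ' + safeToRemove);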
Part 2 was to find, for each brick, how many other bricks would fall if it
were removed, and then sum those counts. With the structure that I had in place, this was actually quite
simple. I iterated the bricks, found all of the Supporting entries where that
brick was the only Supporter, and then found all of their Supporting entries
where they were the only Supporter and so on until I reached the end. This
would definitely need batch Apex though, as there were 1,500 bricks in the
actual challenge input.
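In sketch form, the cascade for a single removed brick looks something like the following - again illustrative rather than my exact code. It keeps a running set of fallen bricks, so a brick only falls once every one of its supporters has gone:
// count how many other bricks fall if 'removed' is taken out of the structure
public Integer countFalls(Brick removed) {
    Set<Brick> fallen = new Set<Brick>{ removed };
    List<Brick> toProcess = new List<Brick>{ removed };
    while (!toProcess.isEmpty()) {
        Brick current = toProcess.remove(0);
        for (Brick above : current.supporting) {
            if (fallen.contains(above)) {
                continue;
            }
            // a brick only falls once every one of its supporters has fallen
            Boolean stillHeldUp = false;
            for (Brick below : above.supporters) {
                if (!fallen.contains(below)) {
                    stillHeldUp = true;
                    break;
                }
            }
            if (!stillHeldUp) {
                fallen.add(above);
                toProcess.add(above);
            }
        }
    }
    // the brick that was deliberately removed doesn't count towards the answer
    return fallen.size() - 1;
}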
Each challenge includes a small example with the workings and answers - 6
bricks in this case - so I was able to test my batch Apex before
executing with the larger volume of data.
My Brick class was as follows:
public class Brick {
    public String brickNo;
    public Point3d startPoint;
    public Point3d endPoint;
    public Integer width;
    public Integer depth;
    public Integer height;
    public Set<Brick> supporters = new Set<Brick>();
    public Set<Brick> supporting = new Set<Brick>();
    public Integer totalSupporters = 0;
}
The start method of the Batch class converted the input into a collection of
Bricks and then returned a collection of Integers, one per Brick. I
implemented Database.Stateful so that the collection of Bricks was
available across each execute method, and then processed the Bricks whose
brickNo appeared in the scope. Essentially I'd broken up my iteration of the
Bricks across a number of transactions, while ensuring I only had to build the
Bricks structure once at the start.
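As a rough sketch, the shape of the batch class was along the lines of the following - buildAndSettleBricks is a placeholder name for the input parsing and settling code, and countFalls is the cascade from the earlier sketch:
public class BrickChainBatch implements Database.Batchable<Integer>, Database.Stateful {
    // built once in start() and carried into every execute() by Database.Stateful -
    // this is the state that gets de/serialised between batches
    private Map<String, Brick> bricksByNo;
    private Integer totalFalls = 0;

    public Iterable<Integer> start(Database.BatchableContext bc) {
        bricksByNo = buildAndSettleBricks(); // placeholder - parse the input and settle the bricks
        List<Integer> brickNumbers = new List<Integer>();
        for (Integer idx = 0; idx < bricksByNo.size(); idx++) {
            brickNumbers.add(idx);
        }
        return brickNumbers;
    }

    public void execute(Database.BatchableContext bc, List<Integer> scope) {
        for (Integer brickNo : scope) {
            // process only the Bricks whose brickNo appears in this chunk
            Brick brick = bricksByNo.get(String.valueOf(brickNo));
            totalFalls += countFalls(brick);
        }
    }

    public void finish(Database.BatchableContext bc) {
        System.debug('Total bricks that would fall: ' + totalFalls);
    }
}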
When I ran this with the example, it worked fine and gave me the correct
answer.
The Problem
I then fired it off with the (much larger) challenge input, and was initially
pleased to see that I was able to build the in-memory structure without
running into any issues around heap or CPU. Sadly this pleasant sensation was
short lived, as the first batch that executed failed with an internal Salesforce error.
Based on the debug that I had in the class, it was clear that the batch job
was failing before it was getting to any of my code. After some binary chop
style debugging, where I retried the batch with various parts of the code
commented out, it turned out that the issue was my collections:
public Set<Brick> supporters;
public Set<Brick> supporting;
As I already had the full collection of Bricks stored in a Map keyed by
brickNo, turning these into sets of Strings and storing the brickNo rather
than a reference to the Brick itself didn't need much in terms of changes to
the code, and allowed the batch to complete without issue.
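In sketch form the change was along these lines - field and variable names are illustrative, with bricksByNo being the stateful Map keyed by brickNo:
// the sets now hold brickNo keys rather than nested Brick references,
// so the serialised batch state stays small
public Set<String> supporterNos = new Set<String>();
public Set<String> supportingNos = new Set<String>();

// ...and wherever a related Brick is needed, it is fetched from the Map
for (String supporterNo : brick.supporterNos) {
    Brick supporter = bricksByNo.get(supporterNo);
    // work with the supporter exactly as before
}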
So why were Sets of Strings okay but Sets of Bricks not? Once I was into a
large cube with 1,500 bricks in it, it looked like the sets got pretty big. As
the Bricks were stored in an instance variable, they were part of the state of
the batch and thus de/serialised for each batch processed. Obviously I'm not
privy to exactly how the batch processing in Apex works, but I'd imagine that
serialising ended up with a pretty huge structure with a lot of repetition, as
the same Brick instances were expanded many times as part of the
Supporters and Supporting collections. Deserialising this structure clearly
proved too much, hence the internal error.
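The batch state isn't necessarily stored as plain JSON, but JSON.serialize gives a feel for the problem. In this contrived example, using the original version of the Brick class, the shared supporter is written out in full inside each Brick that references it:
Brick base = new Brick();
base.brickNo = 'A';
Brick left = new Brick();
left.brickNo = 'B';
Brick right = new Brick();
right.brickNo = 'C';
// both upper bricks reference the same supporting brick
left.supporters.add(base);
right.supporters.add(base);
// brick 'A' is expanded in full inside both 'B' and 'C' - with 1,500 bricks
// and chains of nested supporters, that repetition grows very quickly
System.debug(JSON.serialize(new List<Brick>{ left, right }));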
In Conclusion
As mentioned earlier, this isn't intended to throw shade on batch Apex.
Storing large collections of complex objects that contain collections of other
complex objects so they can be accessed across transactions really isn't a
valid use case. This kind of information belongs in the database rather than
in the batch class, while Database.Stateful is more appropriate for managing
things like running totals.
This is one of the reasons that I really enjoyed taking on Advent of Code with
Apex - I'm trying to solve problems that (a) I'd never encounter in a customer implementation and (b) the Salesforce platform is really not suited to handling.
This was also a lesson in the need to test with indicative data - everything worked fine with the small amount of test data I had available, but once I hit the real data the flaws were revealed!