Azure Data Factory utcNow() nuances

For those of you who work with Azure Data Factory, I thought I’d help you out with what I would consider a bug in how pipelines work. For the record, I work with pipelines almost daily, but I am generally pushing data into Microsoft SQL Server. In this specific instance, I am pushing data into Snowflake, and that data includes dates.
Microsoft SQL Server is very lenient when it comes to the format of a date it will be ingesting. It can be:
‘6/3/2025’
‘6/03/2025’
‘06/03/2025’
‘2025-6-3’
‘2025-06-3’
The list goes on. Snowflake, however, is very specific: ‘YYYY-MM-DD’.
The pipeline I was building needed to add a FILEDATE column to the .csv file it was creating for ingestion by Snowflake. I added the column to the source of the Copy data activity as follows:

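In pipeline JSON terms, the relevant part of the Copy data activity source looked roughly like this (a sketch: the source type depends on where your data comes from and is elided here; additionalColumns is the part that matters):

    "source": {
        "additionalColumns": [
            {
                "name": "FILEDATE",
                "value": {
                    "value": "@utcNow()",
                    "type": "Expression"
                }
            }
        ]
    }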
The expected result, per the documentation of the utcNow() function, would be “2025-06-03T13:11:00”.
The result provided was instead one of those lenient, locale-style datetimes from the list above. As I mentioned before, that would be an acceptable datetime in Microsoft SQL Server, so it was never an issue; on ingestion into Snowflake, it blew up. I tried the formatDateTime() function, I tried utcNow(‘yyyy-MM-ddTHH:mm:ss’) and numerous variations of that, all to no avail.
If, however, you put the same function call in a variable:

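In pipeline JSON, that means a Set variable activity along these lines (the activity name and the String variable FileDate are placeholders of mine; the variable must be declared on the pipeline):

    {
        "name": "Set FileDate",
        "type": "SetVariable",
        "typeProperties": {
            "variableName": "FileDate",
            "value": {
                "value": "@utcNow()",
                "type": "Expression"
            }
        }
    }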
You get the correct result in the expected format.
Then simply use the variable, rather than the function call, when adding the column to your source:
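Back in the Copy data activity source, the additional column then references the variable instead of calling utcNow() directly (same assumed names as above):

    "source": {
        "additionalColumns": [
            {
                "name": "FILEDATE",
                "value": {
                    "value": "@variables('FileDate')",
                    "type": "Expression"
                }
            }
        ]
    }

Because the variable’s value arrives as a plain string, it passes through to the .csv file exactly as formatted, which is presumably why Snowflake is happy with it.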