
Comments (17)

janober commented on August 26, 2024

In this case you would not really need a forEach node; what you would need is more a node which splits the array data into separate entries.
The reason is that the nodes have a kind of forEach built in (at least most of them; I still have to find a proper way to make clear which ones). They execute once for every entry they receive.

So the data which gets passed through the nodes is not a single object; it is an array of objects, and each of them is an entry. If you add an expression on a node, it will always reference the data of the entry currently being processed.

I created an example below which hopefully makes it clearer how to achieve what you want to do. It calls an API which returns an array. The data then gets split into separate entries with a Function node (which simply allows you to execute any kind of JavaScript). So 1 entry goes into the node and 10 exit. The HTTP Request node which comes afterwards then makes 10 requests, and each request sends different data (it sends data to an API which simply responds with the data it receives).

So currently there is no node yet which does the splitting of the entries. I still have not finished thinking about what other functionality such a node should include and what it should be called. It will probably be a kind of Data-Transform node, which should be able to do different things like splitting out data from an array (what the Function node does in the example) and also combining multiple entries into one, in case somebody has for example 10 items and wants to send them with one call to an API.

You can simply copy the data below, click once into the n8n window and then paste it. It will then create the nodes and connections automatically.

{
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [
        250,
        300
      ]
    },
    {
      "parameters": {
        "url": "https://jsonplaceholder.typicode.com/posts?userId=1",
        "headerParametersUi": {
          "parameter": []
        }
      },
      "name": "Http Request",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 1,
      "position": [
        400,
        300
      ]
    },
    {
      "parameters": {
        "functionCode": "const newItems = [];\nfor (const item of items[0].json) {\n  newItems.push({json: item});\n}\nreturn newItems;"
      },
      "name": "Function",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        550,
        300
      ]
    },
    {
      "parameters": {
        "url": "https://postman-echo.com/get",
        "responseFormat": "string",
        "queryParametersUi": {
          "parameter": [
            {
              "name": "title",
              "value": "={{$node[\"Function\"].data[\"title\"]}}"
            }
          ]
        }
      },
      "name": "Http Request1",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 1,
      "position": [
        700,
        300
      ]
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          {
            "node": "Http Request",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Http Request": {
      "main": [
        [
          {
            "node": "Function",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function": {
      "main": [
        [
          {
            "node": "Http Request1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}
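
For readability, this is the JavaScript inside the "Function" node above; it receives the single incoming item (whose json holds the array returned by the API) and turns each array element into its own item:

// Take the array from the first (and only) incoming item ...
const newItems = [];
for (const item of items[0].json) {
  // ... and create one new item per array element.
  newItems.push({json: item});
}
return newItems;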


janober commented on August 26, 2024

Hello Thomas!

Thanks a lot for your questions/feedback!

I'm missing some documentation on the form messages take out of a node and into a node
Yes, that documentation (and a lot of other documentation) is currently still missing and will hopefully come soon. I want to offer both a written version and a basic video which goes over basics like that. Sadly my time is quite limited right now (part-time job, another startup, family with my second child born 21h ago), but I hope to be able to get such basic things up soon!

I'm puzzled by why in a function node item is an array if nodes are to treat messages item per item
Yes, a good question. The problem is that sometimes it is necessary to look at all the items at once, and sometimes a per-item basis is really enough. That is also why two different function nodes currently exist. Your use case from above would, for example, not have been possible with a function node which only works on a per-item basis. Also, if somebody wants to, for example, initialize the data and create a lot of items (like in the example below) or merge them, it has to be possible to receive and return all items at once.
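
To illustrate the difference, here is a minimal sketch (assuming the current behaviour of the two nodes): the Function node receives the whole list of items at once and can return any number of items, while the Function Item node runs once per item.

// Function node: has access to all incoming items via "items" and can return
// any number of items. Example: merge everything into a single summary item.
return [{json: {itemCount: items.length}}];

// Function Item node: runs once per incoming item and works only on that item.
// Example (hypothetical field name): add a timestamp to each item.
item.processedAt = new Date().toISOString();
return item;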

Such an application will attract users with low technical knowledge. For those users...
People without technical knowledge are for sure one of the main target audiences, but people like myself are also a main focus. n8n is not just supposed to make it possible for people who cannot program to, for example, connect different applications; it should also help technical people save a lot of time. I could for sure write some code which syncs data between two Google Spreadsheets. But by the time I have found the documentation, implemented it, written code which informs me when something goes wrong, written more code to make sure it stays up, deployed it on a server, ... it will have taken at least a day, and in the end it is still not very well tested, nor is it certain that it can handle different situations well. If I do the same with n8n, however, I really just have to create two nodes (no code and no documentation to read, assuming I know n8n), and none of the error-handling, uptime, setup and deployment work has to be done at all.
The function nodes are by far the most powerful and most complicated ones. So they are for sure not meant for people without technical knowledge, but they do not just allow technical people to do more or less anything; they can also temporarily close holes where n8n does not have a proper node yet. The great thing is also that technical people can write such nodes/workflows and then simply publish them on n8n.io (to hopefully be released very soon, similar to what I did above), and non-technical people can then paste them into their workflows without needing to really understand what the node does under the hood.
With all that said, nodes which do important things like your use case above will still come, and I am sure we will discover many more important nodes in the future; it would be great to have you on board for that. The function nodes are in this case just a "temporary solution" until the proper nodes are in place.
I am not really sure how a LoopOverJsonArrayNode would work without completely changing how n8n currently works. It could maybe work similarly to the "Split In Batches" node (which releases a new batch of items with each iteration). I pasted an example workflow at the end. (After the "Set" node an "If" node could also be added, with an expression set as its value like this: {{$node["SplitInBatches"].context["noItemsLeft"]}}. That way it would also be possible to keep on running other nodes after all the items have been processed.)

I wonder what goals are on your roadmap and what use cases this app will target
My roadmap is very long and I want to make it accessible somewhere soon, so that people who want to contribute already know what would be needed. I also want to make it possible for people to vote on what things they actually need. It ranges from integrations for different services (Twitter, Google Drive, Typeform, ...) over adding tags to workflows (to be able to organize them easily if people have a lot of them) to packaged sub-workflows that people can share, easily use and build upon.
The target audience of n8n is definitely not IoT. I think that community already has a great tool with NodeRed, with which I cannot and do not want to compete. NodeRed is meant for lower-level applications; that is why they focus on very generic nodes with which you can build everything (so they do not have a node for Pipedrive, because you can already do that with a kind of HTTP Request node). n8n sits at a higher level: even though you could theoretically build the same thing with generic nodes, it still gives you an easily usable node to achieve your goal directly.
So n8n is mainly meant as a free, more powerful, easily extendable and self-hostable alternative to Zapier/Tray.io. The audience ranges from small startups which do not want to spend a lot of money, over bigger companies which want to easily integrate their own in-house tools, up to big corporations/banks/governments which do not want (and often cannot allow) their data to sit in the cloud at some company in the USA.

The reason I'm not using NodeRed ...
Yes, that makes sense. And that is also what is important for me to have in n8n. There are already some functionalities which allow that, but it is not yet where it should be. For example, it is possible to set an "Error Workflow" which executes when another one fails (and so sends an Email/Slack message or does whatever) for basic error reporting, but other things are still missing. You can, for example, tell a node to retry 3 more times and, if it still fails, to simply proceed, but there is no way yet to capture that this happened. So there is definitely still work to do there.

I'm looking for something half-way between your app or NodeRed and Celery or Python-RQ
Honestly, I never thought about things like queueing of jobs within n8n. I thought that would happen more outside of it. That is why it is possible to start workflows directly via the command line by supplying the workflow ID or the workflow JSON itself. Input data can currently sadly only be supplied via environment variables or by reading in a file in the workflow, but it is planned to make it possible later to supply arguments as well.

I will not waste my time with Talend Open Studio (free raging here)
Interesting. I have a very long list of other players in that field but never came across this one. As it sounds like you are not a big fan of them, I guess that is maybe not so bad ;-)

I guess I would want a redis queue behind each link between nodes
So far I have tried to avoid adding outside dependencies like redis (the only one now is a database, and I would love to keep it that way to keep things simple; actually the database is even optional right now, as it simply uses SQLite if none is supplied). But not losing data is also important for me, and right now n8n is for sure not there yet. Until about 2 weeks ago everything still ran in the same Node.js process. Now it creates a new process for each workflow execution, so for now there is at least better CPU utilization and a lower risk that one workflow brings down n8n in general or that things block each other. But it now also has the disadvantage that workflows take about 1 second to start up, whereas before it was more or less instantaneous (not sure if there should always be a process on standby which can start processing directly, or something like that). I also would not mind separating that even more, to make it possible to send workflows to things like Lambda, for example.
A lot of code is still missing or not that great in the first place. The main reason is that it simply evolved over time, changed a lot, and many things turned out to be more complicated than I anticipated. It was also altogether way more work than I thought. So there is an incredible amount to improve and also a lot to rewrite at some point. But for now, I am already happy if everything works fine (even when the underlying code is not always that great).

The example workflow for batch processing:

{
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [
        250,
        300
      ]
    },
    {
      "parameters": {
        "batchSize": 2
      },
      "name": "SplitInBatches",
      "type": "n8n-nodes-base.splitInBatches",
      "typeVersion": 1,
      "position": [
        650,
        300
      ]
    },
    {
      "parameters": {
        "functionCode": "const newItems = [];\nfor (let i=0;i<10;i++) {\n  newItems.push({json: {number: i}})\n}\nreturn newItems;"
      },
      "name": "Function",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        450,
        300
      ]
    },
    {
      "parameters": {
        "values": {
          "string": [
            {
              "name": "myTest",
              "value": "=Number is: {{$node[\"SplitInBatches\"].data[\"number\"]}}"
            }
          ]
        }
      },
      "name": "Set",
      "type": "n8n-nodes-base.set",
      "typeVersion": 1,
      "position": [
        850,
        350
      ]
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          {
            "node": "Function",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "SplitInBatches": {
      "main": [
        [
          {
            "node": "Set",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function": {
      "main": [
        [
          {
            "node": "SplitInBatches",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Set": {
      "main": [
        [
          {
            "node": "SplitInBatches",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}
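
For reference, this is the code of the "Function" node in the workflow above; it creates the ten test items which the "SplitInBatches" node then releases in batches of two:

// Create 10 items with a "number" property from 0 to 9.
const newItems = [];
for (let i = 0; i < 10; i++) {
  newItems.push({json: {number: i}});
}
return newItems;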


janober commented on August 26, 2024

Thanks a lot for both!

There is nothing to thank me for. Contributors are the single most important thing for a project, so if people take the time to ask good and insightful questions, the least I can do is answer them properly.

About your use cases:

fetch Advertising stats from [Bing Ads Api, Facebook Ads Insights api, ...], normalizing data (date format, etc) and finally uploading those to Google Analytics API
That should currently be possible with the HTTP Request node. But a dedicated node for each of them would probably not be wrong in the long term.

read excel files (or csv, tsv) from a folder and import the data into a postgres database
Reading in spreadsheet files should already be possible without a problem (the Read File and Spreadsheet File nodes are already there). Writing to Postgres is definitely still missing, but such a node should not be too complicated. So if I know that it is important for you, I could have a look.

query a postgres view and generate a tsv/Excel file
The same as above: Postgres is still missing, while saving data to a spreadsheet is already there.


janober commented on August 26, 2024

I just wanted to check in to see whether it worked for you, or whether there is anything else I can help you with.


thomasleveil commented on August 26, 2024

Hi, thank you for the example, it works well.

Here are my thoughts after trying it out:

  • I'm missing some documentation on the form messages take out of a node and into a node
  • I'm puzzled by why in a function node item is an array if nodes are to treat messages item per item
  • Such an application will attract users with low technical knowledge. For those users the FunctionNode and FunctionItemNode will be the most confusing ones, and they will seek out whatever other node fits their use case before resigning themselves to using the FunctionNode. That reinforces my belief that a simple LoopOverJsonArrayNode would be useful and would broaden such an app's adoption
  • I wonder what goals are on your roadmap and what use cases this app will target (knowing that NodeRed has similar functionalities so far, but mainly targeted towards IoT's community needs)
  • The reason I'm not using NodeRed is that it lacks features for operating a workflow in production (healthchecks, tracing of actions, metrics, ...): basically, means to know whether things went ok or not, and means to retry failed actions
  • I'm looking for something half-way between your app or NodeRed and Celery or Python-RQ
  • I will not waste my time with Talend Open Studio (free raging here)
  • I guess I would want a redis queue behind each link between nodes and another one behind each node to hold messages that failed to be processed (so no message is ever lost in case of failures)

So I might not be your typical user I guess. Don't take my use cases and remarks to make your strategic decisions ;)


thomasleveil commented on August 26, 2024

Thank you for taking the time to answer my questions and congrats for your newborn! I set up Github to watch this repo for new releases and will follow your progress.

If I were to use this tool at work, the use cases would be:

  • fetch Advertising stats from [Bing Ads Api, Facebook Ads Insights api, ...], normalizing data (date format, etc) and finally uploading those to Google Analytics API
  • read excel files (or csv, tsv) from a folder and import the data into a postgres database
  • query a postgres view and generate a tsv/Excel file


janober commented on August 26, 2024

I have now created a basic Postgres node and released a new version. Below you can find a workflow with some examples:

  • Read data from Postgres, convert it to XLS and save it to disk
  • Read XLS from a file, convert it to JSON and then insert it into Postgres
  • Simple update of data

Hope that helps!

{
  "nodes": [
    {
      "parameters": {
        "functionCode": "const newItems = [];\nfor (let i=1;i < 6; i++) {\n  newItems.push({\n    json: {\n      id: i,\n      name: `New name ${i}`,\n      ean: `New EAN ${i}`,\n    }\n  });\n}\nreturn newItems;"
      },
      "name": "Function1",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        450,
        600
      ]
    },
    {
      "parameters": {
        "operation": "executeQuery",
        "query": "SELECT name, ean FROM product"
      },
      "name": "Run Query",
      "type": "n8n-nodes-base.postgres",
      "typeVersion": 1,
      "position": [
        450,
        200
      ],
      "credentials": {
        "postgres": ""
      }
    },
    {
      "parameters": {
        "operation": "update",
        "table": "product",
        "columns": "name,ean"
      },
      "name": "Update Rows",
      "type": "n8n-nodes-base.postgres",
      "typeVersion": 1,
      "position": [
        600,
        600
      ],
      "credentials": {
        "postgres": ""
      }
    },
    {
      "parameters": {
        "operation": "toFile"
      },
      "name": "Spreadsheet File",
      "type": "n8n-nodes-base.spreadsheetFile",
      "typeVersion": 1,
      "position": [
        600,
        200
      ]
    },
    {
      "parameters": {
        "fileName": "spreadsheet.xls"
      },
      "name": "Write Binary File",
      "type": "n8n-nodes-base.writeBinaryFile",
      "typeVersion": 1,
      "position": [
        750,
        200
      ]
    },
    {
      "parameters": {
        "filePath": "spreadsheet.xls"
      },
      "name": "Read Binary File",
      "type": "n8n-nodes-base.readBinaryFile",
      "typeVersion": 1,
      "position": [
        450,
        400
      ]
    },
    {
      "parameters": {},
      "name": "Spreadsheet File1",
      "type": "n8n-nodes-base.spreadsheetFile",
      "typeVersion": 1,
      "position": [
        600,
        400
      ]
    },
    {
      "parameters": {
        "table": "product",
        "columns": "name,ean"
      },
      "name": "Insert Rows1",
      "type": "n8n-nodes-base.postgres",
      "typeVersion": 1,
      "position": [
        750,
        400
      ],
      "credentials": {
        "postgres": ""
      }
    }
  ],
  "connections": {
    "Function1": {
      "main": [
        [
          {
            "node": "Update Rows",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Run Query": {
      "main": [
        [
          {
            "node": "Spreadsheet File",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Spreadsheet File": {
      "main": [
        [
          {
            "node": "Write Binary File",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Read Binary File": {
      "main": [
        [
          {
            "node": "Spreadsheet File1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Spreadsheet File1": {
      "main": [
        [
          {
            "node": "Insert Rows1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}
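
For reference, this is the code of the "Function1" node above; it generates the five example rows which the "Update Rows" node then writes to Postgres:

// Create 5 items with the id, name and ean columns used by the update.
const newItems = [];
for (let i = 1; i < 6; i++) {
  newItems.push({
    json: {
      id: i,
      name: `New name ${i}`,
      ean: `New EAN ${i}`,
    }
  });
}
return newItems;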


thomasleveil commented on August 26, 2024

Thank you for the update. I guess we should close this issue as it's getting off topic.

Maybe you could make a workflow example directory to store those handy samples 🎉


janober commented on August 26, 2024

You are welcome.

I am currently working on the n8n.io website. It will allow exactly that (at least in the first version; later it will allow more). People will be able to publish different workflows there very easily. It will then automatically link the used nodes so that users can also find out how they can be used.
I hope I can publish a first simple version soon. Sadly I did not have much time to work on it over the last few days; hopefully I will have a little more in the coming ones.


auntieyi commented on August 26, 2024

Hi @janober,
In my case, the output of the previous node is an array.
When I then use the MySQL or Postgres nodes, there is no problem if I use the INSERT or UPDATE operations.
But if I use "Execute Query", I encounter different errors when the length of the previous node's array is > 1.
I am not sure whether the problem only happens with "Execute Query" in MySQL or Postgres.


janober commented on August 26, 2024

What error do you get?
Be aware that if the length is > 1 it will execute the query for each of the items. That means if you have 10 items it will execute the query 10x. So it is quite possible that this causes the error.
If you want to execute the query only once, you probably have to use a Function node for now which returns just one item.
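
A minimal sketch of such a Function node (the field name is just an example; the point is that it returns a single item, so the "Execute Query" node which follows runs exactly once):

// Discard the incoming items and return a single item.
// Its json can carry whatever the query still needs (example field name).
return [
  {
    json: {
      curTS: Date.now()
    }
  }
];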


auntieyi commented on August 26, 2024

One example on Postgres:

Query:
INSERT INTO table_a (col1, col2) VALUES ('{{$node["Function"].data["col1"]}}', '{{$node["Function"].data["col2"]}}') ON CONFLICT (col1) DO UPDATE SET updated_at = to_timestamp({{$node["AddCurTS"].data["curTS"]}} / 1000.0);

Error if the length of the array is > 1:
function to_timestamp() does not exist


janober commented on August 26, 2024

Do all the items produce valid queries? For example, do they all contain the property "curTS"?


auntieyi commented on August 26, 2024

I think they should all have the property "curTS".
Can you help look into my workflow?

table_a in MySQL:

CREATE TABLE `table_a` (
  `id` int(10) NOT NULL AUTO_INCREMENT,
  `col1` text DEFAULT NULL,
  `created_at` timestamp NOT NULL DEFAULT current_timestamp(),
  `updated_at` timestamp NOT NULL DEFAULT current_timestamp(),
  PRIMARY KEY (`id`),
  UNIQUE KEY `uk_col1` (`col1`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

MySQL Workflow:

{
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [
        250,
        300
      ]
    },
    {
      "parameters": {
        "url": "https://jsonplaceholder.typicode.com/posts?userId=1",
        "options": {},
        "headerParametersUi": {
          "parameter": []
        }
      },
      "name": "Http Request",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 1,
      "position": [
        340,
        490
      ]
    },
    {
      "parameters": {
        "functionCode": "const newItems = [];\nconst ts = new Date($node[\"Set\"].data[\"curTS\"]).toISOString().slice(0, 19).replace('T', ' ');\nfor (const item of $node[\"Http Request\"].data) {\n  newItems.push({json: {id: item.id, title: item.title, updated_at: ts}});\n}\nreturn newItems;"
      },
      "name": "Function",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        670,
        490
      ]
    },
    {
      "parameters": {
        "operation": "executeQuery",
        "query": "=REPLACE INTO table_a (id, col1, updated_at) VALUES ({{$node[\"Function\"].data[\"id\"]}},'{{$node[\"Function\"].data[\"title\"]}}','{{$node[\"Function\"].data[\"updated_at\"]}}');"
      },
      "name": "MySQL",
      "type": "n8n-nodes-base.mySql",
      "typeVersion": 1,
      "position": [
        830,
        490
      ],
      "credentials": {
        "mySql": "MariaDB"
      }
    },
    {
      "parameters": {
        "keepOnlySet": true,
        "values": {
          "number": [
            {
              "name": "curTS",
              "value": "={{Date.now()}}"
            }
          ]
        }
      },
      "name": "Set",
      "type": "n8n-nodes-base.set",
      "typeVersion": 1,
      "position": [
        500,
        490
      ]
    },
    {
      "parameters": {
        "operation": "executeQuery",
        "query": "=SELECT * FROM table_a WHERE updated_at <= FROM_UNIXTIME(FLOOR({{$node[\"Set\"].data[\"curTS\"]}}/1000.0))"
      },
      "name": "MySQL1",
      "type": "n8n-nodes-base.mySql",
      "typeVersion": 1,
      "position": [
        980,
        490
      ],
      "credentials": {
        "mySql": "MariaDB"
      }
    }
  ],
  "connections": {
    "Http Request": {
      "main": [
        [
          {
            "node": "Set",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function": {
      "main": [
        [
          {
            "node": "MySQL",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "MySQL": {
      "main": [
        [
          {
            "node": "MySQL1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Set": {
      "main": [
        [
          {
            "node": "Function",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "active": false,
  "settings": {}
}

Error occurred:

ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '/1000.0))' at line 1


janober commented on August 26, 2024

There are two problems:

  1. As written above, it will execute the "SELECT" query of the node "MySQL1" multiple times (in this example 10x), even though you probably want it executed only once, as all 10 executions would return the same result.
  2. The node "Set" has only 1 item and you then reference it in "MySQL1", which has 10. So it will work for the first item only (as that one has a corresponding item on both nodes), but it cannot work for the others, because as soon as it gets to item number 2 there is nothing left to reference on the node "Set".

The solution is to simply reorganize the nodes. You do not, for example, need "curTS" until the last node, so let's only set it there. And let's do it with a Function node which returns only one item, as that is really all that is needed.

So it would then look like this:

{
  "nodes": [
    {
      "parameters": {
        "url": "https://jsonplaceholder.typicode.com/posts?userId=1",
        "options": {},
        "headerParametersUi": {
          "parameter": []
        }
      },
      "name": "Http Request",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 1,
      "position": [
        450,
        350
      ]
    },
    {
      "parameters": {
        "functionCode": "const newItems = [];\nconst ts = new Date($node[\"Set\"].data[\"curTS\"]).toISOString().slice(0, 19).replace('T', ' ');\nfor (const item of $node[\"Http Request\"].data) {\n  newItems.push({json: {id: item.id, title: item.title, updated_at: ts}});\n}\nreturn newItems;"
      },
      "name": "Function",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        650,
        350
      ]
    },
    {
      "parameters": {
        "operation": "executeQuery",
        "query": "=REPLACE INTO table_a (id, col1, updated_at) VALUES ({{$node[\"Function\"].data[\"id\"]}},'{{$node[\"Function\"].data[\"title\"]}}','{{$node[\"Function\"].data[\"updated_at\"]}}');"
      },
      "name": "MySQL",
      "type": "n8n-nodes-base.mySql",
      "typeVersion": 1,
      "position": [
        850,
        350
      ],
      "credentials": {
        "mySql": ""
      }
    },
    {
      "parameters": {
        "operation": "executeQuery",
        "query": "=SELECT * FROM table_a WHERE updated_at <= FROM_UNIXTIME(FLOOR({{$node[\"Function1\"].data[\"curTS\"]}}/1000.0))"
      },
      "name": "MySQL1",
      "type": "n8n-nodes-base.mySql",
      "typeVersion": 1,
      "position": [
        1250,
        350
      ],
      "credentials": {
        "mySql": ""
      }
    },
    {
      "parameters": {
        "functionCode": "return [\n  {\n    json: {\n      curTS: Date.now()\n    }\n  }\n];"
      },
      "name": "Function1",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        1050,
        350
      ]
    }
  ],
  "connections": {
    "Http Request": {
      "main": [
        [
          {
            "node": "Function",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function": {
      "main": [
        [
          {
            "node": "MySQL",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "MySQL": {
      "main": [
        [
          {
            "node": "Function1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function1": {
      "main": [
        [
          {
            "node": "MySQL1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}
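
For readability, this is the code of the new "Function1" node; it returns a single item, so the "MySQL1" query runs only once:

// Return one item carrying the current timestamp for the SELECT query.
return [
  {
    json: {
      curTS: Date.now()
    }
  }
];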

If, however, you really want to execute the last query 10x for some reason, you can also reference a fixed item specifically, which is now possible in the latest version, as discussed in the forum here:
https://community.n8n.io/t/need-items-in-expressions-got-created/605/3?u=jan
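
For example (a sketch using the $item() syntax from the linked forum post), the query expression in "MySQL1" could pin the reference to the first item like this:

=SELECT * FROM table_a WHERE updated_at <= FROM_UNIXTIME(FLOOR({{$item(0).$node["Function1"].data["curTS"]}}/1000.0))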

Btw. please always ask questions in the forum; the GitHub issues are only for reporting bugs. Thanks!


auntieyi commented on August 26, 2024

Thanks for the quick response.
The $item(0).$node["MyUpstreamNode"].data["property"] expression is what I was looking for.
I will ask questions in the forum. Thanks again.


janober commented on August 26, 2024

No problem. Happy to hear that I could help! But again, be aware that it will run the query 10x unless it gets changed like in the example I sent.
