Comments (24)
I guess – perhaps call it 2.0 and make a big warning.
Sounds good with the webinar! It would be good to have a release out that fixes these usability concerns.
from pq.
I think the documentation here is lacking.
You're supposed to use the queue instance as a context manager and thereby obtain a cursor:
    with some_queue as cursor:
        do_stuff(some_queue)
        do_more_stuff(cursor)
When you exit this block the transaction is committed.
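That commit-on-exit behavior can be sketched generically. This is a stand-in class, not pq's actual implementation, with sqlite3 standing in for psycopg2 and a bare table for pq's schema:

```python
import sqlite3

class Queue:
    """Stand-in illustrating the documented protocol (not pq's real
    code): entering yields a cursor, and a clean exit commits."""

    def __init__(self, name, conn):
        self.name = name
        self.conn = conn

    def __enter__(self):
        self.cursor = self.conn.cursor()
        return self.cursor

    def __exit__(self, exc_type, exc, tb):
        self.cursor.close()
        if exc_type is None:
            self.conn.commit()  # the transaction is committed on exit

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE queue (data TEXT)")

some_queue = Queue("test", conn)
with some_queue as cursor:
    cursor.execute("INSERT INTO queue VALUES ('item')")
# leaving the block committed the insert
```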
Alternatively, you can instantiate the Queue with an active cursor by passing the cursor keyword argument:

    queue = Queue("test", cursor=cursor)
I'll try and fix up the documentation about this. Thanks for reporting.
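To see why the cursor keyword matters: put then executes on the caller's cursor and nothing is committed by the queue, so the enqueue succeeds or fails together with the surrounding transaction. A stand-in sketch (sqlite3 and a bare INSERT in place of Postgres and pq's real Queue):

```python
import sqlite3

class Queue:
    """Stand-in for pq's Queue(name, cursor=...): every statement runs
    on the caller's cursor and nothing is committed here."""

    def __init__(self, name, cursor=None):
        self.name = name
        self.cursor = cursor

    def put(self, data):
        self.cursor.execute(
            "INSERT INTO queue (name, data) VALUES (?, ?)",
            (self.name, str(data)),
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE queue (name TEXT, data TEXT)")

cursor = conn.cursor()
queue = Queue("test", cursor=cursor)
queue.put({"to": "user@example.org"})

# The caller owns the transaction: rolling back discards the item too.
conn.rollback()
```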
So you're saying the wildly valuable feature I need is not provided?
Can you see the value of using pq with other applications?
Honestly, the main point is to integrate with existing Postgres transactions. Otherwise, why not just use celery?
Imagine Plone using RelStorage and adding an item to a queue as part of a larger transaction.
OK, so it sounds like maybe it is a feature. :)
So, may I suggest a simpler approach: just let put work outside a queue-provided context manager? That would be consistent with the current documentation, but would be backward incompatible.
Alternatively, add a more explicit flag like no_commit, or an alternative method for adding to a queue using an existing transaction.
FWIW, I see pq being used in a much more lightweight way than maybe you imagine.
At least in web apps, connections are managed in pools and so application code is given them via some application mechanism. So, in that context, apps aren't going to keep queues around for any length of time, because the connections are going to be reused for other things.
Perhaps there should just be a module-level function:
    def enqueue(connection, queue_name, **data):
        ...
This would:
- Create a cursor for its use and toss it.
- Do no transaction management.
I'd be happy to implement this on top of the existing machinery and update the README in a PR if you like.
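The proposed helper might look like the following sketch. This is not pq's API; sqlite3 and a plain INSERT stand in here for Postgres and pq's real machinery:

```python
import json
import sqlite3

def enqueue(connection, queue_name, **data):
    """Proposed module-level helper: create a throwaway cursor, insert
    the item, and do no transaction management at all."""
    cursor = connection.cursor()
    try:
        cursor.execute(
            "INSERT INTO queue (name, data) VALUES (?, ?)",
            (queue_name, json.dumps(data)),
        )
    finally:
        cursor.close()  # toss the cursor; committing is the caller's job

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE queue (name TEXT, data TEXT)")

enqueue(conn, "email", to="user@example.org", subject="hello")
conn.commit()  # the caller owns the transaction
```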
Here's a snippet of code I'm using to queue a task in a Flask app that uses SQLAlchemy:

    conn = flask.g.session.connection().connection
    cursor = conn.cursor()
    pq.PQ(conn, cursor=cursor)['email'].put(dict(to=to, **kw))
    cursor.close()
This illustrates the fleeting nature of connections and how heavy the current API is for this case.
This would be better:
    pq.enqueue(flask.g.session.connection().connection, 'email', to=to, **kw)
:)
But I can live with what you suggested, especially if it gets documented.
I suspect that when you document it, you may decide you want something else. That's what documenting things tends to do for me.
@jimfulton I'm not sure I follow your concerns.
Keeping the queue connection separate from your app's connection pool is how things should work.
Consider the situation where you need to deploy a connection pooler (which is common for large-scale deployments). Your queue will require session pooling while generally your app will require cheap transaction pooling.
Hope this makes sense :)
I think I may have misunderstood something if you got the feeling that I don't think it's valuable to use PQ as part of a bigger picture.
I think it definitely is valuable to simply use PQ in a subtransaction, and you can do this using the cursor argument.
I think module-level functions can be a good thing to make it clear that you don't need a stateful queue object.
I believe this is how you can write it today:
    conn = flask.g.session.connection().connection
    queue = pq.PQ(conn)['email']

    with queue:
        queue.put(dict(to=to, **kw))
But this would be better:
    conn = flask.g.session.connection().connection

    with pq.PQ(conn)['email'] as queue:
        queue.put(dict(to=to, **kw))
It's awkward that the context manager returns a cursor object. It really should return a QueueContext – meaning a queue with a cursor that has an open savepoint.
I think you're right that the current enter/exit behavior is just weird.
I propose this:
- If used as a context manager, you'll get SAVEPOINT/ROLLBACK functionality (and reuse of a cursor object).
- If used directly, you'll get a new cursor for each operation.
But in both cases, no transaction will happen!
Perhaps unless you ask for it explicitly:
    with queue as q:
        q.transaction = True
        q.put({...})
Note that enter/exit would return a new object in any case (unlike today).
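The SAVEPOINT/ROLLBACK behavior proposed here can be illustrated outside pq. sqlite3 accepts the same SAVEPOINT statements as Postgres, so it stands in for psycopg2 below; the table and savepoint names are made up for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE queue (data TEXT)")

cur = conn.cursor()
cur.execute("INSERT INTO queue VALUES ('kept')")

# What __enter__ would do: open a savepoint so the block can be
# undone without touching the surrounding transaction.
cur.execute("SAVEPOINT q")
cur.execute("INSERT INTO queue VALUES ('discarded')")

# What __exit__ would do on failure: roll back only to the savepoint.
cur.execute("ROLLBACK TO SAVEPOINT q")
cur.execute("RELEASE SAVEPOINT q")

conn.commit()
rows = [row[0] for row in conn.execute("SELECT data FROM queue")]
```

Only the insert made before the savepoint survives; the surrounding transaction commits normally.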
Well, I defer, but my original suggestion seems similar and would be consistent with the docs:

- The context manager manages transactions.
- Use outside the context manager doesn't manage transactions (or uses a subtransaction).

This seems less likely to be backward incompatible. <shrug>
I have been trying to rework some of this logic but got held up by both psycopg2 and psycopg2cffi segfaulting.
Yeah I got one fix merged: chtd/psycopg2cffi#79.
I evidently have a bug in my code that exposes these library issues, because fixing the above just led to other problems – even a segfault in the "standard" psycopg2 library, which is weirder still.
I made an attempt to rectify this but ran into various issues and am, unfortunately, out of time for the time being.
I think just documenting the use of the cursor argument to use an existing transaction would be a step forward. Your telling me about that unblocked me.