
fairness-in-ai's Introduction

👋 Hi, I'm Deepak John Reji

  • 👀 I'm interested in NLP and work in the social and environmental domains
  • 🌱 I'm currently learning topics in Sociology and Environmental Sustainability
  • 💞️ I'm an open-source contributor and I share my models on Hugging Face and PyPI. Recently I have been researching the topics "Immigration & Integration", "Bias & Fairness in AI Models", and "Environmental Remediation"
  • 🎁 Download my Python packages and give me feedback
  • 🎥 Watch my YouTube channel to learn more about me
  • 📫 Reach out to me via LinkedIn

My Repositories:

dreji18 | PyPI · dreji18 | Hugging Face


Connect with me:

GitHub: https://github.com/dreji18 · dreji18 | LinkedIn · dreji18 | Twitter · dreji18 | Instagram · dreji18 | Facebook


fairness-in-ai's People

Contributors

dreji18, shainaraza, sugatoray

fairness-in-ai's Issues

Question

Thanks for this great work and library. I will dive deeper later, but I wanted to ask the authors about the pros and cons versus, e.g., IBM AI Fairness 360?

Not able to get a prediction for the text below:

'1. Field of the Invention\nThis invention relates to the preparation of amides of fatty acids having from about 8 to about 20 carbon atoms. In particular, the invention relates to a process for producing the foregoing amides from esters by reaction thereof with amine reactant in an anhydrous system.\n2. Description of the Prior Art\nThe preparation of fatty acid amides from fatty acid esters can be accomplished by several processes known in the prior art. In the process of U.S. Pat. No. 3,253,006, the reaction is performed in the presence of what is described as a highly critical amount of water and under high pressures of above 1000 psig. Unfortunately the high pressures used necessitate expensive equipment capable of withstanding high pressure operation and the presence of water produces a very corrosive system requiring special materials of construction.\nIn other prior art the use of solvents other than water is disclosed. For example, U.S. Pat. No. 2,464,094 discloses the use of alcohol solvents such as methanol fed to the reaction system. Although the patent does discuss the subsequent removal of methanol, it does not suggest removal to the extent or in the manner disclosed herein. The problem of slow reaction rate in amidation of esters is evident in the prior art search for catalysts as disclosed, for example, in the main force of U.S. Pat. No. 2,464,094. Another patent dealing with solvents deliberately or fortuitously present is U.S. Pat. No. 2,504,427. Although this patent speaks of distilling off the by-product water or alcohol or using complexing agents, such is not undertaken until after the reaction is terminated.\nIn some instances, the use of catalysts such as salts or alkali metals is regarded as very much undesired. Not only is this an item of expense but also there is the problem of removal of the catalyst after its presence is no longer desired. A process that can be enhanced with catalyst yet which can be performed satisfactorily without catalyst can be useful in various ways.\nOther prior art includes processes in which operation is at low pressures and in the absence of water; however, as discussed in the aforementioned U.S. Pat. No. 3,253,006, the prior art operations under anhydrous conditions have been characteristically slow requiring reaction times of as much as several days. Such long reaction times are undesired for obvious reasons because of the adverse effect thereof upon the ability to produce amides at low cost.\nIt is accordingly an object of the present invention to provide a process for producing amides which does not require either high pressure of operation or catalysts.\nAnother object of the present invention is to provide a process for producing amides using anhydrous conditions. Another object of the present invention is to provide a process for producing amides that does not require solvents.\nAnother object of the present invention is to provide a process for producing amides by reaction of ester and amine reactant wherein high reaction rate is obtained in an anhydrous system at low pressure and in which amine reactant is used as a stripping agent to remove reaction by-products.\nAnother object of the present invention is to provide a process for producing amides from esters of fatty acids and an amine reactant wherein alcohol liberated from the esters in the course of the reaction is removed from the system by stripping with excess amine reactant.'

I am trying to get the bias score for the above text, and the library either breaks or fails to give the desired results.

```python
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification
from transformers import pipeline

tokenizer = AutoTokenizer.from_pretrained("d4data/bias-detection-model")
model = TFAutoModelForSequenceClassification.from_pretrained("d4data/bias-detection-model")

classifier = pipeline('text-classification', model=model, tokenizer=tokenizer)  # cuda = 0,1 based on gpu availability
classifier(data['text'][1181])
```

Loading the model prints the usual warning:

```
Some layers from the model checkpoint at d4data/bias-detection-model were not used when initializing TFDistilBertForSequenceClassification: ['dropout_19']
- This IS expected if you are initializing TFDistilBertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing TFDistilBertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some layers of TFDistilBertForSequenceClassification were not initialized from the model checkpoint at d4data/bias-detection-model and are newly initialized: ['dropout_59']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
```

and the classifier call fails with:

```
Traceback (most recent call last):
  File "", line 2, in <module>
    classifier(data['text'][1181])
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\pipelines\text_classification.py", line 65, in __call__
    outputs = super().__call__(*args, **kwargs)
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\pipelines\base.py", line 676, in __call__
    return self._forward(inputs)
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\pipelines\base.py", line 693, in _forward
    predictions = self.model(inputs.data, training=False)[0]
  File "C:\Users\Rajneesh Jha\anaconda3\lib\site-packages\keras\utils\traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\models\distilbert\modeling_tf_distilbert.py", line 800, in call
    distilbert_output = self.distilbert(
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\models\distilbert\modeling_tf_distilbert.py", line 415, in call
    embedding_output = self.embeddings(
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\models\distilbert\modeling_tf_distilbert.py", line 119, in call
    position_embeds = tf.gather(params=self.position_embeddings, indices=position_ids)
InvalidArgumentError: Exception encountered when calling layer "embeddings" (type TFEmbeddings).

{{function_node __wrapped__ResourceGather_device_/job:localhost/replica:0/task:0/device:CPU:0}} indices[0,529] = 529 is not in [0, 512) [Op:ResourceGather]

Call arguments received by layer "embeddings" (type TFEmbeddings):
  • input_ids=tf.Tensor(shape=(1, 732), dtype=int32)
  • position_ids=None
  • inputs_embeds=None
  • training=False
```
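
A note on the error above: the failure comes from input length, not from the model weights. The tokenized patent text is 732 tokens (see the input_ids shape in the traceback), while the DistilBERT-based checkpoint only has position embeddings for 512 tokens, so the gather at position 529 falls outside [0, 512). Below is a minimal workaround sketch, assuming the same d4data/bias-detection-model checkpoint as in the report; long_text is a placeholder for the patent text, and the kwarg-forwarding in option 1 assumes a reasonably recent transformers release.

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification, pipeline

MODEL_ID = "d4data/bias-detection-model"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = TFAutoModelForSequenceClassification.from_pretrained(MODEL_ID)

long_text = "1. Field of the Invention ..."  # placeholder for the ~732-token patent text from the issue

# Option 1: let the pipeline truncate the input to the model's 512-token window.
# Recent transformers releases forward tokenizer kwargs such as truncation/max_length
# from the pipeline call through to the tokenizer.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier(long_text, truncation=True, max_length=512))

# Option 2: tokenize explicitly with truncation and run the model directly,
# which does not depend on the pipeline forwarding tokenizer kwargs.
inputs = tokenizer(long_text, truncation=True, max_length=512, return_tensors="tf")
probs = tf.nn.softmax(model(**inputs).logits, axis=-1).numpy()[0]
print({model.config.id2label[i]: float(p) for i, p in enumerate(probs)})
```

Truncation only scores the first 512 tokens. For full coverage of longer documents, one would typically split the text into chunks of at most 512 tokens (optionally overlapping), score each chunk, and aggregate the results, e.g. by averaging or taking the maximum bias score.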
