MLP Regressor: ValueError: shapes (110,4) and (2,110) not aligned: 4 (dim 1) != 2 (dim 0) #1460

Open · raul-parada opened this issue Nov 26, 2023 · 4 comments

@raul-parada

I'm trying to use the MLPRegressor model to learn batches of dataframes (learn_many, predict_many). The number of rows varies between batches, but x always has two feature columns and the target y is a single column.
If I configure the model as:

model = (
    pp.StandardScaler() |
    nn.MLPRegressor(
        hidden_dims=(110,),
        activations=(
            nn.activations.ReLU,
            nn.activations.ReLU,
            nn.activations.ReLU
        ),
        optimizer=optim.SGD(1e-4),
        seed=42
    )
)

I got this error:

File "<__array_function__ internals>", line 200, in dot
ValueError: shapes (110,4) and (2,110) not aligned: 4 (dim 1) != 2 (dim 0)
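
Reading the two operands of the failing dot product (a guess at the layout, not a confirmed diagnosis of river's internals): (110, 4) looks like a first-layer weight matrix sized for four input features, while (2, 110) looks like a transposed batch of 110 rows with two feature columns. If that reading is right, the network was at some point initialized against a different set of columns than the batch being passed in. The mismatch itself is easy to reproduce with dummy arrays:

import numpy as np

# Dummy arrays with the shapes from the traceback (hypothetical reading:
# W = first-layer weights for 4 features, Xt = a transposed 110-row batch
# with 2 features).
W = np.zeros((110, 4))
Xt = np.zeros((2, 110))
try:
    np.dot(W, Xt)
except ValueError as e:
    print(e)  # shapes (110,4) and (2,110) not aligned: 4 (dim 1) != 2 (dim 0)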

How should I modify the above model to suit my data? Should I put it inside a loop to control the number of rows each time?
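
For reference, the MLPRegressor example in the river docs pairs one hidden layer with three activations and uses Identity on the output layer for regression. Below is a minimal, shape-consistent sketch along those lines, using the two feature columns from the data further down and placeholder target values; it is a sketch of the intended usage, not a confirmed fix for the error above:

import pandas as pd
from river import neural_net as nn
from river import optim
from river import preprocessing as pp

model = (
    pp.StandardScaler() |
    nn.MLPRegressor(
        hidden_dims=(110,),
        activations=(
            nn.activations.ReLU,
            nn.activations.ReLU,
            nn.activations.Identity,  # Identity output layer, as in the docs regression example
        ),
        optimizer=optim.SGD(1e-4),
        seed=42,
    )
)

# Two feature columns throughout; the row count may differ between batches.
X = pd.DataFrame({
    "latitude(m)": [41.390, 41.389, 41.391],
    "longitude(m)": [2.1645, 2.1659, 2.1642],
})
y = pd.Series([1.0, 2.0, 3.0], index=X.index)  # placeholder targets
model.learn_many(X, y)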

@MaxHalford (Member) commented Nov 26, 2023

Hey there. We'll need a minimal reproducible example to debug this.

@raul-parada (Author)

Here is the model.learn_many(x, y) call, with the data below:
x = {'latitude(m)': {2687426: 41.39042133911506, 2687427: 41.38954084417816, 2687428: 41.39004263366844, 2687429: 41.39026115897981, 2687430: 41.3903534209147, 2687431: 41.39168720075243, 2687432: 41.389906005812335, 2687433: 41.3896067998804, 2687434: 41.38962954069514, 2687435: 41.38966463228027, 2687436: 41.38997196128503, 2687437: 41.38970996702829, 2687438: 41.389857918626056, 2687439: 41.39035987920118, 2687440: 41.39059462971834, 2687441: 41.38980863933558, 2687442: 41.39039964788776, 2687443: 41.38975909163339, 2687444: 41.39020360204856, 2687445: 41.389946461758434, 2687446: 41.39120275118262, 2687447: 41.39005686132916, 2687448: 41.38984822987522, 2687449: 41.39114608699223, 2687450: 41.39081869581249, 2687451: 41.390114637310354, 2687452: 41.3896941581337, 2687453: 41.39140800627263, 2687454: 41.39083467671437, 2687455: 41.39183585479083, 2687456: 41.39045462873719, 2687457: 41.39069555407563, 2687458: 41.390779516537926, 2687459: 41.39050044746961, 2687460: 41.39062791908504, 2687461: 41.39013500822785, 2687462: 41.39084700910927, 2687463: 41.39033056040013, 2687464: 41.39130939788125, 2687465: 41.39042995908144, 2687466: 41.38992355946002, 2687467: 41.39072435675788, 2687468: 41.390874790073525, 2687469: 41.38940380953638, 2687470: 41.390639313834114, 2687471: 41.39017354847877, 2687472: 41.389994704308, 2687473: 41.39127694632597, 2687474: 41.39049730502261, 2687475: 41.3904772642558, 2687476: 41.38974230585069, 2687477: 41.39076890573515, 2687478: 41.39048219358096, 2687479: 41.38957127695049, 2687480: 41.39024996924423, 2687481: 41.3905056274347, 2687482: 41.39131419104382, 2687483: 41.3897904535093, 2687484: 41.39054944857269, 2687485: 41.39014415611732, 2687486: 41.39029699654753, 2687487: 41.38985319841511, 2687488: 41.39027932244997, 2687489: 41.39000063648752, 2687490: 41.39090635134317, 2687491: 41.39012566949913, 2687492: 41.39045088858986, 2687493: 41.39193497121499, 2687494: 41.38991058086917, 2687495: 41.38996378190495, 2687496: 41.38955852648304, 2687497: 41.39093580497119, 2687498: 41.391344362822366, 2687499: 41.3896609948518, 2687500: 41.39148154265635, 2687501: 41.39057872597164, 2687502: 41.38949408892791, 2687503: 41.39088983692174, 2687504: 41.38951031516973, 2687505: 41.39063055699967, 2687506: 41.39125762058165, 2687507: 41.39066176128346, 2687508: 41.391454391375255, 2687509: 41.39240696408756, 2687510: 41.39064978508938, 2687511: 41.391251168564594, 2687512: 41.39058486834821, 2687513: 41.39021028676462, 2687514: 41.39072150750746, 2687515: 41.38948338317169, 2687516: 41.38962991360429, 2687517: 41.39069871702501, 2687518: 41.3900297569778, 2687519: 41.39079750542168, 2687520: 41.39074822072524, 2687521: 41.389761936109146, 2687522: 41.391325253181954, 2687523: 41.39039120218556, 2687524: 41.38978565904199, 2687525: 41.390070230226335, 2687526: 41.38978120289239, 2687527: 41.392132444031, 2687528: 41.389824580179415, 2687529: 41.39136679996845, 2687530: 41.38982296365283, 2687531: 41.39286353759682, 2687532: 41.39139614270374, 2687533: 41.38926616242303, 2687534: 41.38844039121088, 2687535: 41.39310210021111}, 'longitude(m)': {2687426: 2.1645387830545704, 2687427: 2.165901386822197, 2687428: 2.1642446040808894, 2687429: 2.164761949403105, 2687430: 2.1642810366173086, 2687431: 2.1654142681625914, 2687432: 2.1652276434184423, 2687433: 2.1656185975582214, 2687434: 2.1636992086118414, 2687435: 2.1655431510168746, 2687436: 2.1664681005496824, 2687437: 2.1661237136298235, 2687438: 2.166318192942456, 2687439: 2.166867792789403, 2687440: 2.1642935081041355, 
2687441: 2.1662534161899254, 2687442: 2.164215604931173, 2687443: 2.1661882867612774, 2687444: 2.1648378567619373, 2687445: 2.1641179093469374, 2687446: 2.163714544294265, 2687447: 2.165030453232293, 2687448: 2.165303164641005, 2687449: 2.163637554435297, 2687450: 2.163662631646569, 2687451: 2.16495493122646, 2687452: 2.1655045569523703, 2687453: 2.1638761925863275, 2687454: 2.1639502254304595, 2687455: 2.1643708848142955, 2687456: 2.1641364855078304, 2687457: 2.1641491802611053, 2687458: 2.1640291083946472, 2687459: 2.164070550537502, 2687460: 2.1665435107831303, 2687461: 2.166682425007083, 2687462: 2.1664541129023367, 2687463: 2.1669340502512395, 2687464: 2.1638538120976465, 2687465: 2.1669692177670843, 2687466: 2.1664044768788124, 2687467: 2.1641079905788363, 2687468: 2.1635849955542765, 2687469: 2.1635036615691305, 2687470: 2.166710358074336, 2687471: 2.164877510009361, 2687472: 2.164181053060099, 2687473: 2.1637577488216104, 2687474: 2.166885560854316, 2687475: 2.1644608669167185, 2687476: 2.16544162161937, 2687477: 2.1637232627628653, 2687478: 2.166719783654896, 2687479: 2.165941346624584, 2687480: 2.1644227487649648, 2687481: 2.164420786079659, 2687482: 2.163751930705867, 2687483: 2.165378686223096, 2687484: 2.164358119602697, 2687485: 2.164916343237991, 2687486: 2.1643583523337457, 2687487: 2.1639958392041976, 2687488: 2.166872126714503, 2687489: 2.164296960805789, 2687490: 2.166380898665088, 2687491: 2.164474271308847, 2687492: 2.166757650734248, 2687493: 2.164964247316191, 2687494: 2.1641246230905247, 2687495: 2.165152121791671, 2687496: 2.1656813616654937, 2687497: 2.1638056042977727, 2687498: 2.163791894329425, 2687499: 2.1660593410056923, 2687500: 2.164027791066545, 2687501: 2.1666030160139265, 2687502: 2.1658366697012057, 2687503: 2.1638713422089086, 2687504: 2.165744044913534, 2687505: 2.1638917341516, 2687506: 2.165889931463341, 2687507: 2.16385373585268, 2687508: 2.163937631636369, 2687509: 2.164356900787214, 2687510: 2.164214632726351, 2687511: 2.1637780702917246, 2687512: 2.166777530096569, 2687513: 2.166781378833949, 2687514: 2.1637809811166275, 2687515: 2.163556482620449, 2687516: 2.166018475962697, 2687517: 2.166637069306896, 2687518: 2.166544072540592, 2687519: 2.1665151885599965, 2687520: 2.1665759939017475, 2687521: 2.1620764678200546, 2687522: 2.163820809919434, 2687523: 2.166829915167404, 2687524: 2.163907438868404, 2687525: 2.1643956820129606, 2687526: 2.162122615855465, 2687527: 2.1648019090650377, 2687528: 2.1619646474072742, 2687529: 2.1639298434809486, 2687530: 2.163956265763196, 2687531: 2.1636626454841603, 2687532: 2.163968709196234, 2687533: 2.163208704625358, 2687534: 2.162120857635756, 2687535: 2.166216303869489}}

y= {2687426: 14215280005135267334, 2687427: 14215280004681986824, 2687428: 14215279982014841758, 2687429: 14215280004977614736, 2687430: 14215279982216274169, 2687431: 14215280017101476860, 2687432: 14215280005017827787, 2687433: 14215280004315897140, 2687434: 14215279981902924563, 2687435: 14215280005030779204, 2687436: 14215280005593751519, 2687437: 14215280005430167992, 2687438: 14215280005452569596, 2687439: 14215280005830354591, 2687440: 14215279982236608587, 2687441: 14215280005449636502, 2687442: 14215279982217458002, 2687443: 14215280005431922240, 2687444: 14215280004980597014, 2687445: 14215279981960776565, 2687446: 14215279993732558229, 2687447: 14215280005059108555, 2687448: 14215280005016534805, 2687449: 14215279993718879831, 2687450: 14215279982243479269, 2687451: 14215280004970584696, 2687452: 14215280005025342287, 2687453: 14215279993790856001, 2687454: 14215279982265711690, 2687455: 14215280016939085801, 2687456: 14215279982217877330, 2687457: 14215279982280744323, 2687458: 14215279982277169680, 2687459: 14215279982223342251, 2687460: 14215280005823598253, 2687461: 14215280005619421060, 2687462: 14215280005879186037, 2687463: 14215280005835669080, 2687464: 14215279993744387170, 2687465: 14215280005841504034, 2687466: 14215280005546685808, 2687467: 14215279982275301032, 2687468: 14215279982165960380, 2687469: 14215279981077496421, 2687470: 14215280005848827175, 2687471: 14215280004980134601, 2687472: 14215279982012074358, 2687473: 14215279993737211019, 2687474: 14215280005851733438, 2687475: 14215280005141367274, 2687476: 14215280005027850549, 2687477: 14215279982244377109, 2687478: 14215280005843282550, 2687479: 14215280004683877565, 2687480: 14215280004944002243, 2687481: 14215280005141129043, 2687482: 14215279993738234220, 2687483: 14215280005038123832, 2687484: 14215280005140478253, 2687485: 14215280004981401816, 2687486: 14215280004943395013, 2687487: 14215279981953799259, 2687488: 14215280005650039995, 2687489: 14215279982014350565, 2687490: 14215280005881039261, 2687491: 14215280004928876975, 2687492: 14215280005832330887, 2687493: 14215280016995769654, 2687494: 14215279981959811125, 2687495: 14215280005057490296, 2687496: 14215280004673312534, 2687497: 14215279982267390265, 2687498: 14215279993788847180, 2687499: 14215280005425039533, 2687500: 14215279993819972871, 2687501: 14215280005823021169, 2687502: 14215280004672665011, 2687503: 14215279982264896121, 2687504: 14215280004670554080, 2687505: 14215279982203411590, 2687506: 14215280017230983570, 2687507: 14215279982246738021, 2687508: 14215279993796011239, 2687509: 14215280017739183931, 2687510: 14215279982280347257, 2687511: 14215279993737062374, 2687512: 14215280005848370216, 2687513: 14215280005643970356, 2687514: 14215279982241829200, 2687515: 14215279981091034521, 2687516: 14215280005401518460, 2687517: 14215280005871618010, 2687518: 14215280005601309734, 2687519: 14215280005868027302, 2687520: 14215280005872857420, 2687521: 14215279981278336594, 2687522: 14215279993744205454, 2687523: 14215280005832934191, 2687524: 14215279981918383142, 2687525: 14215280004927065192, 2687526: 14215279981279121258, 2687527: 14215280017040872308, 2687528: 14215279981282308749, 2687529: 14215279993791785473, 2687530: 14215279981930449370, 2687531: 14215279994859874981, 2687532: 14215279993815131476, 2687533: 14215279980995520310, 2687534: 14215279980216847709, 2687535: 14215280018586660863}

import pandas as pd
from river import neural_net as nn
from river import optim
from river import preprocessing as pp

# model is the pipeline defined in the first comment; learn_many expects
# the target as a Series, so y is wrapped with pd.Series rather than
# pd.DataFrame (pd.DataFrame raises on a flat dict of scalars).
model.learn_many(pd.DataFrame(x), pd.Series(y))

Please let me know if you require more data. Note that the number of rows varies between batches. Thanks.
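
On the varying row count: assuming the (2, 110) operand really is a transposed batch (features x rows), the 110 there is the batch size, so a changing number of rows between calls should not by itself produce this error; it is the feature dimension (2 vs 4) that disagrees. A sketch of batch-wise training where only the row count changes, under that assumption:

import pandas as pd
from river import neural_net as nn
from river import optim
from river import preprocessing as pp

model = pp.StandardScaler() | nn.MLPRegressor(
    hidden_dims=(110,),
    activations=(nn.activations.ReLU, nn.activations.ReLU, nn.activations.Identity),
    optimizer=optim.SGD(1e-4),
    seed=42,
)

# Hypothetical mini-batches: row counts differ, but both carry the same
# two columns, so the first-layer weight shape stays consistent.
batches = [
    pd.DataFrame({"latitude(m)": [41.390, 41.391], "longitude(m)": [2.164, 2.165]}),
    pd.DataFrame({"latitude(m)": [41.392], "longitude(m)": [2.166]}),
]
for X_batch in batches:
    y_batch = pd.Series([1.0] * len(X_batch), index=X_batch.index)  # placeholder targets
    model.learn_many(X_batch, y_batch)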

@raul-parada (Author)

Did you have the chance to replicate the issue? Thanks.

@MaxHalford (Member)

I haven't had time yet. I'll let you know when I do.
